Add GPT PP Example (#2272)
For simplicity, the input data is randomly generated here.
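As a rough sketch of what randomly generated GPT input could look like (the shapes, vocabulary size, and function name below are illustrative assumptions, not taken from `train_gpt_demo.py`):

```python
import torch

# Illustrative only: random token ids plus an all-ones attention mask,
# sized for a GPT-2-style vocabulary. The real demo may build its batches
# differently.
VOCAB_SIZE = 50257
BATCH_SIZE = 4
SEQ_LEN = 1024

def get_random_batch(device: str = "cpu"):
    input_ids = torch.randint(0, VOCAB_SIZE, (BATCH_SIZE, SEQ_LEN), device=device)
    attention_mask = torch.ones_like(input_ids)
    return input_ids, attention_mask
```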

```bash
bash run.sh
```

### Pipeline Parallel

```bash
bash run_pp.sh
```

### Training config

The `train_gpt_demo.py` script provides three distributed plans; you can choose the one you want in `run.sh`. The Colossal-AI plan leverages Tensor Parallelism and Gemini + ZeRO DDP.
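As a rough sketch of how such a plan switch could be wired up (the `--distplan` flag name and the plan names below are assumptions for illustration, not the demo's actual interface):

```python
import argparse

def parse_args():
    # Hypothetical: expose the distributed plan as a CLI flag that a launcher
    # script such as run.sh could pass through. The flag and plan names here
    # are illustrative, not the demo's real options.
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--distplan",
        type=str,
        default="colossalai",
        choices=["colossalai", "torch_ddp", "torch_zero"],
        help="which distributed training plan to use",
    )
    return parser.parse_args()

if __name__ == "__main__":
    args = parse_args()
    print(f"Selected distributed plan: {args.distplan}")
```

Under that assumption, `run.sh` would pick a plan by forwarding an argument such as `--distplan colossalai` to the training script.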