# Finetune BERT on GLUE
## 🚀 Quick Start
This example provides a training script for finetuning BERT on the GLUE dataset. It accepts the arguments below; a sketch of a matching argument declaration follows the list.

### Training Arguments

- `-t`, `--task`: GLUE task to run. Defaults to `mrpc`.
- `-p`, `--plugin`: Plugin to use. Choices: `torch_ddp`, `torch_ddp_fp16`, `gemini`, `low_level_zero`. Defaults to `torch_ddp`.
- `--target_f1`: Target F1 score. An exception is raised if the score is not reached. Defaults to `None`.
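For reference, a minimal sketch of how these flags could be declared with `argparse`; the actual `finetune.py` may define them differently:

```python
import argparse

def parse_args() -> argparse.Namespace:
    # Hypothetical re-creation of the documented CLI flags; the real
    # finetune.py may differ in defaults, types, or help text.
    parser = argparse.ArgumentParser(description="Finetune BERT on GLUE")
    parser.add_argument("-t", "--task", default="mrpc",
                        help="GLUE task to run")
    parser.add_argument("-p", "--plugin", default="torch_ddp",
                        choices=["torch_ddp", "torch_ddp_fp16", "gemini", "low_level_zero"],
                        help="Booster plugin to use")
    parser.add_argument("--target_f1", type=float, default=None,
                        help="Target F1 score; an exception is raised if not reached")
    return parser.parse_args()
```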
### Install requirements

```bash
pip install -r requirements.txt
```
### Train

```bash
# train with torch DDP with fp32
colossalai run --nproc_per_node 4 finetune.py

# train with torch DDP with mixed precision training
colossalai run --nproc_per_node 4 finetune.py -p torch_ddp_fp16

# train with gemini
colossalai run --nproc_per_node 4 finetune.py -p gemini

# train with low level zero
colossalai run --nproc_per_node 4 finetune.py -p low_level_zero
```
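Under the hood, the `-p` flag selects a ColossalAI Booster plugin. The following is a minimal sketch of how such a mapping could look; the exact plugin options used by `finetune.py` are an assumption here:

```python
from colossalai.booster import Booster
from colossalai.booster.plugin import GeminiPlugin, LowLevelZeroPlugin, TorchDDPPlugin

def build_booster(plugin_name: str) -> Booster:
    # Hypothetical mapping from the -p/--plugin flag to a Booster;
    # the real script may pass extra options to each plugin.
    if plugin_name == "torch_ddp":
        return Booster(plugin=TorchDDPPlugin())
    if plugin_name == "torch_ddp_fp16":
        # DDP combined with fp16 mixed-precision training
        return Booster(plugin=TorchDDPPlugin(), mixed_precision="fp16")
    if plugin_name == "gemini":
        return Booster(plugin=GeminiPlugin())
    if plugin_name == "low_level_zero":
        return Booster(plugin=LowLevelZeroPlugin())
    raise ValueError(f"Unknown plugin: {plugin_name}")
```

In the usual Booster workflow, `colossalai.launch_from_torch()` initializes the distributed environment, and the model, optimizer, and dataloader are then wrapped with `booster.boost(...)` before the training loop.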
The expected F1 scores are:
| Model | Single-GPU Baseline FP32 | Booster DDP with FP32 | Booster DDP with FP16 | Booster Gemini | Booster Low Level Zero |
|---|---|---|---|---|---|
| bert-base-uncased | 0.86 | 0.88 | 0.87 | 0.88 | 0.89 |
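To check a run against these numbers, pass the documented `--target_f1` flag, which raises an exception if the score is not reached. For example:

```bash
# fail the run if finetuning does not reach an F1 score of 0.86
colossalai run --nproc_per_node 4 finetune.py -p gemini --target_f1 0.86
```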