# Finetune BERT on GLUE

## 🚀 Quick Start
This example provides a training script, `finetune.py`, that finetunes BERT on the GLUE benchmark.
### Training Arguments

- `-t`, `--task`: GLUE task to run. Defaults to `mrpc`.
- `-p`, `--plugin`: Plugin to use. Choices: `torch_ddp`, `torch_ddp_fp16`, `gemini`, `low_level_zero`. Defaults to `torch_ddp`.
- `--target_f1`: Target F1 score. An exception is raised if it is not reached. Defaults to `None`.
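For reference, here is a minimal sketch of how these flags could be parsed with `argparse`. It mirrors the documented options only; the actual `finetune.py` may define additional arguments or different help text.

```python
# Hypothetical parser mirroring the documented flags; not a verbatim
# excerpt from finetune.py.
import argparse


def parse_args():
    parser = argparse.ArgumentParser(description="Finetune BERT on GLUE")
    parser.add_argument("-t", "--task", default="mrpc",
                        help="GLUE task to run")
    parser.add_argument("-p", "--plugin", default="torch_ddp",
                        choices=["torch_ddp", "torch_ddp_fp16", "gemini", "low_level_zero"],
                        help="Plugin to use")
    parser.add_argument("--target_f1", type=float, default=None,
                        help="Target F1 score; an exception is raised if not reached")
    return parser.parse_args()
```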
 
### Install requirements

```bash
pip install -r requirements.txt
```
### Train

```bash
# train with torch DDP with fp32
colossalai run --nproc_per_node 4 finetune.py

# train with torch DDP with mixed precision training
colossalai run --nproc_per_node 4 finetune.py -p torch_ddp_fp16

# train with gemini
colossalai run --nproc_per_node 4 finetune.py -p gemini

# train with low level zero
colossalai run --nproc_per_node 4 finetune.py -p low_level_zero
```
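To see how the `-p`/`--plugin` flag translates into training behavior, here is a minimal sketch of how the choices map onto ColossalAI's `Booster` API. The plugin constructor arguments and the launch call are assumptions based on the public Booster interface, not a verbatim excerpt from `finetune.py`.

```python
# Illustrative sketch: mapping the plugin flag to a ColossalAI Booster.
# Constructor arguments are assumptions; the actual script may differ.
import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import GeminiPlugin, LowLevelZeroPlugin, TorchDDPPlugin

colossalai.launch_from_torch(config={})  # set up the distributed environment

plugin_name = "torch_ddp_fp16"  # value of the -p/--plugin flag
if plugin_name.startswith("torch_ddp"):
    plugin = TorchDDPPlugin()
elif plugin_name == "gemini":
    plugin = GeminiPlugin()
else:  # "low_level_zero"
    plugin = LowLevelZeroPlugin()

# fp16 for torch DDP is enabled through the booster's mixed-precision option
booster = Booster(
    plugin=plugin,
    mixed_precision="fp16" if plugin_name == "torch_ddp_fp16" else None,
)

# The model, optimizer, criterion, dataloader and lr scheduler are then
# wrapped once before the training loop, e.g.:
# model, optimizer, criterion, dataloader, lr_scheduler = booster.boost(
#     model, optimizer, criterion, dataloader, lr_scheduler
# )
```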
The expected F1 scores are:
| Model | Single-GPU Baseline FP32 | Booster DDP with FP32 | Booster DDP with FP16 | Booster Gemini | Booster Low Level Zero | 
|---|---|---|---|---|---|
| bert-base-uncased | 0.86 | 0.88 | 0.87 | 0.88 | 0.89 |