ColossalAI/examples/tutorial/large_batch_optimizer/config.py

from colossalai.amp import AMP_TYPE

# hyperparameters
# BATCH_SIZE is the batch size per GPU;
# global batch size = BATCH_SIZE x data parallel size (e.g. 512 x 8 GPUs = 4096)
BATCH_SIZE = 512
LEARNING_RATE = 3e-3
WEIGHT_DECAY = 0.3
NUM_EPOCHS = 2
WARMUP_EPOCHS = 1
# model config
NUM_CLASSES = 10

# ColossalAI settings consumed by colossalai.initialize():
# naive AMP runs the model in fp16 while keeping fp32 master weights
fp16 = dict(mode=AMP_TYPE.NAIVE)
# clip gradients to a maximum L2 norm of 1.0 for stability at large batch sizes
clip_grad_norm = 1.0
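
For context, here is a minimal sketch of how a companion training script might consume this config. The tutorial ships its own train.py; everything below is an assumption against the ColossalAI 0.2.x-era API (launch_from_torch, gpc.config, the Lamb/Lars optimizers, LinearWarmupLR), so verify the names against your installed version. The model and criterion are placeholders, not the tutorial's own.

# train.py -- a hedged sketch, not the tutorial's actual script
import colossalai
import torch
from colossalai.core import global_context as gpc
from colossalai.nn.lr_scheduler import LinearWarmupLR
from colossalai.nn.optimizer import Lamb  # Lars is the other large-batch option


def main():
    # parses config.py and exposes its symbols as gpc.config.<NAME>;
    # the fp16 and clip_grad_norm entries are applied by colossalai.initialize()
    colossalai.launch_from_torch(config='./config.py')

    model = torch.nn.Linear(32 * 32 * 3, gpc.config.NUM_CLASSES)  # placeholder
    criterion = torch.nn.CrossEntropyLoss()
    optimizer = Lamb(model.parameters(),
                     lr=gpc.config.LEARNING_RATE,
                     weight_decay=gpc.config.WEIGHT_DECAY)
    # warm up for WARMUP_EPOCHS, then decay; stepped once per epoch in the
    # training loop (omitted here)
    lr_scheduler = LinearWarmupLR(optimizer,
                                  total_steps=gpc.config.NUM_EPOCHS,
                                  warmup_steps=gpc.config.WARMUP_EPOCHS)

    # BATCH_SIZE would size the per-GPU dataloader, also omitted for brevity
    engine, *_ = colossalai.initialize(model, optimizer, criterion)


if __name__ == '__main__':
    main()

Such a script would be started with a distributed launcher (e.g. torchrun --nproc_per_node=8 train.py, or the colossalai run CLI), since launch_from_torch reads the rank and world-size environment variables those launchers set.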