Support TP-compatible Torch AMP and Update trainer API (#27)
* Add gradient accumulation and fix the lr scheduler
* Fix the FP16 optimizer and adapt torch AMP to work with tensor parallelism (#18)
* Fix compatibility bugs between torch AMP and tensor parallelism, along with some minor fixes
* fixed trainer
* Revert "fixed trainer"
This reverts commit 2e0b0b7699.
* Improve consistency between trainer, engine and schedule (#23)
Co-authored-by: 1SAA <c2h214748@gmail.com>
Co-authored-by: ver217 <lhx0217@gmail.com>
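
The commit message mentions gradient accumulation together with torch AMP. As orientation only, below is a minimal generic PyTorch sketch of that pattern, not ColossalAI's actual engine code; the model, data, and accumulation_steps value are placeholders, and a CUDA device is assumed.

    import torch
    from torch import nn
    from torch.cuda.amp import GradScaler, autocast

    # Tiny stand-in model and synthetic data so the sketch runs end to end.
    model = nn.Linear(32, 4).cuda()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    scaler = GradScaler()
    accumulation_steps = 4  # placeholder value, for illustration only

    optimizer.zero_grad()
    for step in range(16):
        inputs = torch.randn(8, 32, device="cuda")
        targets = torch.randint(0, 4, (8,), device="cuda")
        with autocast():  # run the forward pass in mixed precision
            loss = nn.functional.cross_entropy(model(inputs), targets)
        # Divide the loss so the accumulated gradient matches one large batch.
        scaler.scale(loss / accumulation_steps).backward()
        if (step + 1) % accumulation_steps == 0:
            scaler.step(optimizer)  # unscales grads; skips step on inf/NaN
            scaler.update()
            optimizer.zero_grad()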
@@ -1,5 +1,5 @@
 from .cosine import CosineAnnealingLR, CosineAnnealingWarmupLR, FlatAnnealingLR, FlatAnnealingWarmupLR
-from .linear import LinearWarmupLR, LinearWarmupDecay
+from .linear import LinearWarmupLR
 from .multistep import MultiStepLR, MultiStepWarmupLR
 from .onecycle import OneCycleLR
 from .poly import PolynomialLR, PolynomialWarmupLR
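
The hunk above, apparently from the lr_scheduler package's __init__, drops LinearWarmupDecay from the public imports, so downstream code must use LinearWarmupLR alone. A hedged usage sketch follows; the (optimizer, total_steps, warmup_steps) signature is an assumption inferred from the class name, not verified against the source.

    import torch
    from colossalai.nn.lr_scheduler import LinearWarmupLR

    model = torch.nn.Linear(8, 2)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    # Assumed signature: warm up linearly for 10 steps, then anneal over 100.
    scheduler = LinearWarmupLR(optimizer, total_steps=100, warmup_steps=10)

    for _ in range(100):
        optimizer.step()   # training step elided
        scheduler.step()   # advance the schedule once per step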