[Gemini] add GeminiAdamOptimizer (#1960)

Author: Jiarui Fang
Committed: 2022-11-16 14:44:28 +08:00 (via GitHub)
Parent: 7066dfbf82
Commit: f7e276fa71
12 changed files with 66 additions and 44 deletions


@@ -16,8 +16,9 @@ class GeminiDDP(ZeroDDP):
                  force_outputs_fp32: bool = False,
                  search_range_mb: int = 32) -> None:
         """
-        A torch.Module warpper using ZeRODPP and Genimi.
+        A torch.Module warpper using ZeRO-DP and Genimi.
+        ZeRO is for parallel. Gemini is for memory management.
         WARNING: The class will modify the module inline!
         Example:
             model is initialized under the context of ColoInitContext
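The docstring above describes the intended flow: build the model under ColoInitContext, wrap it with GeminiDDP (which modifies the module in place), and drive training with the GeminiAdamOptimizer this commit adds. The sketch below illustrates that flow; the import paths, the placement_policy value, and the optimizer keyword arguments are assumptions based on ColossalAI APIs of this period and may differ in other releases.

```python
# Minimal usage sketch (not from this commit). Import paths and argument
# names are assumptions; verify them against the installed ColossalAI version.
import torch
import torch.nn as nn

from colossalai.utils.model.colo_init_context import ColoInitContext
from colossalai.nn.parallel import GeminiDDP
from colossalai.nn.optimizer import GeminiAdamOptimizer  # assumed export location

# Parameters must be created under ColoInitContext so Gemini can manage them.
with ColoInitContext(device=torch.device('cuda')):
    model = nn.Linear(1024, 1024)

# Wrap the model: ZeRO-DP handles data parallelism, Gemini handles memory
# placement. NOTE: GeminiDDP modifies the wrapped module in place.
model = GeminiDDP(model,
                  device=torch.device('cuda'),
                  placement_policy='cuda')  # accepted values depend on the release

# GeminiAdamOptimizer pairs an Adam-style optimizer with the ZeRO wrapper
# that GeminiDDP expects; the lr keyword is assumed to be forwarded through.
optimizer = GeminiAdamOptimizer(model, lr=1e-3)

# One training step; with Gemini the backward pass goes through the optimizer.
inputs = torch.randn(8, 1024, device='cuda')
loss = model(inputs).sum()
optimizer.backward(loss)
optimizer.step()
optimizer.zero_grad()
```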