Mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2026-01-18 16:46:08 +00:00
ColossalAI / colossalai / zero (at commit 30b4fc0eb03f7ab60c6b14d6dca1ba78363bbf3e)
Latest commit: ver217, a45ddf2d5f, [hotfix] fix sharded optim step and clip_grad_norm (#1226), 2022-07-08 13:34:48 +08:00
init_ctx            [hotfix] fix zero init ctx numel (#1128)                        2022-06-16 17:17:27 +08:00
shard_utils         [gemini] add GeminiMemoryManger (#832)                          2022-04-24 13:08:48 +08:00
sharded_model       warmup ratio configration (#1192)                               2022-06-30 15:23:50 +08:00
sharded_optim       [hotfix] fix sharded optim step and clip_grad_norm (#1226)      2022-07-08 13:34:48 +08:00
sharded_param       [gemini] add GeminiMemoryManger (#832)                          2022-04-24 13:08:48 +08:00
utils               [refactor] move chunk and chunkmgr to directory gemini (#1182)  2022-06-29 13:31:02 +08:00
__init__.py         [zero] add zero optimizer for ColoTensor (#1046)                2022-06-02 12:13:15 +08:00
zero_optimizer.py   [zero] zero optim supports loading local state dict (#1171)     2022-06-24 17:25:57 +08:00