github / ColossalAI
Mirror of https://github.com/hpcaitech/ColossalAI.git (synced 2026-01-20 09:34:48 +00:00)
Files at commit d6b01feb662c5482cacd61d48eee626252619b06
Path: ColossalAI / colossalai / zero
History
Latest commit: f7f2248771 by HELSON — [moe] fix MoE bugs (#1628), 2022-09-22 13:56:30 +08:00
  * remove forced FP32 modules
  * correct no_shard-contexts' positions
init_ctx           [moe] fix MoE bugs (#1628)                                                       2022-09-22 13:56:30 +08:00
shard_utils        [gemini] add GeminiMemoryManger (#832)                                           2022-04-24 13:08:48 +08:00
sharded_model      [NFC] polish colossalai/zero/sharded_model/reduce_scatter.py code style (#1554)  2022-09-08 22:11:04 +08:00
sharded_optim      fix move fp32 shards (#1604)                                                     2022-09-16 17:33:16 +08:00
sharded_param      [gemini] add GeminiMemoryManger (#832)                                           2022-04-24 13:08:48 +08:00
utils              [hotfix] remove potiential circle import (#1307)                                 2022-07-14 13:44:26 +08:00
__init__.py        [zero] add zero optimizer for ColoTensor (#1046)                                 2022-06-02 12:13:15 +08:00
zero_optimizer.py  [utils] Impl clip_grad_norm for ColoTensor and ZeroOptimizer (#1442)             2022-08-11 22:58:58 +08:00