Mirror of https://github.com/hpcaitech/ColossalAI.git (synced 2026-04-26 17:53:08 +00:00)
ColossalAI / colossalai / zero / sharded_optim (at commit b55deb0662005e5db37075163a38487ff006eb68)
Latest commit b528eea0f0 by HELSON, 2023-01-29 17:52:58 +08:00: [zero] add zero wrappers (#2523)
* [zero] add zero wrappers
* change names
* add wrapper functions to init
bookkeeping            [zero] add unit testings for hybrid parallelism (#2486)     2023-01-18 10:36:10 +08:00
__init__.py            [zero] migrate zero1&2 (#1878)                              2022-11-11 09:26:40 +08:00
_utils.py              [zero] fix gradient clipping in hybrid parallelism (#2521)  2023-01-29 15:09:57 +08:00
low_level_optim.py     [zero] add zero wrappers (#2523)                            2023-01-29 17:52:58 +08:00
sharded_optim_v2.py    fix move fp32 shards (#1604)                                2022-09-16 17:33:16 +08:00