Mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2026-01-14 03:56:03 +00:00
ColossalAI / colossalai / booster / plugin
(at d0807122e2412e6633db7db027ff60827ca8fe9f)

Latest commit: 1a49a5ea00 by LuGY, 2023-07-31 22:13:29 +08:00
[zero] support shard optimizer state dict of zero (#4194)
* support shard optimizer of zero
* polish code
* support sync grad manually
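The feature in this commit is normally reached through the booster's checkpoint I/O. A minimal sketch, assuming the Booster / LowLevelZeroPlugin API of this release; the keyword names `shard` and `size_per_shard` are assumptions, not taken from this listing:

```python
# Minimal sketch of saving a sharded ZeRO optimizer state dict through the
# booster checkpoint API. Assumes a distributed launch (e.g. torchrun) and
# that keyword names such as `shard` / `size_per_shard` match this release.
import torch
import torch.nn as nn

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import LowLevelZeroPlugin

colossalai.launch_from_torch(config={})

model = nn.Linear(1024, 1024).cuda()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

# ZeRO-style sharded optimizer states (stage 1/2).
plugin = LowLevelZeroPlugin(stage=2)
booster = Booster(plugin=plugin)
model, optimizer, *_ = booster.boost(model, optimizer)

# ... training steps ...

# Save the optimizer state as a sharded checkpoint directory rather than a
# single file; each rank writes only the shards it owns.
booster.save_optimizer(optimizer, "ckpt/optimizer", shard=True, size_per_shard=1024)
```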
File                       Last commit                                                     Date
__init__.py                [booster] support torch fsdp plugin in booster (#3697)         2023-05-15 12:14:38 +08:00
dp_plugin_base.py          [booster] update prepare dataloader method for plugin (#3706)  2023-05-08 15:44:03 +08:00
gemini_plugin.py           [zero]support no_sync method for zero1 plugin (#4138)          2023-07-31 22:13:29 +08:00
low_level_zero_plugin.py   [zero] support shard optimizer state dict of zero (#4194)      2023-07-31 22:13:29 +08:00
plugin_base.py             [zero]support no_sync method for zero1 plugin (#4138)          2023-07-31 22:13:29 +08:00
torch_ddp_plugin.py        [zero]support no_sync method for zero1 plugin (#4138)          2023-07-31 22:13:29 +08:00
torch_fsdp_plugin.py       [zero]support no_sync method for zero1 plugin (#4138)          2023-07-31 22:13:29 +08:00
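All of the modules listed above implement the common plugin interface consumed by the Booster, so a training script only swaps the plugin class to change the parallel backend. A minimal sketch of that pattern, also showing the plugin-side dataloader preparation (#3706) and the no_sync gradient-accumulation context (#4138); exact keyword names are assumptions where the listing does not show them:

```python
# Minimal sketch: the plugins in this directory are interchangeable backends
# for the Booster. Shown with TorchDDPPlugin; GeminiPlugin, LowLevelZeroPlugin
# or TorchFSDPPlugin could be substituted. Keyword details are assumptions.
from contextlib import nullcontext

import torch
import torch.nn as nn
from torch.utils.data import TensorDataset

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin

colossalai.launch_from_torch(config={})

model = nn.Linear(128, 128).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.MSELoss()

plugin = TorchDDPPlugin()
# Distributed-aware dataloader created by the plugin (#3706).
dataloader = plugin.prepare_dataloader(
    TensorDataset(torch.randn(64, 128), torch.randn(64, 128)),
    batch_size=8,
    shuffle=True,
)
booster = Booster(plugin=plugin)
model, optimizer, criterion, dataloader, _ = booster.boost(model, optimizer, criterion, dataloader)

accum_steps = 4
model.train()
for step, (x, y) in enumerate(dataloader):
    x, y = x.cuda(), y.cuda()
    # Skip the gradient all-reduce on accumulation micro-steps (#4138) and
    # synchronize on the last micro-step of each accumulation window.
    sync_ctx = nullcontext() if (step + 1) % accum_steps == 0 else booster.no_sync(model)
    with sync_ctx:
        loss = criterion(model(x), y)
        booster.backward(loss, optimizer)
    if (step + 1) % accum_steps == 0:
        optimizer.step()
        optimizer.zero_grad()
```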