Mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2026-01-17 07:58:15 +00:00
ColossalAI / colossalai / booster / plugin (at commit dfca9678fa3fd35ef185370289a3aabffb8dbf85)
Latest commit: 0bb0b481b4 by Baizhou Zhang, "[gemini] fix argument naming during chunk configuration searching", 2023-06-25 13:34:15 +08:00
File                      Last commit                                                          Date
__init__.py               [booster] support torch fsdp plugin in booster (#3697)              2023-05-15 12:14:38 +08:00
dp_plugin_base.py         [booster] update prepare dataloader method for plugin (#3706)       2023-05-08 15:44:03 +08:00
gemini_plugin.py          [gemini] fix argument naming during chunk configuration searching   2023-06-25 13:34:15 +08:00
low_level_zero_plugin.py  [booster] make optimizer argument optional for boost (#3993)        2023-06-15 17:38:42 +08:00
plugin_base.py            [booster] make optimizer argument optional for boost (#3993)        2023-06-15 17:38:42 +08:00
torch_ddp_plugin.py       [checkpointio] sharded optimizer checkpoint for DDP plugin (#4002)  2023-06-16 14:14:05 +08:00
torch_fsdp_plugin.py      [booster] make optimizer argument optional for boost (#3993)        2023-06-15 17:38:42 +08:00
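
The modules in this directory implement the plugins that are handed to ColossalAI's Booster. The following is a minimal sketch of that usage, assuming the class names suggested by the file names (here TorchDDPPlugin; GeminiPlugin, TorchFSDPPlugin, and LowLevelZeroPlugin are the counterparts of the other files) and the boost() behavior described by the commit messages above ("support torch fsdp plugin in booster", "make optimizer argument optional for boost"); consult the repository for the exact API at this commit.

import torch
import torch.nn as nn

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin

# Set up the distributed environment (expects a torchrun-style launch).
colossalai.launch_from_torch(config={})

model = nn.Linear(16, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
criterion = nn.CrossEntropyLoss()

# The chosen plugin decides how the model and optimizer are wrapped for
# distributed training; the other plugins in this directory are swapped
# in the same way.
booster = Booster(plugin=TorchDDPPlugin())

# boost() returns the wrapped objects; per #3993 the optimizer argument
# is optional, though it is passed explicitly here.
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

# In the training loop, gradients then go through the booster instead of
# calling loss.backward() directly:
#   booster.backward(loss, optimizer)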