Mirror of https://github.com/hpcaitech/ColossalAI.git (synced 2026-04-26 09:42:27 +00:00)
ColossalAI / colossalai / booster / plugin
Tree: c972d653111dcfbd63cd22b26ddb7d3ee83a69ed
History
Latest commit: abe4f971e0 by 梁爽: [NFC] polish colossalai/booster/plugin/low_level_zero_plugin.py code style (#4256)
Co-authored-by: supercooledith <893754954@qq.com>
2023-07-26 14:12:57 +08:00
__init__.py | [booster] support torch fsdp plugin in booster (#3697) | 2023-05-15 12:14:38 +08:00
dp_plugin_base.py | [booster] update prepare dataloader method for plugin (#3706) | 2023-05-08 15:44:03 +08:00
gemini_plugin.py | [checkpointio] Sharded Optimizer Checkpoint for Gemini Plugin (#4302) | 2023-07-21 14:39:01 +08:00
low_level_zero_plugin.py | [NFC] polish colossalai/booster/plugin/low_level_zero_plugin.py code style (#4256) | 2023-07-26 14:12:57 +08:00
plugin_base.py | [booster] make optimizer argument optional for boost (#3993) | 2023-06-15 17:38:42 +08:00
torch_ddp_plugin.py | [checkpointio] sharded optimizer checkpoint for DDP plugin (#4002) | 2023-06-16 14:14:05 +08:00
torch_fsdp_plugin.py | [booster] make optimizer argument optional for boost (#3993) | 2023-06-15 17:38:42 +08:00