Mirror of https://github.com/hpcaitech/ColossalAI.git (synced 2026-01-30 06:00:00 +00:00)
ColossalAI / colossalai / checkpoint_io (at commit 0b76b57cd64be15f2ae7e5ebd8dd4d327c4685c3)
Latest commit: e86127925a by Hongxin Liu — [plugin] support all-gather overlap for hybrid parallel (#5919) — 2024-07-18 15:33:03 +08:00
File                              Last commit                                                          Date
__init__.py                       [MoE/ZeRO] Moe refactor with zero refactor (#5821)                   2024-06-28 14:00:08 +08:00
checkpoint_io_base.py             [lora] add lora APIs for booster, support lora for TorchDDP (#4981)  2024-04-28 10:51:27 +08:00
general_checkpoint_io.py          [lora] add lora APIs for booster, support lora for TorchDDP (#4981)  2024-04-28 10:51:27 +08:00
hybrid_parallel_checkpoint_io.py  [plugin] support all-gather overlap for hybrid parallel (#5919)      2024-07-18 15:33:03 +08:00
index_file.py                     [misc] update pre-commit and run all files (#4752)                   2023-09-19 14:20:26 +08:00
moe_checkpoint.py                 [MoE/ZeRO] Moe refactor with zero refactor (#5821)                   2024-06-28 14:00:08 +08:00
utils.py                          [MoE/ZeRO] Moe refactor with zero refactor (#5821)                   2024-06-28 14:00:08 +08:00