Mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2026-01-30 06:00:00 +00:00
ColossalAI / colossalai / booster at commit 9664b1bc190c57518fd76f4a3740feea3dc38ffd
Latest commit: e86127925a by Hongxin Liu, 2024-07-18 15:33:03 +08:00
[plugin] support all-gather overlap for hybrid parallel (#5919)
* [plugin] fixed all-gather overlap support for hybrid parallel
mixed_precision    [npu] change device to accelerator api (#5239)                   2024-01-09 10:20:05 +08:00
plugin             [plugin] support all-gather overlap for hybrid parallel (#5919)  2024-07-18 15:33:03 +08:00
__init__.py        [booster] implemented the torch ddd + resnet example (#3232)     2023-03-27 10:24:14 +08:00
accelerator.py     [misc] update pre-commit and run all files (#4752)               2023-09-19 14:20:26 +08:00
booster.py         [Feature] qlora support (#5586)                                  2024-04-28 10:51:27 +08:00
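The files listed here make up ColossalAI's Booster API: booster.py defines the Booster entry point, plugin holds the parallelism plugins (DDP, Gemini, hybrid parallel, and so on), and mixed_precision provides the precision wrappers a plugin can apply. A minimal sketch of how the pieces are typically wired together is shown below; it assumes a recent ColossalAI release, a torchrun-launched process group, and the TorchDDPPlugin, so exact signatures and options may differ between versions.

```python
# Sketch of the Booster API defined in this package (assumes torchrun env vars are set).
import torch
import torch.nn as nn

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchDDPPlugin

colossalai.launch_from_torch()  # initialize the distributed state from torchrun env vars

model = nn.Linear(32, 8)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
criterion = nn.MSELoss()

plugin = TorchDDPPlugin()       # the plugin decides how the model is parallelized
booster = Booster(plugin=plugin)

# boost() wraps the objects so later calls (backward, checkpoint I/O) go through the plugin
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

x = torch.randn(4, 32, device=next(model.parameters()).device)
loss = criterion(model(x), torch.zeros(4, 8, device=x.device))
booster.backward(loss, optimizer)  # plugin-aware backward instead of loss.backward()
optimizer.step()
```

Swapping TorchDDPPlugin for another plugin from this directory (for example the hybrid parallel plugin touched by the latest commit above) changes the parallelization strategy without altering the training loop itself.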