github/ColossalAI
Mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2026-04-26 01:35:21 +00:00
Tree b55deb0662005e5db37075163a38487ff006eb68: ColossalAI/colossalai/nn/parallel
Latest commit: 66dfcf5281 by HELSON, "[gemini] update the gpt example (#2527)", 2023-01-30 17:58:05 +08:00
layers              [embedding] rename FreqAwareEmbedding -> CachedEmbedding (#1699)   2022-10-13 22:22:27 +08:00
__init__.py         [zero] add zero wrappers (#2523)                                   2023-01-29 17:52:58 +08:00
data_parallel.py    [gemini] update ddp strict mode (#2518)                            2023-01-28 14:35:25 +08:00
gemini_parallel.py  [gemini] update ddp strict mode (#2518)                            2023-01-28 14:35:25 +08:00
reducer.py          [ddp] ColoDDP uses bucket all-reduce (#1177)                       2022-06-29 10:34:13 +08:00
utils.py            [polish] polish code for get_static_torch_model (#2405)            2023-01-09 17:41:38 +08:00
zero_wrapper.py     [gemini] update the gpt example (#2527)                            2023-01-30 17:58:05 +08:00