Mirror of https://github.com/hpcaitech/ColossalAI.git
ColossalAI / colossalai / nn / parallel
At commit b528eea0f05162bfedcd06381b953193c2a91b82
Latest commit: b528eea0f0 by HELSON, 2023-01-29 17:52:58 +08:00
[zero] add zero wrappers (#2523)
* [zero] add zero wrappers
* change names
* add wrapper functions to init
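PR #2523 adds zero_wrapper.py and, per the commit notes, re-exports its wrapper functions from this package's __init__.py. A minimal usage sketch, assuming the wrappers are named zero_model_wrapper and zero_optim_wrapper (the names matching zero_wrapper.py below); the exact signatures are an assumption and vary across ColossalAI versions:

import torch
# Assumed import path based on this directory's __init__.py; not verified
# against any specific ColossalAI release.
from colossalai.nn.parallel import zero_model_wrapper, zero_optim_wrapper

model = torch.nn.Linear(1024, 1024).cuda()
# Wrap the model for the chosen ZeRO stage (1, 2, or 3).
model = zero_model_wrapper(model, zero_stage=1)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
# Wrap the optimizer so its states are managed consistently with the
# sharded model.
optimizer = zero_optim_wrapper(model, optimizer)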
Name                Last commit                                                        Date
layers/             [embedding] rename FreqAwareEmbedding -> CachedEmbedding (#1699)   2022-10-13 22:22:27 +08:00
__init__.py         [zero] add zero wrappers (#2523)                                   2023-01-29 17:52:58 +08:00
data_parallel.py    [gemini] update ddp strict mode (#2518)                            2023-01-28 14:35:25 +08:00
gemini_parallel.py  [gemini] update ddp strict mode (#2518)                            2023-01-28 14:35:25 +08:00
reducer.py          [ddp] ColoDDP uses bucket all-reduce (#1177)                       2022-06-29 10:34:13 +08:00
utils.py            [polish] polish code for get_static_torch_model (#2405)            2023-01-09 17:41:38 +08:00
zero_wrapper.py     [zero] add zero wrappers (#2523)                                   2023-01-29 17:52:58 +08:00
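The reducer.py row above refers to #1177, which made ColoDDP use bucketed all-reduce: instead of launching one collective per gradient tensor, gradients are packed into flat buckets so each all-reduce moves one large contiguous buffer. A generic illustration of that technique with plain torch.distributed follows; it is a sketch of the idea, not ColossalAI's actual Reducer API, and bucket_all_reduce with its 25 MB cap is hypothetical:

import torch
import torch.distributed as dist
from torch._utils import _flatten_dense_tensors, _unflatten_dense_tensors

def bucket_all_reduce(grads, bucket_cap_mb=25):
    """All-reduce a list of same-dtype gradient tensors in flat buckets.

    Requires an initialized torch.distributed process group.
    """
    cap = bucket_cap_mb * 1024 * 1024
    bucket, size = [], 0

    def flush():
        if not bucket:
            return
        # Flatten the bucket into one contiguous buffer and reduce it
        # with a single collective call.
        flat = _flatten_dense_tensors(bucket)
        dist.all_reduce(flat)
        # Copy the reduced values back into the original gradient tensors.
        for grad, synced in zip(bucket, _unflatten_dense_tensors(flat, bucket)):
            grad.copy_(synced)
        bucket.clear()

    for grad in grads:
        bucket.append(grad)
        size += grad.numel() * grad.element_size()
        if size >= cap:
            flush()
            size = 0
    flush()  # reduce whatever remains in the last partial bucket

Batching small tensors this way amortizes collective-launch latency, which is the usual motivation for bucketed gradient reduction in DDP-style wrappers.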