Mirror of https://github.com/hpcaitech/ColossalAI.git (synced 2026-01-18 16:46:08 +00:00)
ColossalAI / colossalai / nn / parallel

Latest commit: ea13a201bb by HELSON, 2023-01-09 17:41:38 +08:00
[polish] polish code for get_static_torch_model (#2405)
* [gemini] polish code
* [testing] remove code
* [gemini] make more robust
File                 Last commit                                                         Date
layers               [embedding] rename FreqAwareEmbedding -> CachedEmbedding (#1699)    2022-10-13 22:22:27 +08:00
__init__.py          [Gemini] make gemini usage simple (#1821)                           2022-11-08 15:53:13 +08:00
data_parallel.py     [polish] polish code for get_static_torch_model (#2405)             2023-01-09 17:41:38 +08:00
gemini_parallel.py   [Gemini] chunk init using runtime visited param order (#2115)       2022-12-12 18:06:16 +08:00
reducer.py           [ddp] ColoDDP uses bucket all-reduce (#1177)                        2022-06-29 10:34:13 +08:00
utils.py             [polish] polish code for get_static_torch_model (#2405)             2023-01-09 17:41:38 +08:00