github / ColossalAI (mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2026-01-24 22:35:06 +00:00)
Path: colossalai/nn/layer/moe @ commit d7ea63992bd2b9dd7de0ba03633d5345b17ac549
Latest commit: HELSON, d7ea63992b: [MOE] add FP32LinearGate for MOE in NaiveAMP context (#480), 2022-03-22 10:50:20 +08:00
File           Last commit                                                   Date
__init__.py    [MOE] changed parallelmode to dist process group (#460)      2022-03-19 13:46:29 +08:00
_operation.py  [MOE] polish moe_env (#467)                                  2022-03-19 15:36:25 +08:00
experts.py     [format] polish name format for MOE (#481)                   2022-03-21 23:19:47 +08:00
layers.py      [MOE] add FP32LinearGate for MOE in NaiveAMP context (#480)  2022-03-22 10:50:20 +08:00
utils.py       [MOE] add FP32LinearGate for MOE in NaiveAMP context (#480)  2022-03-22 10:50:20 +08:00
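
The commit message that tops this directory, "[MOE] add FP32LinearGate for MOE in NaiveAMP context (#480)", refers to keeping the MoE gate (the router that scores tokens against experts) in FP32 even while the surrounding model trains in FP16 under AMP, since a low-precision softmax over routing logits is numerically fragile. The PyTorch sketch below illustrates that idea only; the class name follows the commit message, but the body is an assumption, not the code in layers.py.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FP32LinearGate(nn.Module):
    # Hypothetical sketch: an MoE routing gate whose parameters and
    # softmax stay in FP32 under mixed-precision training. Not the
    # actual ColossalAI implementation.
    def __init__(self, d_model: int, num_experts: int):
        super().__init__()
        # Created in FP32; an AMP wrapper would have to skip casting it.
        self.weight = nn.Parameter(
            torch.empty(num_experts, d_model, dtype=torch.float32))
        nn.init.trunc_normal_(self.weight, std=0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Upcast FP16 activations so the routing logits and the softmax
        # are computed at full precision.
        logits = F.linear(x.float(), self.weight)
        return F.softmax(logits, dim=-1)

Under a NaiveAMP-style wrapper that blanket-casts model parameters to FP16, a gate like this would also need to be exempted from the cast itself; the sketch only covers the forward-pass side of the precision handling.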