Mirror of https://github.com/hpcaitech/ColossalAI.git (synced 2026-01-25 23:04:48 +00:00)
ColossalAI / colossalai / context (at commit 8d3250d74b2e45d9c9be905ed8e0a01472982301)

Latest commit: 7544347145 by HELSON: "[MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)", 2022-03-21 13:35:04 +08:00

Name                          Last commit message                                                              Last commit date
process_group_initializer     [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)    2022-03-21 13:35:04 +08:00
random                        add moe context, moe utilities and refactor gradient handler (#455)             2022-03-18 16:38:32 +08:00
__init__.py                   add moe context, moe utilities and refactor gradient handler (#455)             2022-03-18 16:38:32 +08:00
config.py                     update default logger (#100) (#101)                                             2022-01-04 20:03:26 +08:00
moe_context.py                add moe context, moe utilities and refactor gradient handler (#455)             2022-03-18 16:38:32 +08:00
parallel_context.py           add moe context, moe utilities and refactor gradient handler (#455)             2022-03-18 16:38:32 +08:00
parallel_mode.py              adapted for sequence parallel (#163)                                            2022-01-20 13:44:51 +08:00