Mirror of https://github.com/hpcaitech/ColossalAI.git (synced 2026-05-02 16:43:24 +00:00)
ColossalAI / colossalai / context (at commit d7ea63992bd2b9dd7de0ba03633d5345b17ac549)
Latest commit 65c0f380c2 by Jiarui Fang: [format] polish name format for MOE (#481), 2022-03-21 23:19:47 +08:00
process_group_initializer    [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)    2022-03-21 13:35:04 +08:00
random                       add moe context, moe utilities and refactor gradient handler (#455)             2022-03-18 16:38:32 +08:00
__init__.py                  add moe context, moe utilities and refactor gradient handler (#455)             2022-03-18 16:38:32 +08:00
config.py                    update default logger (#100) (#101)                                             2022-01-04 20:03:26 +08:00
moe_context.py               [format] polish name format for MOE (#481)                                      2022-03-21 23:19:47 +08:00
parallel_context.py          add moe context, moe utilities and refactor gradient handler (#455)             2022-03-18 16:38:32 +08:00
parallel_mode.py             adapted for sequence parallel (#163)                                            2022-01-20 13:44:51 +08:00