ColossalAI (mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2026-01-15 14:47:16 +00:00)
Path: ColossalAI / colossalai / context
Commit: 705f56107ced2a97af31fce22284be22269e720d
Latest commit: a445e118cf by Jiarui Fang, "[polish] polish singleton and global context (#500)", 2022-03-23 18:03:39 +08:00
process_group_initializer/  |  [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)  |  2022-03-21 13:35:04 +08:00
random/                     |  add moe context, moe utilities and refactor gradient handler (#455)          |  2022-03-18 16:38:32 +08:00
__init__.py                 |  [polish] polish singleton and global context (#500)                          |  2022-03-23 18:03:39 +08:00
config.py                   |  update default logger (#100) (#101)                                          |  2022-01-04 20:03:26 +08:00
moe_context.py              |  [polish] polish singleton and global context (#500)                          |  2022-03-23 18:03:39 +08:00
parallel_context.py         |  [polish] polish singleton and global context (#500)                          |  2022-03-23 18:03:39 +08:00
parallel_mode.py            |  [MOE] remove old MoE legacy (#493)                                           |  2022-03-22 17:37:16 +08:00
singleton_meta.py           |  [polish] polish singleton and global context (#500)                          |  2022-03-23 18:03:39 +08:00
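The listing pairs `singleton_meta.py` with context modules such as `parallel_context.py`, and the latest commit message mentions polishing the "singleton and global context". A common way to implement such a global context in Python is a singleton metaclass. The sketch below is illustrative only, not the actual ColossalAI code; the class names `SingletonMeta` and `ParallelContext` and the `world_size` attribute are assumptions for the example.

```python
# Hedged sketch of a metaclass-based singleton, as singleton_meta.py's
# name suggests. NOT the actual ColossalAI implementation; names here
# are illustrative assumptions.
import threading


class SingletonMeta(type):
    """Metaclass that returns one shared instance per class."""

    _instances = {}
    _lock = threading.Lock()

    def __call__(cls, *args, **kwargs):
        # Double-checked locking: cheap fast path once the instance
        # exists, lock only around the first construction.
        if cls not in cls._instances:
            with cls._lock:
                if cls not in cls._instances:
                    cls._instances[cls] = super().__call__(*args, **kwargs)
        return cls._instances[cls]


class ParallelContext(metaclass=SingletonMeta):
    """Hypothetical global context holding distributed-run state."""

    def __init__(self):
        self.world_size = 1


# Every construction site receives the same object, so mutations to
# the context are visible process-wide.
ctx_a = ParallelContext()
ctx_b = ParallelContext()
print(ctx_a is ctx_b)  # → True
```

Centralizing shared state behind a metaclass like this lets modules across the package (e.g. the gradient handlers and MoE context listed above) import and construct the context freely while still observing a single source of truth.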