ColossalAI / tests / test_shardformer

Mirror of https://github.com/hpcaitech/ColossalAI.git, at tree dc583aa576c3e2c98925613ae5cc2f3a9147ce3d.

Latest commit: e28e05345b "[moe] implement submesh initialization" (botbw, 2024-08-01 10:06:59 +08:00)
| Name | Last commit | Date |
| --- | --- | --- |
| test_hybrid_parallel_grad_clip_norm | [MoE/ZeRO] Moe refactor with zero refactor (#5821) | 2024-06-28 14:00:08 +08:00 |
| test_layer | [misc] refactor launch API and tensor constructor (#5666) | 2024-04-29 10:40:11 +08:00 |
| test_model | [moe] implement submesh initialization | 2024-08-01 10:06:59 +08:00 |
| __init__.py | [shardformer] adapted T5 and LLaMa test to use kit (#4049) | 2023-07-04 16:05:01 +08:00 |
| test_flash_attention.py | [coloattention]modify coloattention (#5627) | 2024-04-25 10:47:14 +08:00 |
| test_shard_utils.py | [misc] update pre-commit and run all files (#4752) | 2023-09-19 14:20:26 +08:00 |
| test_with_torch_ddp.py | [misc] refactor launch API and tensor constructor (#5666) | 2024-04-29 10:40:11 +08:00 |
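For reference, a minimal sketch of running one of the listed test modules through pytest's Python API. It assumes pytest and ColossalAI's test dependencies are installed, and the target path is taken from the listing above purely as an example:

```python
# Minimal sketch: run a single shardformer test module via pytest.
# Assumes pytest and ColossalAI's test dependencies are installed; the path
# below is one of the files listed above, chosen only as an example. Some of
# these distributed tests may additionally require one or more GPUs.
import sys

import pytest

if __name__ == "__main__":
    # Equivalent to the shell command:
    #   pytest -v tests/test_shardformer/test_flash_attention.py
    sys.exit(pytest.main(["-v", "tests/test_shardformer/test_flash_attention.py"]))
```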