Mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2026-04-25 01:03:35 +00:00
ColossalAI / tests / test_shardformer (at commit 80a8ca916a740e913cbedf60caeadc0bab5cb4fa)
Latest commit: dcd41d0973 by Wang Binluo, 2024-10-15 15:17:21 +08:00
Merge pull request #6071 from wangbluo/ring_attention: [Ring Attention] fix the 2d ring attn when using multiple machine
test_hybrid_parallel_grad_clip_norm    [MoE/ZeRO] Moe refactor with zero refactor (#5821)                                   2024-06-28 14:00:08 +08:00
test_layer                             Merge pull request #6071 from wangbluo/ring_attention                                2024-10-15 15:17:21 +08:00
test_model                             [moe] add parallel strategy for shared_expert && fix test for deepseek (#6063)       2024-09-18 10:09:01 +08:00
__init__.py                            …
test_flash_attention.py                [Feature] Zigzag Ring attention (#5905)                                              2024-08-16 13:56:38 +08:00
test_shard_utils.py                    …
test_with_torch_ddp.py                 …