Mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2026-02-21 14:32:09 +00:00.
ColossalAI / tests / test_shardformer (at commit 73f4dc578e98c00a618260089cc3eb7f7210edfe)
Latest commit: 5d9a0ae75b by Zhongkai Zhao, "[hotfix] Fix ShardFormer test execution path when using sequence parallelism (#5230)", 2024-01-17 17:42:29 +08:00
Name                                | Last commit                                                                           | Date
test_hybrid_parallel_grad_clip_norm | [gemini] gemini support extra-dp (#5043)                                              | 2023-11-16 21:03:04 +08:00
test_layer                          | [shardformer] llama support DistCrossEntropy (#5176)                                  | 2023-12-13 01:39:14 +08:00
test_model                          | [hotfix] Fix ShardFormer test execution path when using sequence parallelism (#5230)  | 2024-01-17 17:42:29 +08:00
__init__.py                         | [shardformer] adapted T5 and LLaMa test to use kit (#4049)                            | 2023-07-04 16:05:01 +08:00
test_shard_utils.py                 | [misc] update pre-commit and run all files (#4752)                                    | 2023-09-19 14:20:26 +08:00
test_with_torch_ddp.py              | [ci] fixed ddp test (#5254)                                                           | 2024-01-11 17:16:32 +08:00