Mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2025-12-24 04:52:45 +00:00
ColossalAI / tests / test_shardformer at commit d0ec221b3853ccefb2f1133b5fae2dc50fed7430
Latest commit e76308c6e6 by duanjunwen: [fix] rm use_zbv flag in Shardconfig; rm debug info; (2024-10-16 03:25:04 +00:00)
test_hybrid_parallel_grad_clip_norm   [MoE/ZeRO] Moe refactor with zero refactor (#5821)           2024-06-28 14:00:08 +08:00
test_layer                            [Feature] Zigzag Ring attention (#5905)                      2024-08-16 13:56:38 +08:00
test_model                            [fix] rm use_zbv flag in Shardconfig; rm debug info;         2024-10-16 03:25:04 +00:00
__init__.py                           [shardformer] adapted T5 and LLaMa test to use kit (#4049)   2023-07-04 16:05:01 +08:00
test_flash_attention.py               [Feature] Zigzag Ring attention (#5905)                      2024-08-16 13:56:38 +08:00
test_shard_utils.py                   [misc] update pre-commit and run all files (#4752)           2023-09-19 14:20:26 +08:00
test_with_torch_ddp.py                [misc] refactor launch API and tensor constructor (#5666)    2024-04-29 10:40:11 +08:00