github/ColossalAI
Mirror of https://github.com/hpcaitech/ColossalAI.git (synced 2026-04-26 01:35:21 +00:00)
ColossalAI/tests/test_shardformer at commit c2e8f61592011732eab54e2ffacd2de44fdd8096
Latest commit: dcd41d0973 by Wang Binluo (2024-10-15 15:17:21 +08:00): Merge pull request #6071 from wangbluo/ring_attention ([Ring Attention] fix the 2d ring attn when using multiple machine)
Name                                | Last commit                                                                    | Last updated
test_hybrid_parallel_grad_clip_norm | [MoE/ZeRO] Moe refactor with zero refactor (#5821)                             | 2024-06-28 14:00:08 +08:00
test_layer                          | Merge pull request #6071 from wangbluo/ring_attention                          | 2024-10-15 15:17:21 +08:00
test_model                          | [moe] add parallel strategy for shared_expert && fix test for deepseek (#6063) | 2024-09-18 10:09:01 +08:00
__init__.py                         | [shardformer] adapted T5 and LLaMa test to use kit (#4049)                     | 2023-07-04 16:05:01 +08:00
test_flash_attention.py             | [Feature] Zigzag Ring attention (#5905)                                        | 2024-08-16 13:56:38 +08:00
test_shard_utils.py                 | [misc] update pre-commit and run all files (#4752)                             | 2023-09-19 14:20:26 +08:00
test_with_torch_ddp.py              | [misc] refactor launch API and tensor constructor (#5666)                      | 2024-04-29 10:40:11 +08:00
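The files above are pytest-based unit tests for Colossal-AI's Shardformer. As a minimal sketch (not taken from this listing), the snippet below shows one way such a test file could be invoked locally; the file path comes from the listing, while the pytest invocation, an installed checkout of the repository with its dependencies, and CUDA GPU availability are all assumptions.

```python
# Minimal sketch: running one of the listed test files via pytest's Python API.
# Assumptions (not stated in the listing): a local checkout of
# https://github.com/hpcaitech/ColossalAI.git with colossalai and pytest
# installed; many shardformer tests additionally require CUDA-capable GPUs.
import pytest

if __name__ == "__main__":
    # Path is relative to the repository root shown in the listing above.
    raise SystemExit(pytest.main(["tests/test_shardformer/test_flash_attention.py", "-v"]))
```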