Mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2025-05-09 17:08:23 +00:00
ColossalAI/colossalai
duanjunwen c2fe3137e2
[hotfix] fix flash attn window_size err ()
* [fix] fix flash attn
* [hotfix] fix flash-attn version
* [fix] fix flash_attn version
* [fix] fix flash-attn versions
* [fix] fix flash-attn "not enough values to unpack" error
* [fix] fix test_ring_attn
* [fix] fix test ring attn
2024-11-14 17:11:35 +08:00
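The fixes above all stem from flash-attn API drift across releases: the `window_size` keyword only exists in newer versions, and the tuple returned by the lower-level forward functions changed length, which is what produces the "not enough values to unpack" error. Below is a minimal sketch of the version-guard pattern (not ColossalAI's actual code; `attention` and `sliding_window` are illustrative names):

```python
# Minimal sketch: guard flash-attn calls behind a version check, since
# window_size=(left, right) was only added in flash-attn 2.3.0.
from importlib.metadata import version
from packaging.version import Version

from flash_attn import flash_attn_func

_FA_VERSION = Version(version("flash-attn"))

def attention(q, k, v, sliding_window=None):
    # q, k, v: (batch, seqlen, nheads, headdim) fp16/bf16 tensors on CUDA.
    kwargs = {}
    if sliding_window is not None:
        if _FA_VERSION < Version("2.3.0"):
            raise RuntimeError("sliding-window attention requires flash-attn >= 2.3.0")
        # (left, right) lookback/lookahead window; (-1, -1) means full attention.
        kwargs["window_size"] = (sliding_window, sliding_window)
    return flash_attn_func(q, k, v, causal=True, **kwargs)
```

Code that unpacks the tuple returned by the lower-level forward functions needs the same version branching, since the tuple length also changed between flash-attn releases.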
_analyzer [test] fix testcase () 2024-06-03 15:26:01 +08:00
_C Clean up 2024-06-07 09:09:29 +00:00
accelerator [misc] fit torch API upgrade and remove legacy import () 2024-10-18 16:48:52 +08:00
amp [plugin] support get_grad_norm () 2024-11-05 18:12:47 +08:00
auto_parallel [pre-commit.ci] pre-commit autoupdate () 2024-07-01 17:16:41 +08:00
autochunk [hotfix] Fix missing pad token in examples & auto parallel codegen bug () 2024-04-18 18:15:50 +08:00
booster [zero] support extra dp () 2024-11-12 11:20:46 +08:00
checkpoint_io [checkpointio] fix hybrid plugin model save () 2024-10-31 17:04:53 +08:00
cli [devops] fix extension building () 2024-03-05 15:35:54 +08:00
cluster [FP8] rebase main () 2024-08-06 16:29:37 +08:00
context [Fix] implement thread-safe singleton to avoid deadlock in very large-scale training scenarios () 2024-04-25 14:45:52 +08:00
device [Feature] Distributed optimizers: Lamb, Galore, CAME and Adafactor () 2024-05-14 13:52:45 +08:00
fx [test] fix testcase () 2024-06-03 15:26:01 +08:00
inference [shardformer] fix linear 1d row and support uneven splits for fused qkv linear () 2024-10-10 14:34:45 +08:00
interface [plugin] support get_grad_norm () 2024-11-05 18:12:47 +08:00
kernel [misc] fit torch API upgrade and remove legacy import () 2024-10-18 16:48:52 +08:00
lazy [fp8] Merge feature/fp8_comm to main branch of Colossalai () 2024-08-22 09:21:34 +08:00
legacy [fp8] Merge feature/fp8_comm to main branch of Colossalai () 2024-08-22 09:21:34 +08:00
logging [fp8] Merge feature/fp8_comm to main branch of Colossalai () 2024-08-22 09:21:34 +08:00
moe [hotfix] moe hybrid parallelism benchmark & follow-up fix () 2024-09-10 17:30:53 +08:00
nn [misc] fix dist logger () 2024-06-05 15:04:22 +08:00
pipeline [misc] fit torch API upgrade and remove legacy import () 2024-10-18 16:48:52 +08:00
quantization [fp8] add fallback and make compile option configurable () 2024-10-18 13:55:31 +08:00
shardformer [hotfix] fix flash attn window_size err () 2024-11-14 17:11:35 +08:00
tensor [fp8] support fp8 amp for hybrid parallel plugin () 2024-08-07 18:21:08 +08:00
testing [fp8] Merge feature/fp8_comm to main branch of Colossalai () 2024-08-22 09:21:34 +08:00
utils [checkpointio] fix hybrid plugin model save () 2024-10-31 17:04:53 +08:00
zero [zero] support extra dp () 2024-11-12 11:20:46 +08:00
__init__.py [devops] remove post commit ci () 2024-04-08 15:09:40 +08:00
initialize.py [fp8] hotfix backward hook () 2024-09-11 16:11:25 +08:00