ColossalAI/colossalai/shardformer/layer
flybird11111 aaafb38851
[Device]Support npu (#6159)
* support npu

* support pretrain

* support lora

* support chatglm

* Update train.py

* fixes (typos and minor corrections)

* [pre-commit.ci] auto fixes from pre-commit.com hooks (for more information, see https://pre-commit.ci)

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-12-17 15:42:39 +08:00
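The NPU commit above adds Ascend support to this layer directory (notably normalization.py). As a rough illustration of what device-agnostic accelerator selection typically looks like in PyTorch code, here is a minimal sketch; it assumes Ascend's optional torch_npu plugin (which registers the torch.npu namespace) and the helper name pick_device is hypothetical, not taken from the ColossalAI changes themselves.

```python
import torch


def pick_device() -> torch.device:
    """Return a preferred accelerator device.

    Minimal sketch: prefer an Ascend NPU when the optional torch_npu
    plugin is installed, then CUDA, then CPU. The torch_npu import and
    the torch.npu namespace are assumptions about the Ascend plugin,
    not part of the ColossalAI commit above.
    """
    try:
        import torch_npu  # noqa: F401  # assumption: only present on Ascend setups

        if torch.npu.is_available():
            return torch.device("npu")
    except ImportError:
        pass
    if torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")


if __name__ == "__main__":
    print(pick_device())
```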
__init__.py [Zerobubble] merge main. (#6142) 2024-11-19 19:00:36 +08:00
_operation.py [Zerobubble] merge main. (#6142) 2024-11-19 19:00:36 +08:00
attn.py [hotfix] fix flash attn window_size err (#6132) 2024-11-14 17:11:35 +08:00
dropout.py [misc] update pre-commit and run all files (#4752) 2023-09-19 14:20:26 +08:00
embedding.py [fp8] support hybrid parallel plugin (#5982) 2024-08-12 18:17:05 +08:00
linear.py [Zerobubble] merge main. (#6142) 2024-11-19 19:00:36 +08:00
loss.py [Feature] Split cross-entropy computation in SP (#5959) 2024-09-10 12:06:50 +08:00
normalization.py [Device]Support npu (#6159) 2024-12-17 15:42:39 +08:00
parallel_module.py [shardformer] refactor embedding resize (#5603) 2024-04-18 16:10:18 +08:00
qkv_fused_linear.py [shardformer] optimize seq parallelism (#6086) 2024-10-11 13:44:40 +08:00
utils.py [Ring Attention] Improve comments (#6085) 2024-10-16 11:23:35 +08:00