Commit Graph

3589 Commits

Author SHA1 Message Date
duanjunwen
11ae6848c6
[zerobubble] Support ZeroBubble Pipeline (#6034)
* [feat] add zerobubble pp (just a framework for now); add POC test for dx_dw; add test for zerobubble;

* [feat] add dw test;

* [fix] fix weight not close;

* [update] update test;

* [feat] add test for run_fwd_bwd automatic scheduling;

* [feat] split communication and calculation; fix pop empty send_bwd_buffer error;

* [feat] add test for p & p.grad;

* [feat] add comments for ZBV func;

* [fix] rm useless assign and comments;

* [fix] fix ci test; add pytest;

* [feat] add run_fwd_bwd_with_microbatch (replace input) & test; add p & p.grad assert-close test & all pass;

* [feat] add apply v_schedule graph; p & p.grad assert error still exists;

* [fix] update

* [feat] fix ci; add assert;

* [feat] fix poc format

* [feat] fix func name & ci; add comments;

* [fix] fix poc test; add comments in poc;

* [feat] add optim backward_b_by_grad

* [feat] fix optimizer bwd b & w; support returning accum loss & output

* [feat] add fwd_bwd_step, run_fwd_only;

* [fix] fix optim bwd; add license for v_schedule; remove redundant attributes; fix schedule loop "while" --> "for"; add communication dict;

* [fix] fix communication_map;

* [feat] update test; rm comments;

* [fix] rm zbv in hybridplugin

* [fix] fix optim bwd;

* [fix] fix optim bwd;

* [fix] rm output.data after send fwd;

* [fix] fix bwd step if condition; remove useless comments and format info;

* [fix] fix detach output & release output;

* [fix] rm requires_grad for output;

* [fix] fix requires_grad position, detach position, and input & output local buffer append position;

* [feat] add memory assertion;

* [fix] fix mem check;

* [fix] mem assertion;

* [fix] fix mem assertion

* [fix] fix mem; use a new model shape; only assert mem less than or equal to theoretical;

* [fix] fix model zoo import;

* [fix] fix redundant detach & clone; add buffer assertion at the end;

* [fix] add output_obj_grad assert None at bwd b step; replace input_obj.requires_grad_ with tree_map;

* [fix] update optim state dict assert (include param group & state); fix mem assert after adding optim;

* [fix] add test case with microbatch size 4;
2024-09-10 17:33:09 +08:00
Wenxuan Tan
7cf9df07bc
[Hotfix] Fix llama fwd replacement bug (#6031)
Co-authored-by: Edenzzzz <wtan45@wisc.edu>
2024-08-23 15:44:27 +08:00
Tong Li
39e2597426
[ColossalChat] Add PP support (#6001)
* support pp training

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* update rm

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* refactor

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* update test case

* fix

* change to 4

* fix eval

* test

* add pp

* hotfix

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* update

* skip pp eval

* update all reduce

* update sft

* update ignore

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* update no cache

* add eval

* remove fi

* remove debug

* remove parentheses to avoid warning

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Revert "add eval"

This reverts commit 3ab2f6fa32.

* add all reduce

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-21 10:47:39 +08:00
Hongxin Liu
0d3b0bd864
[plugin] add cast inputs option for zero (#6003) (#6022) 2024-08-21 10:21:26 +08:00
Edenzzzz
dcc44aab8d
[misc] Use dist logger in plugins (#6011)
* use dist logger in plugins

* remove trash

* print on rank 0

---------

Co-authored-by: Edenzzzz <wtan45@wisc.edu>
2024-08-20 10:32:41 +08:00
Edenzzzz
f1c3266a94
overlap kv comm with output rescale (#6017)
Co-authored-by: Edenzzzz <wtan45@wisc.edu>
2024-08-19 14:08:17 +08:00
Hongxin Liu
26493b97d3
[misc] update compatibility (#6008)
* [misc] update compatibility

* [misc] update requirements

* [devops] disable requirements cache

* [test] fix torch ddp test

* [test] fix rerun on address in use

* [test] fix lazy init
2024-08-16 18:49:14 +08:00
Edenzzzz
f5c84af0b0
[Feature] Zigzag Ring attention (#5905)
* halfway

* fix cross-PP-stage position id length diff bug

* fix typo

* fix typo

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* unified cross entropy func for all shardformer models

* remove redundant lines

* add basic ring attn; debug cross entropy

* fwd bwd logic complete

* fwd bwd logic complete; add experimental triton rescale

* precision tests passed

* precision tests passed

* fix typos and remove misc files

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* add sp_mode to benchmark; fix varlen interface

* update softmax_lse shape by new interface

* change tester name

* remove buffer clone; support packed seq layout

* add varlen tests

* fix typo

* all tests passed

* add dkv_group; fix mask

* remove debug statements

---------

Co-authored-by: Edenzzzz <wtan45@wisc.edu>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-16 13:56:38 +08:00
Haze188
887d2d579b
[misc] Bypass the huggingface bug to solve the mask mismatch problem (#5991) 2024-08-15 14:40:26 +08:00
pre-commit-ci[bot]
4dd03999ec
[pre-commit.ci] pre-commit autoupdate (#5995)
updates:
- [github.com/psf/black-pre-commit-mirror: 24.4.2 → 24.8.0](https://github.com/psf/black-pre-commit-mirror/compare/24.4.2...24.8.0)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-15 14:40:03 +08:00
Hongxin Liu
406f984063
[plugin] add cast inputs option for zero (#6003) 2024-08-15 10:41:22 +08:00
Tong Li
ceb1e262e7
fix sync condition (#6000) 2024-08-14 11:22:39 +08:00
YeAnbang
ed97d3a5d3
[Chat] fix readme (#5989)
* fix readme

* fix readme, tokenization fully tested

* fix readme, tokenization fully tested

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: root <root@notebook-8f919155-6035-47b4-9c6f-1be133b9e2c9-0.notebook-8f919155-6035-47b4-9c6f-1be133b9e2c9.colossal-ai.svc.cluster.local>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-12 14:55:17 +08:00
Edenzzzz
b4d2377d4c
[Hotfix] Avoid fused RMSnorm import error without apex (#5985)
Co-authored-by: Edenzzzz <wtan45@wisc.edu>
2024-08-09 18:17:09 +08:00
Tong Li
ad3fa4f49c
[Hotfix] README link (#5966)
* update ignore

* update readme

* run style

* update readme
2024-08-08 18:04:47 +08:00
Edenzzzz
9179d4088e
[Docs] clarify launch port
Co-authored-by: Edenzzzz <wtan45@wisc.edu>
2024-08-07 13:53:48 +08:00
YeAnbang
fe71917851
Merge pull request #5962 from hpcaitech/colossalchat
[Chat] Support overall loss, update KTO logging
2024-08-02 17:32:41 +08:00
YeAnbang
0b2d55c4ab Support overall loss, update KTO logging 2024-08-02 06:51:38 +00:00
Wang Binluo
75c963686f
[lora] lora support hybrid parallel plugin (#5956)
* lora support hybrid plugin

* fix

* fix

* fix

* fix
2024-08-02 10:36:58 +08:00
Tong Li
19d1510ea2
[feat] Dist Loader for Eval (#5950)
* support auto distributed data loader

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* support auto distributed data loader

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* fix tp error

* remove unused parameters

* remove unused

* update inference

* update docs

* update inference

---------

Co-authored-by: Michelle <qianranma8@gmail.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-02 10:06:25 +08:00
botbw
62cdac6b7b [chore] remove redundant test case, print string & reduce test tokens 2024-08-01 10:06:59 +08:00
botbw
d1d1ab871e [moe] solve dp axis issue 2024-08-01 10:06:59 +08:00
botbw
65daa87627 [doc] add MoeHybridParallelPlugin docstring 2024-08-01 10:06:59 +08:00
hxwang
7bedd03739 [moe] remove force_overlap_comm flag and add warning instead 2024-08-01 10:06:59 +08:00
hxwang
f7c5485ed6 [chore] docstring 2024-08-01 10:06:59 +08:00
haze188
7e737df5ad [misc] remove useless condition 2024-08-01 10:06:59 +08:00
haze188
70793ce9ed [misc] fix ci failure: change default value to false in moe plugin 2024-08-01 10:06:59 +08:00
haze188
12d043ca00 [misc] remove incompatible test config 2024-08-01 10:06:59 +08:00
hxwang
606b0891ed [chore] change moe_pg_mesh to private 2024-08-01 10:06:59 +08:00
hxwang
5b4c12381b Revert "[moe] implement submesh initialization"
This reverts commit 2f9bce6686.
2024-08-01 10:06:59 +08:00
hxwang
cb01c0d5ce [moe] refactor mesh assignment 2024-08-01 10:06:59 +08:00
haze188
034020bd04 [misc] remove debug/print code 2024-08-01 10:06:59 +08:00
haze188
59bcf56c60 [misc] skip redundant test 2024-08-01 10:06:59 +08:00
hxwang
c3dc9b4dba [deepseek] replace attn (a workaround for bug in transformers) 2024-08-01 10:06:59 +08:00
hxwang
6c39f0b144 [test] add check 2024-08-01 10:06:59 +08:00
haze188
b2952a5982 [moe] deepseek moe sp support 2024-08-01 10:06:59 +08:00
botbw
96d0fbc531 [bug] fix: somehow logger hangs the program 2024-08-01 10:06:59 +08:00
hxwang
067e18f7e9 [test] fix test: test_zero1_2 2024-08-01 10:06:59 +08:00
hxwang
74b03de3f9 [moe] remove ops 2024-08-01 10:06:59 +08:00
hxwang
70c9924d0d [chore] solve moe ckpt test failure and some other arg pass failure 2024-08-01 10:06:59 +08:00
pre-commit-ci[bot]
52d346f2a5 [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
2024-08-01 10:06:59 +08:00
hxwang
46037c2ccd [chore] minor fix after rebase 2024-08-01 10:06:59 +08:00
hxwang
803878b2fd [moe] full test for deepseek and mixtral (pp + sp to fix) 2024-08-01 10:06:59 +08:00
hxwang
7077d38d5a [moe] finalize test (no pp) 2024-08-01 10:06:59 +08:00
haze188
2cddeac717 moe sp + ep bug fix 2024-08-01 10:06:59 +08:00
hxwang
877d94bb8c [moe] init moe plugin comm setting with sp 2024-08-01 10:06:59 +08:00
hxwang
09d6280d3e [chore] minor fix 2024-08-01 10:06:59 +08:00
Haze188
404b16faf3 [Feature] MoE Ulysses Support (#5918)
* moe sp support

* moe sp bug solve

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-01 10:06:59 +08:00
hxwang
3e2b6132b7 [moe] clean legacy code 2024-08-01 10:06:59 +08:00
hxwang
74eccac0db [moe] test deepseek 2024-08-01 10:06:59 +08:00