Commit Graph

3859 Commits

Author SHA1 Message Date
duanjunwen
35a7b636b3 [fix] fix mem assertation 2024-09-09 05:41:39 +00:00
duanjunwen
400e5e5b23 [fix] mem assertation' 2024-09-09 02:58:06 +00:00
duanjunwen
4a358348c7 [fix] fix mem check; 2024-09-04 10:57:38 +00:00
duanjunwen
2f09c374f3 [feat] add memory assertation; 2024-09-04 06:34:18 +00:00
duanjunwen
e6e1a97a6d [fix] fix requir grad position and detach position and input&output local buffer append position; 2024-09-04 03:31:08 +00:00
duanjunwen
20503cdfdf [fix] rm requir_grad for output; 2024-09-03 09:24:40 +00:00
duanjunwen
b4103f125c [fix] fix detach output & release output; 2024-09-03 09:09:41 +00:00
duanjunwen
4c1f81c683 [fix] fix bwd step if condition; remove useless comments and format info; 2024-09-03 08:56:08 +00:00
Hongxin Liu
26e553937b [fp8] fix linear hook (#6046) 2024-09-03 16:37:16 +08:00
Hongxin Liu
c3b5caff0e [fp8] optimize all-gather (#6043)
* [fp8] optimize all-gather

* [fp8] fix all gather fp8 ring

* [fp8] enable compile

* [fp8] fix all gather fp8 ring
2024-09-03 15:45:17 +08:00
duanjunwen
ab643c9af7 [fix] rm output.data after send fwd; 2024-09-03 14:12:17 +08:00
duanjunwen
a48afc4a66 [fix] fix optim bwd; 2024-09-03 02:40:26 +00:00
Tong Li
c650a906db [Hotfix] Remove deprecated install (#6042)
* remove deprecated install

* remove unused folder
2024-09-03 10:33:18 +08:00
duanjunwen
591a13bf7e [fix] fix optim bwd; 2024-09-02 11:19:42 +00:00
duanjunwen
77fe44286c [fix] rm zbv in hybridplugin 2024-09-02 10:00:43 +00:00
duanjunwen
6d18d38d5c [feat] update test; rm comments; 2024-09-02 09:50:47 +00:00
Gao, Ruiyuan
e9032fb0b2 [colossalai/checkpoint_io/...] fix bug in load_state_dict_into_model; format error msg (#6020)
* fix bug in load_state_dict_into_model; format error msg

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Update utils.py

to support checking missing_keys

* Update general_checkpoint_io.py

fix bug in missing_keys error message

* retrigger tests

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-09-02 16:56:35 +08:00
duanjunwen
a7b767b071 [fix] fix communication_map; 2024-08-30 05:56:02 +00:00
duanjunwen
8eb6eac225 [fix] fix optim bwd; add license for v_schedule; remove redundant attributes; fix schedule loop "while"--> "for"; add communication dict; 2024-08-30 05:42:43 +00:00
duanjunwen
6af81d8c0d [feat] add fwd_bwd_step, run_fwd_only; 2024-08-30 02:47:52 +00:00
duanjunwen
48ba22dbfd [feat] fix optimizer bwd b & w; support return accum loss & output 2024-08-29 08:54:45 +00:00
Guangyao Zhang
e96a0761ea [FP8] unsqueeze scale to make it compatible with torch.compile (#6040) 2024-08-29 14:49:23 +08:00
duanjunwen
4c4b01b859 [feat] add optim backward_b_by_grad 2024-08-29 03:16:59 +00:00
Tong Li
0d3a85d04f add fused norm (#6038) 2024-08-28 17:12:51 +08:00
Tong Li
4a68efb7da [Colossal-LLaMA] Refactor latest APIs (#6030)
* refactor latest code

* update api

* add dummy dataset

* update Readme

* add setup

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* update files

* add PP support

* update arguments

* update argument

* reorg folder

* update version

* remove IB infor

* update utils

* update readme

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* update save for zero

* update save

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* add apex

* update

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-08-28 17:01:58 +08:00
duanjunwen
b1419ef76a [fix] fix poc test; add comments in poc; 2024-08-28 05:47:53 +00:00
duanjunwen
582ba0d6ff [feat] fix func name & ci; add comments; 2024-08-28 03:40:50 +00:00
duanjunwen
b5f7b4d228 [feat] fix poc format 2024-08-28 03:08:35 +00:00
duanjunwen
d6e3d7d2a3 [feat] fix ci; add assert; 2024-08-28 02:41:05 +00:00
duanjunwen
29383b2de0 [fix] update 2024-08-28 02:33:42 +00:00
Hongxin Liu
cc1b0efc17 [plugin] hotfix zero plugin (#6036)
* [plugin] hotfix zero plugin

* [plugin] hotfix zero plugin
2024-08-28 10:16:48 +08:00
duanjunwen
fe209164f1 [feat] add apply v_schedule graph; p & p.grad assert err exist; 2024-08-27 10:29:39 +00:00
duanjunwen
8b37323f16 [feat] add run_fwd_bwd_with_microbatch (replace input) & test; add p&p.grad assert close test & all pass; 2024-08-27 09:31:38 +00:00
duanjunwen
9e0bd1af00 [fix] fix ci test; add pytest; 2024-08-27 08:00:23 +00:00
duanjunwen
283c9ff5d2 [fix] rm useless assign and comments; 2024-08-27 07:31:58 +00:00
duanjunwen
1b4bb2beeb [feat] add comments for ZBV func; 2024-08-27 07:11:50 +00:00
duanjunwen
f1c1a87246 [feat] add test for p & p grad; 2024-08-27 06:37:26 +00:00
duanjunwen
5e09c8b4e1 [feat] split communication and calculation; fix pop empty send_bwd_buffer error; 2024-08-27 06:29:13 +00:00
Wenxuan Tan
d383449fc4 [CI] Remove triton version for compatibility bug; update req torch >=2.2 (#6018)
* remove triton version

* remove torch 2.2

* remove torch 2.1

* debug

* remove 2.1 build tests

* require torch >=2.2

---------

Co-authored-by: Edenzzzz <wtan45@wisc.edu>
2024-08-27 10:12:21 +08:00
Hongxin Liu
17904cb5bf Merge pull request #6012 from hpcaitech/feature/fp8_comm
[fp8]  support fp8 communication and fp8 training for Colossalai
2024-08-27 10:09:43 +08:00
duanjunwen
1d75045c37 [feat] add test run_fwd_bwd automatic scheduling; 2024-08-26 11:21:56 +00:00
Wang Binluo
4a6f31eb0c Merge pull request #6033 from wangbluo/fix
[fp8] fix the merge
2024-08-26 14:06:06 +08:00
duanjunwen
fd5526b76e Merge branch 'main' into dev/zero_bubble 2024-08-26 04:03:20 +00:00
duanjunwen
107230d27a [update] update text; 2024-08-26 04:00:51 +00:00
pre-commit-ci[bot]
80d24ae519 [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
2024-08-26 03:48:43 +00:00
wangbluo
dae39999d7 fix 2024-08-26 03:45:42 +00:00
duanjunwen
203033ea16 [fix] fix weight not close; 2024-08-23 08:57:27 +00:00
Wenxuan Tan
7cf9df07bc [Hotfix] Fix llama fwd replacement bug (#6031)
Co-authored-by: Edenzzzz <wtan45@wisc.edu>
2024-08-23 15:44:27 +08:00
duanjunwen
c18ef060cf [feat] add dw test; 2024-08-23 06:04:12 +00:00
Wang Binluo
0bf46c54af Merge pull request #6029 from hpcaitech/flybird11111-patch-1
Update train_dpo.py
2024-08-23 13:50:04 +08:00