Commit Graph

3591 Commits

Author SHA1 Message Date
Tong Li
74ee10e77d test 2024-08-13 07:32:25 +00:00
Tong Li
0b2b454b97 fix eval 2024-08-13 06:48:54 +00:00
Tong Li
4a5bfc55a6 change to 4 2024-08-13 04:02:21 +00:00
Tong Li
8806efd047 Merge branch 'hpcaitech:main' into coati/support-pp 2024-08-13 11:59:53 +08:00
Tong Li
8ce504d05c fix 2024-08-13 02:47:52 +00:00
Tong Li
a8356da3c7 update test case 2024-08-13 02:45:53 +00:00
pre-commit-ci[bot]
49f7428cbf [pre-commit.ci] auto fixes from pre-commit.com hooks 2024-08-12 11:36:43 +00:00
    for more information, see https://pre-commit.ci
Tong Li
641867018f Merge branch 'coati/support-pp' of github.com:TongLi3701/ColossalAI into coati/support-pp 2024-08-12 11:35:40 +00:00
Tong Li
7d9907f0ae refactor 2024-08-12 11:35:14 +00:00
pre-commit-ci[bot]
2c926141f3 [pre-commit.ci] auto fixes from pre-commit.com hooks 2024-08-12 11:29:22 +00:00
    for more information, see https://pre-commit.ci
Tong Li
56fd2dc5d2 Merge branch 'coati/support-pp' of github.com:TongLi3701/ColossalAI into coati/support-pp 2024-08-12 11:28:16 +00:00
Tong Li
123107ff28 update rm 2024-08-12 11:27:42 +00:00
pre-commit-ci[bot]
515f8e4a43 [pre-commit.ci] auto fixes from pre-commit.com hooks 2024-08-12 10:15:35 +00:00
    for more information, see https://pre-commit.ci
Tong Li
4a541aa27c support pp training 2024-08-12 10:13:03 +00:00
YeAnbang
ed97d3a5d3 [Chat] fix readme (#5989) 2024-08-12 14:55:17 +08:00
    * fix readme
    * fix readme, tokenization fully tested
    * fix readme, tokenization fully tested
    * [pre-commit.ci] auto fixes from pre-commit.com hooks
      for more information, see https://pre-commit.ci
    Co-authored-by: root <root@notebook-8f919155-6035-47b4-9c6f-1be133b9e2c9-0.notebook-8f919155-6035-47b4-9c6f-1be133b9e2c9.colossal-ai.svc.cluster.local>
    Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Edenzzzz
b4d2377d4c [Hotfix] Avoid fused RMSnorm import error without apex (#5985) 2024-08-09 18:17:09 +08:00
    Co-authored-by: Edenzzzz <wtan45@wisc.edu>
Tong Li
ad3fa4f49c [Hotfix] README link (#5966) 2024-08-08 18:04:47 +08:00
    * update ignore
    * update readme
    * run style
    * update readme
Edenzzzz
9179d4088e [Docs] clarify launch port 2024-08-07 13:53:48 +08:00
    Co-authored-by: Edenzzzz <wtan45@wisc.edu>
YeAnbang
fe71917851 Merge pull request #5962 from hpcaitech/colossalchat 2024-08-02 17:32:41 +08:00
    [Chat] Support overall loss, update KTO logging
YeAnbang
0b2d55c4ab Support overall loss, update KTO logging 2024-08-02 06:51:38 +00:00
Wang Binluo
75c963686f [lora] lora support hybrid parallel plugin (#5956) 2024-08-02 10:36:58 +08:00
    * lora support hybrid plugin
    * fix
    * fix
    * fix
    * fix
Tong Li
19d1510ea2 [feat] Dist Loader for Eval (#5950) 2024-08-02 10:06:25 +08:00
    * support auto distributed data loader
    * [pre-commit.ci] auto fixes from pre-commit.com hooks
      for more information, see https://pre-commit.ci
    * support auto distributed data loader
    * [pre-commit.ci] auto fixes from pre-commit.com hooks
      for more information, see https://pre-commit.ci
    * fix tp error
    * remove unused parameters
    * remove unused
    * update inference
    * update docs
    * update inference
    Co-authored-by: Michelle <qianranma8@gmail.com>
    Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
botbw
62cdac6b7b [chore] remove redundant test case, print string & reduce test tokens 2024-08-01 10:06:59 +08:00
botbw
d1d1ab871e [moe] solve dp axis issue 2024-08-01 10:06:59 +08:00
botbw
65daa87627 [doc] add MoeHybridParallelPlugin docstring 2024-08-01 10:06:59 +08:00
hxwang
7bedd03739 [moe] remove force_overlap_comm flag and add warning instead 2024-08-01 10:06:59 +08:00
hxwang
f7c5485ed6 [chore] docstring 2024-08-01 10:06:59 +08:00
haze188
7e737df5ad [misc] remove useless condition 2024-08-01 10:06:59 +08:00
haze188
70793ce9ed [misc] fix ci failure: change default value to false in moe plugin 2024-08-01 10:06:59 +08:00
haze188
12d043ca00 [misc] remove incompatible test config 2024-08-01 10:06:59 +08:00
hxwang
606b0891ed [chore] change moe_pg_mesh to private 2024-08-01 10:06:59 +08:00
hxwang
5b4c12381b Revert "[moe] implement submesh initialization" 2024-08-01 10:06:59 +08:00
    This reverts commit 2f9bce6686.
hxwang
cb01c0d5ce [moe] refactor mesh assignment 2024-08-01 10:06:59 +08:00
haze188
034020bd04 [misc] remove debug/print code 2024-08-01 10:06:59 +08:00
haze188
59bcf56c60 [misc] skip redundant test 2024-08-01 10:06:59 +08:00
hxwang
c3dc9b4dba [deepseek] replace attn (a workaround for bug in transformers) 2024-08-01 10:06:59 +08:00
hxwang
6c39f0b144 [test] add check 2024-08-01 10:06:59 +08:00
haze188
b2952a5982 [moe] deepseek moe sp support 2024-08-01 10:06:59 +08:00
botbw
96d0fbc531 [bug] fix: somehow logger hangs the program 2024-08-01 10:06:59 +08:00
hxwang
067e18f7e9 [test] fix test: test_zero1_2 2024-08-01 10:06:59 +08:00
hxwang
74b03de3f9 [moe] remove ops 2024-08-01 10:06:59 +08:00
hxwang
70c9924d0d [chore] solve moe ckpt test failure and some other arg pass failure 2024-08-01 10:06:59 +08:00
pre-commit-ci[bot]
52d346f2a5 [pre-commit.ci] auto fixes from pre-commit.com hooks 2024-08-01 10:06:59 +08:00
    for more information, see https://pre-commit.ci
hxwang
46037c2ccd [chore] minor fix after rebase 2024-08-01 10:06:59 +08:00
hxwang
803878b2fd [moe] full test for deepseek and mixtral (pp + sp to fix) 2024-08-01 10:06:59 +08:00
hxwang
7077d38d5a [moe] finalize test (no pp) 2024-08-01 10:06:59 +08:00
haze188
2cddeac717 moe sp + ep bug fix 2024-08-01 10:06:59 +08:00
hxwang
877d94bb8c [moe] init moe plugin comm setting with sp 2024-08-01 10:06:59 +08:00
hxwang
09d6280d3e [chore] minor fix 2024-08-01 10:06:59 +08:00
Haze188
404b16faf3 [Feature] MoE Ulysses Support (#5918) 2024-08-01 10:06:59 +08:00
    * moe sp support
    * moe sp bug solve
    * [pre-commit.ci] auto fixes from pre-commit.com hooks
      for more information, see https://pre-commit.ci
    Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>