Commit Graph

3873 Commits

Author SHA1 Message Date
Tong Li
704866a240 detach 2025-03-11 16:17:02 +08:00
Tong Li
47d6493778 add response length 2025-03-11 13:06:09 +08:00
Tong Li
abca66e69f fix reward score 2025-03-11 10:17:32 +08:00
Tong Li
71a0181fce update reward 2025-03-10 14:19:10 +08:00
Tong Li
754b16dfbf update reward fn 2025-03-10 14:18:22 +08:00
Tong Li
9d9d51614e update grpo 2025-03-10 14:12:04 +08:00
pre-commit-ci[bot]
eb6337f07f [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
2025-03-06 08:29:59 +00:00
Tong Li
22cc1558a8 Merge branch 'grpo-latest' of github.com:hpcaitech/ColossalAI into grpo-latest 2025-03-06 16:28:47 +08:00
Tong Li
0590f10fb7 update select algo 2025-03-06 16:27:13 +08:00
Tong Li
0cc0c843ed add save 2025-03-06 16:26:14 +08:00
pre-commit-ci[bot]
ab5b6d8432 [pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
2025-03-06 06:30:27 +00:00
Tong Li
0f566cc2d4 add algo selection 2025-03-06 14:29:22 +08:00
Tong Li
812f4b7750 update loader 2025-03-06 11:44:42 +08:00
Tong Li
7f2ceac5c3 update example 2025-03-06 10:54:23 +08:00
Tong Li
d03cdea949 update reward fn 2025-03-06 10:53:48 +08:00
Tong Li
678f5a9eca update loss 2025-03-06 10:53:03 +08:00
Tong Li
b96d69055e grpo consumer 2025-03-06 10:51:27 +08:00
Tong Li
c15225bc52 modify data loader 2025-03-06 10:49:44 +08:00
Tong Li
070907dd7f polish 2025-02-28 10:16:42 +08:00
Tong Li
f736d747e3 update grpo 2025-02-25 18:12:04 +08:00
Tong Li
ffd3878a1e add simple grpo 2025-02-23 22:54:26 +08:00
Tong Li
8e6c9a4ab3 add reward related function 2025-02-23 11:02:54 +08:00
Hongxin Liu
de282dd694 [feature] fit RL style generation (#6213)
* [feature] fit rl style generation

* [doc] add docstr

* [doc] add docstr
2025-02-21 17:28:19 +08:00
Hongxin Liu
43c9b5fb44 [chat] add distributed impl (#6210) 2025-02-21 15:24:23 +08:00
Hongxin Liu
9379cbd668 [release] update version (#6195)
* [release] update version

* fix test

* fix test
2025-02-20 11:36:18 +08:00
binmakeswell
24dee8f0b7 [doc] DeepSeek V3/R1 news (#6199)
* [doc] DeepSeek V3/R1 news

* [doc] DeepSeek V3/R1 news

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-02-19 15:07:29 +08:00
Hongxin Liu
f73ae55394 [application] add lora sft example data (#6198) 2025-02-18 20:18:18 +08:00
Tong Li
f8b9e88484 [application] Update README (#6196)
* remove unused ray

* remove unused readme

* update readme

* update readme

* update

* update

* add link

* update readme

* update readme

* fix link

* update code

* update cititaion

* update

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* update readme

* update project

* add images

* update link

* update note

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-02-18 20:17:56 +08:00
Hongxin Liu
d54642a263 [application] add lora sft example (#6192)
* [application] add lora sft example

* update requirements

* update readme

* update comment

* update ci
2025-02-18 13:06:38 +08:00
YeAnbang
d20c8ffd97 Add GRPO and Support RLVR for PPO (#6186)
* add grpo, support rlvr

* add grpo, support rlvr

* tested deepseek r1 pipeline

* add ci

* verify grpo r1

* verify grpo r1

* update readme, remove unused code

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* remove path

* clean code

* fix circular import

* fix ci OOM

* fix ci OOM

* skip kto tp, fix qwen generation

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-02-18 09:43:36 +08:00
flybird11111
ce0ec40811 [checkpointio] fix for async io (#6189) 2025-02-14 17:34:13 +08:00
Hongxin Liu
5ff5323538 [hotfix] fix zero optim save (#6191) 2025-02-14 15:09:50 +08:00
Hongxin Liu
014837e725 [shardformer] support pipeline for deepseek v3 and optimize lora save (#6188)
* [shardformer] support pipeline for deepseek v3

* [checkpointio] fix lora save

* [devops] update ci env

* [booster] optimize lora

* fix test

* fix test
2025-02-14 14:48:54 +08:00
Wenxuan Tan
ec73f1b5e2 [CI] Cleanup Dist Optim tests with shared helper funcs (#6125)
* Refractor and cleanup using common helper funcs. Tests passed

* Update comments

* Fix relative import

* Fix param fetching bug
2025-02-12 13:42:34 +08:00
flybird11111
5c09d726a6 [checkpointio] fix checkpoint for 3d (#6187)
* fix checkpoint io for 3d

* fix

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Update hybrid_parallel_checkpoint_io.py

* fix

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-02-12 11:54:55 +08:00
Hongxin Liu
2b415e5999 [shardformer] support ep for deepseek v3 (#6185)
* [feature] support ep for deepseek v3

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* fix test

* [shardformer] fix deepseek v3 init

* [lazy] fit lora for lazy init

* [example] support npu for deepseek v3

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-02-11 16:10:25 +08:00
flybird11111
17062c83b9 [hotfix] fix hybrid checkpointio for sp+dp (#6184)
* Update hybrid_parallel_plugin.py

* Update hybrid_parallel_plugin.py

* Update hybrid_parallel_plugin.py

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Update build_on_pr.yml

* Update test_zerobubble_pp.py

* fix

* fix

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* fix

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-02-06 17:21:04 +08:00
Wenxuan Tan
ca0aa2365d [Issue template] Add checkbox asking for details to reproduce error (#6104)
* Add checkbox asking about reproducing error

* update

* Update

* Update checkbox
2025-01-24 14:36:25 +08:00
Lemon Qin
97e60cbbcb [checkpointio] gather tensor before unpad it if the tensor is both padded and distributed (#6168) 2025-01-21 10:23:15 +08:00
Guangyao Zhang
5b094a836b [Inference]Fix example in readme (#6178) 2025-01-08 11:51:50 +08:00
Hongxin Liu
ee81366cac [checkpointio] support load-pin overlap (#6177)
* [checkpointio] support load-pin overlap

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* [test] add conftest

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-01-07 16:16:04 +08:00
Hongxin Liu
479067e9bc [release] update version (#6174)
* [release] update version

* [devops] fix test pypi ci

* [devops] fix test pypi ci
2025-01-03 11:52:23 +08:00
pre-commit-ci[bot]
7fdef9fd6b [pre-commit.ci] pre-commit autoupdate (#6113)
updates:
- [github.com/pre-commit/mirrors-clang-format: v19.1.2 → v19.1.5](https://github.com/pre-commit/mirrors-clang-format/compare/v19.1.2...v19.1.5)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-01-02 10:23:20 +08:00
duanjunwen
a9bedc7a43 [Sharderformer] Support zbv in Sharderformer Policy (#6150)
* [feat] Sharderformer support zbv

* [feat] support chatglm2, command, deepseek for zbv

* [feat] support zbv in shardformer policy: falcon,gptj,mistral,opt,qwen2,t5, vit, whisper

* [feat] support GPT2FusedLinearConv1D

* [feat] support GPT2FusedLinear (without tp)

* [fix] debug FusedConvLinear

* [shardfromer] support gpt2 policy for zbv, support GPT2FusedLinearConv Col and Row.

* [Shardformer] support FusedLinear1D base for zbv

* [shardformer] support zbv in FusedLinear1D base, Col, Row

* [shardformer] support zbv in blip2 and sam policy

* [shardformer] fix bug incorrect number of gradients; add fusedLinear base testcase;

* [fix] fix incorrect number of gradients ;

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* [Shardformer] add en doc for zbv;

* [fix] fix typo in Model compatibility table

* [fix] fix API Reference typo

* [Shardformer] add zh-Han doc for zbv

* [fix] fix Linear name; update en & zh doc

* [fix] fix shardformer doc import err

* [fix] fix shardconfig import in doc

* [fix] fix shardformer doc

* [fix] fix shardconfig doc

* [fix] fix config

* [fix] remove shardconfig

* [fix] fix doc

* [feat] add zbv doc string

* [fix] rm doc

* [fix] fix doc

* [fix] empty zbv doc

* [fix] ifx torch version

* [fix] fix torch version

* [fix] fix torch versions

* [fix] fix torch versions

* [fix] fix pyramid versions

* [fix] fix pyramid, zope version

* [fix] try fix workflow

* [fix] try import ShardConfig in yml

* [fix] fix workflow

* [fix] fix workflow

* [fix] fix workflow

* [fix] fix workflow

* [fix] fix ci

* [fix] fix zbv doc

* [fix] fix param for qkv linear, gpt2fused linear; fix requirments;

* [fix] fix policy use fused_linear

* [fix] fix weight grad none, err caused by  weight ptr change

* [fix] fix comm in WeightGradStore

* [fix] fix WeightGradStore pop param

* [fix] remove useless param in doc; fix gpt2 qkv test;

* [shardformer] simplify execute_w_pass_grad_accum;

* [fix] rm useless comments

* [shardformer] simplify execute_w_pass_grad_accum & execute_w_pass

* [shardformer] Run meaningful doc test

* [shadformer] fix doc test cmd;

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2025-01-02 10:22:26 +08:00
Hongxin Liu
af06d162cf [checkpointio] support non blocking pin load (#6172)
* [checkpointio] support non blocking pin load

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-12-25 17:03:25 +08:00
binmakeswell
836992438f [news] release colossalai for sora (#6166)
* [news] release colossalai for sora

* [news] release colossalai for sora

* [news] release colossalai for sora

* [news] release colossalai for sora
2024-12-23 21:59:39 +08:00
Hongxin Liu
8b0ed61490 [hotfix] improve compatibility (#6165) 2024-12-23 18:57:08 +08:00
binmakeswell
5f82bfa636 [doc] add bonus event (#6164)
* [doc] add bonus event

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-12-23 17:41:59 +08:00
duanjunwen
fa9d0318e4 [Hotfix] hotfix normalization (#6163)
* [fix] hotfix normalization

* [hotfix] force doc ci test

* [hotfix] fallback doc
2024-12-23 16:29:48 +08:00
flybird11111
130229fdcb [checkpointio]support asyncio for 3d (#6152)
* fix

* fix

* fix

* fix

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* fix

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Update utils.py

* fix

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
2024-12-23 10:24:22 +08:00