Commit Graph

2210 Commits

Author SHA1 Message Date
ver217  8d3250d74b  [zero] ZeRO supports pipeline parallel (#477)  2022-03-21 16:55:37 +08:00
Sze-qq  7f5e4592eb  Update Experiment result about Colossal-AI with ZeRO (#479)  2022-03-21 16:34:07 +08:00
    * [readme] add experimental visualisation regarding ColossalAI with ZeRO (#476)
    * Hotfix/readme (#478)
    * add experimental visualisation regarding ColossalAI with ZeRO
    * adjust newly-added figure size
Frank Lee  83a847d058  [test] added rerun on exception for testing (#475)  2022-03-21 15:51:57 +08:00
    * [test] added rerun on exception function
    * polish code
ver217  d70f43dd7a  embedding remove attn mask (#474)  2022-03-21 14:53:23 +08:00
HELSON  7544347145  [MOE] add unitest for MOE experts layout, gradient handler and kernel (#469)  2022-03-21 13:35:04 +08:00
ver217  1559c0df41  fix attn mask shape of gpt (#472)  2022-03-21 12:01:31 +08:00
ver217  3cb3fc275e  zero init ctx receives a dp process group (#471)  2022-03-21 11:18:55 +08:00
ver217  7e30068a22  [doc] update rst (#470)  2022-03-21 10:52:45 +08:00
    * update rst
    * remove empty rst
HELSON  aff9d354f7  [MOE] polish moe_env (#467)  2022-03-19 15:36:25 +08:00
HELSON  bccbc15861  [MOE] changed parallelmode to dist process group (#460)  2022-03-19 13:46:29 +08:00
Frank Lee  8f9617c313  [release] update version (#465)  (tag: v0.1.0)  2022-03-18 19:26:07 +08:00
Frank Lee  2963565ff8  [test] fixed release workflow step (#464)  2022-03-18 19:17:13 +08:00
Frank Lee  292590e0fa  [test] fixed release workflow condition (#463)  2022-03-18 17:42:33 +08:00
Frank Lee  90bd97b9c0  [devops] fixed workflow bug (#462)  2022-03-18 17:26:24 +08:00
ver217  304263c2ce  fix gpt attention mask (#461)  2022-03-18 17:24:19 +08:00
ver217  fc8e6db005  [doc] Update docstring for ZeRO (#459)  2022-03-18 16:48:20 +08:00
    * polish sharded model docstr
    * polish sharded optim docstr
    * polish zero docstr
    * polish shard strategy docstr
HELSON  84fd7c1d4d  add moe context, moe utilities and refactor gradient handler (#455)  2022-03-18 16:38:32 +08:00
Frank Lee  af185b5519  [test] fixed amp convergence comparison test (#454)  2022-03-18 16:28:16 +08:00
ver217  a241f61b34  [zero] Update initialize for ZeRO (#458)  2022-03-18 16:18:31 +08:00
    * polish code
    * shard strategy receive pg in shard() / gather()
    * update zero engine
    * polish code
ver217  642846d6f9  update sharded optim and fix zero init ctx (#457)  2022-03-18 15:44:47 +08:00
Jiarui Fang  e2e9f82588  Revert "[zero] update sharded optim and fix zero init ctx" (#456)  2022-03-18 15:22:43 +08:00
    * Revert "polish code" (reverts commit 8cf7ff08cf)
    * Revert "rename variables" (reverts commit e99af94ab8)
    * Revert "remove surplus imports" (reverts commit 46add4a5c5)
    * Revert "update sharded optim and fix zero init ctx" (reverts commit 57567ee768)
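The #456 entry above is a stack of revert commits; its "reverts commit …" lines are the default messages that `git revert` writes. A minimal sketch in a throwaway repository (file name, identity, and commit messages are illustrative, not from this repo):

```shell
# Demonstrate how a revert commit and its default message are produced.
set -e
repo="$(mktemp -d)"
cd "$repo"
git init -q
git config user.email "demo@example.com"
git config user.name "Demo"

printf 'original\n' > notes.txt
git add notes.txt
git commit -qm "add notes"

printf 'changed\n' > notes.txt
git commit -qam "update notes"

# git revert creates a NEW commit undoing the target; its default message
# is 'Revert "<subject>"' with a 'This reverts commit <sha>.' body line.
git revert --no-edit HEAD
cat notes.txt    # prints "original"
```

Reverting (rather than resetting) keeps history append-only, which is why the reverted commits 8cf7ff08cf through 57567ee768 still appear later in this log.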
ver217  8cf7ff08cf  polish code  2022-03-18 14:25:25 +08:00
ver217  e99af94ab8  rename variables  2022-03-18 14:25:25 +08:00
ver217  46add4a5c5  remove surplus imports  2022-03-18 14:25:25 +08:00
ver217  57567ee768  update sharded optim and fix zero init ctx  2022-03-18 14:25:25 +08:00
Frank Lee  f27d801a13  [test] optimized zero data parallel test (#452)  2022-03-18 11:35:54 +08:00
github-actions[bot]  cfcc8271f3  [Bot] Automated submodule synchronization (#451)  2022-03-18 09:51:43 +08:00
    Co-authored-by: github-actions <github-actions@github.com>
Frank Lee  ac4513c56e  [DevOps] remove unneeded dependency in build workflow (#449)  2022-03-17 17:29:02 +08:00
Jiarui Fang  0fcfb1e00d  [test] make zero engine test really work (#447)  2022-03-17 17:24:25 +08:00
Frank Lee  bb2790cf0b  optimize engine and trainer test (#448)  2022-03-17 15:44:17 +08:00
Jiarui Fang  237d08e7ee  [zero] hybrid cpu adam (#445)  2022-03-17 15:05:41 +08:00
Frank Lee  b72b8445c6  optimized context test time consumption (#446)  2022-03-17 14:40:52 +08:00
Jiarui Fang  496cbb0760  [hotfix] fix initialize bug with zero (#442)  2022-03-17 13:16:22 +08:00
Frank Lee  725a39f4bd  update github CI with the current workflow (#441)  2022-03-17 10:38:04 +08:00
Frank Lee  5a1e33b97f  update contributing.md with the current workflow (#440)  2022-03-17 10:28:04 +08:00
Jiarui Fang  17b8274f8a  [unitest] polish zero config in unittest (#438)  2022-03-17 10:20:53 +08:00
Jiarui Fang  640a6cd304  [refactory] refactory the initialize method for new zero design (#431)  2022-03-16 19:29:37 +08:00
Frank Lee  4f85b687cf  [misc] replace codebeat with codefactor on readme (#436)  2022-03-16 17:43:52 +08:00
Frank Lee  bffd85bf34  added testing module (#435)  2022-03-16 17:20:05 +08:00
HELSON  dbdc9a7783  added Multiply Jitter and capacity factor eval for MOE (#434)  2022-03-16 16:47:44 +08:00
Frank Lee  b03b3ae99c  fixed mem monitor device (#433)  2022-03-16 15:25:02 +08:00
Frank Lee  14a7094243  fixed fp16 optimizer none grad bug (#432)  2022-03-16 14:35:46 +08:00
ver217  fce9432f08  sync before creating empty grad  2022-03-16 14:24:09 +08:00
ver217  ea6905a898  free param.grad  2022-03-16 14:24:09 +08:00
ver217  9506a8beb2  use double buffer to handle grad  2022-03-16 14:24:09 +08:00
Frank Lee  0f5f5dd556  fixed gpt attention mask in pipeline (#430)  2022-03-16 14:23:43 +08:00
Jiarui Fang  f9c762df85  [test] merge zero optim tests (#428)  2022-03-16 12:22:45 +08:00
Frank Lee  f0d6e2208b  [polish] add license meta to setup.py (#427)  2022-03-16 12:05:56 +08:00
Jiarui Fang  5d7dc3525b  [hotfix] run cpu adam unittest in pytest (#424)  2022-03-16 10:39:55 +08:00
Jiarui Fang  54229cd33e  [log] better logging display with rich (#426)  2022-03-16 09:51:15 +08:00
    * better logger using rich
    * remove deepspeed in zero requirements