Commit Graph

117 Commits

Jiarui Fang
2f6295bf78
[zero] polish shard strategy (#310)
* init shard param from shape tuple

* add more unit tests for shard param

* add set_payload method for ShardedParam

* [zero] add sharded tensor class

* polish code

* add shard strategy

* move shard and gather logic from shard tensor to shard strategy

* polish code
2022-03-04 15:35:07 +08:00
ver217
b95f9b4670 polish code 2022-03-04 15:27:39 +08:00
ver217
2aa440358d fix sharded param hook and unit test 2022-03-04 15:27:39 +08:00
ver217
8c2327b93c impl shard optim v2 and add unit test 2022-03-04 15:27:39 +08:00
Jiarui Fang
88496b5b31
[zero] a shard strategy at tensor granularity (#307) 2022-03-04 11:59:35 +08:00
Jiarui Fang
408cba655b
[zero] sharded tensor (#305)
* init shard param from shape tuple

* add more unit tests for shard param

* add set_payload method for ShardedParam

* [zero] add sharded tensor class

* polish code
2022-03-04 10:46:13 +08:00
Jie Zhu
ce5d94a604
[profiler] primary memory tracer 2022-03-04 09:35:23 +08:00
FrankLeeeee
fac5d05a8d update unit testing CI rules 2022-03-03 17:45:09 +08:00
FrankLeeeee
0cd67a8dc0 added compatibility CI and options for release CI 2022-03-03 17:45:09 +08:00
FrankLeeeee
725d81ad21 added PyPI publication CI and removed formatting CI 2022-03-03 17:45:09 +08:00
ver217
5cc84d94dc rename sharded adam to sharded optim v2 2022-03-03 16:20:34 +08:00
ver217
df34bd0c7f fix master params dtype 2022-03-03 16:20:34 +08:00
ver217
6c290dbb08 add fp32 master params in sharded adam 2022-03-03 16:20:34 +08:00
ver217
6185b9772d add sharded adam 2022-03-03 16:20:34 +08:00
Jiarui Fang
de11a91007
polish license (#300)
* init shard param from shape tuple

* add more unit tests for shard param
2022-03-03 14:11:45 +08:00
Jiarui Fang
6c78946fdd
Polish sharded parameter (#297)
* init shard param from shape tuple

* add more unit tests for shard param

* add more unit tests to sharded param
2022-03-03 12:42:57 +08:00
ver217
9b07ac81d4
[zero] add sharded grad and refactor grad hooks for ShardedModel (#287) 2022-03-02 18:28:29 +08:00
Frank Lee
4fbb8db586
fixed typo in ShardParam (#294) 2022-03-02 17:26:23 +08:00
Frank Lee
a463980aab
added unit test for sharded optimizer (#293)
* added unit test for sharded optimizer

* refactor for elegance
2022-03-02 17:15:54 +08:00
Frank Lee
193af3a8b7
added buffer sync to naive amp model wrapper (#291) 2022-03-02 16:47:17 +08:00
Jiarui Fang
6f22fb1906
add a common util for hooks registered on parameters (#292) 2022-03-02 14:38:22 +08:00
Jie Zhu
3b64dcc439
bug fix: pass hook_list to engine (#273)
* bug fix: pass hook_list to engine

* change parameter name
2022-03-02 14:25:52 +08:00
Jiarui Fang
3280869358
Feature/zero (#279)
* add zero1 (#209)

* add zero1

* add test zero1

* update zero stage 1 develop (#212)

* Implement naive zero3 (#240)

* naive zero3 works well

* add zero3 param manager

* add TODOs in comments

* add gather full param ctx

* fix sub module streams

* add offload

* fix bugs of hook and add unit tests

* fix bugs of hook and add unit tests (#252)

* add gather full param ctx

* fix sub module streams

* add offload

* fix bugs of hook and add unit tests

* polish code and add state dict hook

* fix bug

* update unit test

* refactor reconstructed zero code

* clip_grad support zero3 and add unit test

* add unit test for Zero3ParameterManager

* [WIP] initialize the shard param class

* [WIP] Yet another sharded model implementation (#274)

* [WIP] initialize the shard param class

* [WIP] Yet another implementation of ShardedModel, using a better hook method.

* torch.concat -> torch.cat

* fix test_zero_level_1.py::test_zero_level_1 unit test

* remove deepspeed implementation and refactor for the reconstructed zero module

* polish zero dp unit tests

Co-authored-by: ver217 <lhx0217@gmail.com>
Co-authored-by: Frank Lee <somerlee.9@gmail.com>
2022-03-01 18:17:01 +08:00
binmakeswell
f84b15719f add community group and update issue template (#271) 2022-03-01 14:39:41 +08:00
Sze-qq
ec49c6833a update experimental visualization (#253) 2022-03-01 14:39:41 +08:00
binmakeswell
8836140725 add Chinese README 2022-03-01 14:39:41 +08:00
1SAA
adf7e50325 Added TPExpert for special situations 2022-02-28 13:55:35 +08:00
HELSON
b5b612a26c
Fixed parameter initialization in FFNExpert (#251) 2022-02-27 14:01:25 +08:00
アマデウス
726a4abb66
fixed CI dataset directory; fixed import error of 2.5d accuracy (#255) 2022-02-24 14:33:45 +08:00
1SAA
e7a1136436 Optimized MoE layer and fixed some bugs;
Decreased MoE tests;

Added FFNExperts and ViTMoE model
2022-02-22 10:27:12 +08:00
zbian
218c061e2d fixed padding index issue for vocab parallel embedding layers; updated 3D linear to be compatible with examples in the tutorial 2022-02-22 02:05:43 +00:00
ver217
34d7c0401a update setup info (#233) 2022-02-22 02:05:43 +00:00
github-actions
b9f8521f8c Automated submodule synchronization 2022-02-15 11:35:37 +08:00
Frank Lee
f5ca88ec97 fixed apex import (#227) 2022-02-15 11:31:13 +08:00
Frank Lee
eb3fda4c28 updated readme and change log (#224) 2022-02-15 11:31:13 +08:00
ver217
578ea0583b update setup and workflow (#222) 2022-02-15 11:31:13 +08:00
Frank Lee
3a1a9820b0 fixed mkdir conflict and aligned yapf config with flake8 (#220) 2022-02-15 11:31:13 +08:00
Frank Lee
65e72983dc added flake8 config (#219) 2022-02-15 11:31:13 +08:00
アマデウス
9ee197d0e9 moved env variables to global variables; (#215)
added branch context;
added vocab parallel layers;
moved split_batch from load_batch to tensor parallel embedding layers;
updated gpt model;
updated unit test cases;
fixed a few collective communicator bugs
2022-02-15 11:31:13 +08:00
Frank Lee
b82d60be02 updated github action for develop branch (#214) 2022-02-15 11:31:13 +08:00
BoxiangW
7d15ec7fe2
Update github actions (#205) 2022-02-04 15:04:55 +08:00
github-actions[bot]
5420809f43
Automated submodule synchronization (#203)
Co-authored-by: github-actions <github-actions@github.com>
2022-02-04 10:19:38 +08:00
Frank Lee
fd570ab285
add changelog and contributing doc (#202) 2022-02-03 15:38:00 +08:00
Frank Lee
02f13fa9d1
add code quality badge (#201) 2022-02-03 14:01:09 +08:00
Frank Lee
812357d63c
fixed utils docstring and add example to readme (#200) 2022-02-03 11:37:17 +08:00
Frank Lee
b9a761b9b8
added github action to synchronize submodule commits automatically (#193) 2022-02-03 11:23:45 +08:00
BoxiangW
a2f1565672
Update GitHub action and pre-commit settings (#196)
* Update GitHub action and pre-commit settings

* Update GitHub action and pre-commit settings (#198)
2022-01-28 16:59:53 +08:00
Frank Lee
765db512b5
fixed ddp bug on torch 1.8 (#194) 2022-01-28 15:14:04 +08:00
Jiarui Fang
569357fea0
add pytorch hooks (#179)
* add pytorch hooks
fix #175

* remove licenses in src code

* add gpu memory tracer

* replace print with logger in ophooks
2022-01-25 22:20:54 +08:00
ver217
708404d5f8
fix pipeline forward return tensors (#176) 2022-01-21 15:46:02 +08:00