Commit Graph

9 Commits

Author SHA1 Message Date
ver217
ce5a7dcab0 [zero] Update sharded model v2 using sharded param v2 (#323) 2022-03-08 18:18:06 +08:00
jiaruifang
4d07bffd77 using pytest parametrize 2022-03-08 15:10:21 +08:00
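The "using pytest parametrize" change presumably replaces copy-pasted test functions with one parametrized test body. A minimal sketch of that pattern (the test name and shapes are illustrative, not from the commit):

```python
import pytest

# pytest.mark.parametrize runs the same test body once per argument set,
# so several shard shapes can share one test function.
@pytest.mark.parametrize("shape", [(2, 3), (4,), (8, 8)])
def test_shard_param_shape(shape):
    numel = 1
    for dim in shape:
        numel *= dim
    # A param built from this shape tuple must cover a positive element count.
    assert numel > 0
```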
Jiarui Fang
cec05b25c9 [zero] update zero context init with the updated test utils (#327) 2022-03-08 14:45:01 +08:00
Jiarui Fang
29521cba0a [zero] yet an improved sharded param (#311) 2022-03-04 15:49:23 +08:00
Jiarui Fang
2f6295bf78 [zero] polish shard strategy (#310)
* init shard param from shape tuple

* add more unit tests for shard param

* add set_payload method for ShardedParam

* [zero] add sharded tensor class

* polish code

* add shard strategy

* move shard and gather logic from the sharded tensor into the shard strategy

* polish code
2022-03-04 15:35:07 +08:00
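The last bullet of #310 — moving shard and gather logic out of the tensor and into a strategy object — can be sketched as follows. All names are hypothetical and plain Python lists stand in for real tensor storage:

```python
class ShardStrategy:
    """Hypothetical shard strategy: owns the shard/gather logic that the
    commit moves out of the sharded tensor class."""

    def __init__(self, world_size):
        self.world_size = world_size

    def shard(self, payload, rank):
        # Split the flat payload into world_size chunks; keep this rank's chunk.
        chunk = (len(payload) + self.world_size - 1) // self.world_size
        return payload[rank * chunk:(rank + 1) * chunk]

    def gather(self, shards):
        # Reassemble the full payload from every rank's shard.
        full = []
        for shard in shards:
            full.extend(shard)
        return full
```

Keeping shard/gather in a strategy lets the tensor class stay a dumb payload holder while different sharding policies are swapped in without touching it.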
Jiarui Fang
88496b5b31 [zero] a shard strategy in granularity of tensor (#307) 2022-03-04 11:59:35 +08:00
Jiarui Fang
408cba655b [zero] sharded tensor (#305)
* init shard param from shape tuple

* add more unit tests for shard param

* add set_payload method for ShardedParam

* [zero] add sharded tensor class

* polish code
2022-03-04 10:46:13 +08:00
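Commit #305 introduces a sharded tensor class built from a shape tuple with a `set_payload` method. A minimal sketch under assumed semantics (lists stand in for tensor storage; names beyond `set_payload` are illustrative):

```python
class ShardedTensor:
    """Hypothetical sketch: constructed from a shape tuple, holding a flat
    payload for the local shard."""

    def __init__(self, shape):
        self.origin_shape = tuple(shape)
        numel = 1
        for dim in shape:
            numel *= dim
        self.payload = [0.0] * numel  # flat storage for the local shard

    def set_payload(self, new_payload):
        # Replace the stored payload in place; the size must stay consistent.
        if len(new_payload) != len(self.payload):
            raise ValueError("payload size mismatch")
        self.payload = list(new_payload)
```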
Jiarui Fang
6c78946fdd Polish sharded parameter (#297)
* init shard param from shape tuple

* add more unit tests for shard param

* add more unit tests for sharded param
2022-03-03 12:42:57 +08:00
Jiarui Fang
3280869358 Feature/zero (#279)
* add zero1 (#209)

* add zero1

* add test zero1

* update zero stage 1 develop (#212)

* Implement naive zero3 (#240)

* naive zero3 works well

* add zero3 param manager

* add TODOs in comments

* add gather full param ctx

* fix sub module streams

* add offload

* fix bugs of hook and add unit tests

* fix bugs of hook and add unit tests (#252)

* add gather full param ctx

* fix sub module streams

* add offload

* fix bugs of hook and add unit tests

* polish code and add state dict hook

* fix bug

* update unit test

* refactor reconstructed zero code

* clip_grad support zero3 and add unit test

* add unit test for Zero3ParameterManager

* [WIP] initialize the shard param class

* [WIP] Yet another sharded model implementation (#274)

* [WIP] initialize the shard param class

* [WIP] Yet another implementation of ShardedModel, using a better hook method

* torch.concat -> torch.cat

* fix test_zero_level_1.py::test_zero_level_1 unit test

* remove deepspeed implementation and refactor for the reconstructed zero module

* polish zero dp unit tests

Co-authored-by: ver217 <lhx0217@gmail.com>
Co-authored-by: Frank Lee <somerlee.9@gmail.com>
2022-03-01 18:17:01 +08:00
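The "add gather full param ctx" step in #279 implies a context manager that temporarily materializes a full parameter from per-rank shards, then drops the full copy on exit so only the shards stay resident. A sketch with hypothetical names:

```python
from contextlib import contextmanager

@contextmanager
def gather_full_param(shards):
    """Hypothetical gather-full-param context: reassemble a full parameter
    from per-rank shards for the duration of the block."""
    full = []
    for shard in shards:
        full.extend(shard)
    try:
        yield full
    finally:
        full.clear()  # free the gathered copy on exit; shards are untouched
```

Usage would look like `with gather_full_param(shards) as p: ...`, where the full payload `p` is only valid inside the block.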