Commit Graph

167 Commits

Author          SHA1        Message  Date

Jiarui Fang     d209aff684  Add FreqAwareEmbeddingBag (#1421)  2022-08-09 16:26:12 +08:00
Jiarui Fang     504419d261  [FAW] add cache manager for the cached embedding (#1419)  2022-08-09 15:17:17 +08:00
YuliangLiu0306  7c96055c68  [tensor]build sharding spec to replace distspec in future. (#1405)  2022-08-08 11:15:57 +08:00
HELSON          87775a0682  [colotensor] use cpu memory to store state_dict (#1367)  2022-07-26 14:13:38 +08:00
HELSON          4417804129  [unit test] add megatron init test in zero_optim (#1358)  2022-07-25 11:18:08 +08:00
HELSON          7a065dc9f6  [hotfix] fix megatron_init in test_gpt2.py (#1357)  2022-07-25 10:28:19 +08:00
HELSON          7a8702c06d  [colotensor] add Tensor.view op and its unit test (#1343)  2022-07-21 10:53:15 +08:00
                            [colotensor] add megatron initialization for gpt2
HELSON          bf5066fba7  [refactor] refactor ColoTensor's unit tests (#1340)  2022-07-19 15:46:24 +08:00
ver217          0c51ff2c13  [hotfix] ZeroDDP use new process group (#1333)  2022-07-18 14:14:52 +08:00
                            * process group supports getting ranks in group
                            * chunk mgr receives a process group
                            * update unit test
                            * fix unit tests
HELSON          d49708ae43  [hotfix] fix ddp for unit test test_gpt2 (#1326)  2022-07-15 18:19:52 +08:00
HELSON          1b41686461  [hotfix] fix unit test test_module_spec (#1321)  2022-07-15 14:02:32 +08:00
Jiarui Fang     85f933b58b  [Optimizer] Remove useless ColoOptimizer (#1312)  2022-07-14 16:57:48 +08:00
Jiarui Fang     9f10524313  [Optimizer] polish the init method of ColoOptimizer (#1310)  2022-07-14 16:37:33 +08:00
HELSON          36086927e1  [hotfix] fix ColoTensor GPT2 unitest (#1309)  2022-07-14 16:37:20 +08:00
HELSON          260a55804a  [hotfix] fix shape error in backward when using ColoTensor (#1298)  2022-07-13 23:06:12 +08:00
Jiarui Fang     79fe7b027a  [hotfix] test model unittest hotfix (#1281)  2022-07-12 23:45:29 +08:00
Jiarui Fang     e56731e916  [hotfix] test_gpt.py duplicated (#1279)  2022-07-12 23:29:17 +08:00
                            * make it faster
                            * [hotfix] torchvison fx tests
                            * [hotfix] rename duplicated named test_gpt.py
HELSON          abba4d84e1  [hotfix] fix bert model test in unitests (#1272)  2022-07-12 23:26:45 +08:00
Jiarui Fang     c92f84fcdb  [tensor] distributed checkpointing for parameters (#1240)  2022-07-12 15:51:06 +08:00
Jiarui Fang     1aad903c15  [tensor] redistribute among different process groups (#1247)  2022-07-12 10:24:05 +08:00
                            * make it faster
                            * [tensor] rename convert_to_dist -> redistribute
                            * [tensor] ShardSpec and ReplicaSpec
                            * [tensor] redistribute among diff pgs
                            * polish code
Jiarui Fang     9bcd2fd4af  [tensor] a shorter shard and replicate spec (#1245)  2022-07-11 15:51:48 +08:00
Jiarui Fang     2699dfbbfd  [rename] convert_to_dist -> redistribute (#1243)  2022-07-11 13:05:44 +08:00
HELSON          f6add9b720  [tensor] redirect .data.__get__ to a tensor instance (#1239)  2022-07-11 11:41:29 +08:00
Jiarui Fang     4a76084dc9  [tensor] add zero_like colo op, important for Optimizer (#1236)  2022-07-08 14:55:27 +08:00
Jiarui Fang     3b500984b1  [tensor] fix some unittests (#1234)  2022-07-08 14:18:30 +08:00
HELSON          0453776def  [tensor] fix a assertion in colo_tensor cross_entropy (#1232)  2022-07-08 11:18:00 +08:00
Jiarui Fang     0e199d71e8  [hotfix] fx get comm size bugs (#1233)  2022-07-08 10:54:41 +08:00
                            * init a checkpoint dir
                            * [checkpoint]support resume for cosinewarmuplr
                            * [checkpoint]add unit test
                            * fix some bugs but still not OK
                            * fix bugs
                            * make it faster
                            * [checkpoint]support generalized scheduler
                            * polish
                            * [tensor] torch function return colotensor
                            * polish
                            * fix bugs
                            * remove debug info
                            * polish
                            * polish
                            * [tensor] test_model pass unittests
                            * polish
                            * [hotfix] fx get comm size bug
                            Co-authored-by: ZhaoYi1222 <zhaoyi9499@gmail.com>
HELSON          42ab36b762  [tensor] add unitest for colo_tensor 1DTP cross_entropy (#1230)  2022-07-07 19:17:23 +08:00
Jiarui Fang     a98319f023  [tensor] torch function return colotensor (#1229)  2022-07-07 18:09:18 +08:00
Jiarui Fang     15d988f954  [tensor] sharded global process group (#1219)  2022-07-07 13:38:48 +08:00
Jiarui Fang     ae7d3f4927  [refactor] move process group from _DistSpec to ColoTensor. (#1203)  2022-07-06 16:15:16 +08:00
Jiarui Fang     060b917daf  [refactor] remove gpc dependency in colotensor's _ops (#1189)  2022-07-04 18:54:37 +08:00
Jiarui Fang     c463f8adf9  [tensor] remove gpc in tensor tests (#1186)  2022-06-29 14:08:40 +08:00
Jiarui Fang     372f791444  [refactor] move chunk and chunkmgr to directory gemini (#1182)  2022-06-29 13:31:02 +08:00
Jiarui Fang     7487215b95  [ColoTensor] add independent process group (#1179)  2022-06-29 10:03:09 +08:00
Jiarui Fang     1b657f9ce1  [tensor] revert local view back (#1178)  2022-06-27 18:38:34 +08:00
Jiarui Fang     0dd4e2bbfb  [Tensor] rename some APIs in TensorSpec and Polish view unittest (#1176)  2022-06-27 15:56:11 +08:00
Jiarui Fang     aa7bef73d4  [Tensor] distributed view supports inter-process hybrid parallel (#1169)  2022-06-27 09:45:26 +08:00
Jiarui Fang     4b9bba8116  [ColoTensor] rename APIs and add output_replicate to ComputeSpec (#1168)  2022-06-24 13:08:54 +08:00
Jiarui Fang     f4ef224358  [Tensor] remove ParallelAction, use ComputeSpec instread (#1166)  2022-06-23 17:34:59 +08:00
Jiarui Fang     177c374401  remove gather out in parallel action (#1163)  2022-06-23 16:35:05 +08:00
Jiarui Fang     07f9c781f9  [graph] improve the graph building. (#1157)  2022-06-22 16:47:20 +08:00
ver217          22717a856f  [tensor] add embedding bag op (#1156)  2022-06-22 15:54:03 +08:00
ver217          ae86151968  [tensor] add more element-wise ops (#1155)  2022-06-22 15:16:47 +08:00
                            * add more element-wise ops
                            * update test_op
                            * polish unit test
ver217          ffa025e120  [tensor] dist spec s2s uses all-to-all (#1136)  2022-06-22 11:32:38 +08:00
                            * dist spec s2s uses all-to-all
                            * update unit test
                            * add sanity check
                            * polish unitest test with titans
                            * add sanity check for DistMgr
                            * add sanity check
                            Co-authored-by: jiaruifang <fangjiarui123@gmail.com>
Jiarui Fang     8cdce0399c  [ColoTensor] improves init functions. (#1150)  2022-06-21 18:28:38 +08:00
ver217          8106d7b8c7  [ddp] refactor ColoDDP and ZeroDDP (#1146)  2022-06-21 16:35:23 +08:00
                            * ColoDDP supports overwriting default process group
                            * rename ColoDDPV2 to ZeroDDP
                            * add docstr for ZeroDDP
                            * polish docstr
ver217          789cad301b  [hotfix] fix param op hook (#1131)  2022-06-17 16:12:05 +08:00
                            * fix param op hook
                            * update zero tp test
                            * fix bugs
ver217          7d14b473f0  [gemini] gemini mgr supports "cpu" placement policy (#1118)  2022-06-15 15:05:19 +08:00
                            * update gemini mgr
                            * update chunk
                            * add docstr
                            * polish placement policy
                            * update test chunk
                            * update test zero
                            * polish unit test
                            * remove useless unit test
ver217          1f894e033f  [gemini] zero supports gemini (#1093)  2022-06-10 14:48:28 +08:00
                            * add placement policy
                            * add gemini mgr
                            * update mem stats collector
                            * update zero
                            * update zero optim
                            * fix bugs
                            * zero optim monitor os
                            * polish unit test
                            * polish unit test
                            * add assert