mirror of https://github.com/hpcaitech/ColossalAI.git synced 2025-08-26 19:50:53 +00:00
Commit Graph

2 Commits

Author SHA1 Message Date
Wenhao Chen
724441279b
[moe]: fix ep/tp tests, add hierarchical all2all ()
* fix: add warning for EP different behavior
* fix: use shard_data in ep & tp model
* to: add used_capacity
* fix: fix router test
* feat: add create_ep_node_group
* feat: add create_ep_hierarchical_group fn
* feat: add HierarchicalAllToAll
* test: add hierarchical all2all test
* fix: fix test errors
* fix: simplify create_ep_hierarchical_group
* fix: add hierarchical_alltoall arg
* fix: fix environ typo
* revert: revert process mesh order
* to: add todo mark
* fix: skip hierarchical_comm if torch < 1.13.1
2023-11-09 06:31:00 +00:00
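The HierarchicalAllToAll added in this commit replaces a single flat all-to-all across all ranks with two smaller exchanges: one inside each node, then one between nodes. A single-process Python sketch of that idea is below; the function names, the chunk-matrix representation, and the rank-to-node mapping are illustrative assumptions, not ColossalAI's actual API.

```python
def flat_all2all(chunks):
    # chunks[src][dst] is the chunk rank `src` sends to rank `dst`.
    # A flat all-to-all is just the transpose: out[dst][src] = chunks[src][dst].
    P = len(chunks)
    return [[chunks[src][dst] for src in range(P)] for dst in range(P)]


def hierarchical_all2all(chunks, ranks_per_node):
    # Two-phase sketch (not ColossalAI's implementation):
    #   Phase 1 (intra-node): rank (node, l) collects every chunk that
    #     originates inside its node and targets a rank with local index l.
    #   Phase 2 (inter-node): each held chunk travels to its destination,
    #     which by construction shares the holder's local index, so every
    #     phase-2 message crosses nodes exactly once.
    P = len(chunks)
    G = ranks_per_node
    assert P % G == 0, "world size must be a multiple of ranks_per_node"

    # Phase 1: intra-node all-to-all.
    phase1 = [[] for _ in range(P)]
    for src in range(P):
        node = src // G
        for dst in range(P):
            holder = node * G + dst % G  # same node as src, dst's local index
            phase1[holder].append((src, dst, chunks[src][dst]))

    # Phase 2: inter-node all-to-all among ranks sharing a local index.
    out = [[None] * P for _ in range(P)]
    for holder in range(P):
        for src, dst, payload in phase1[holder]:
            out[dst][src] = payload
    return out
```

Run single-process, the two routines produce identical results; the point of the hierarchy is that on real hardware phase 1 stays on fast intra-node links while phase 2 sends only one message per node pair over the slower inter-node network.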
Xuanlei Zhao
dc003c304c
[moe] merge moe into main ()
* update moe module
* support openmoe
2023-11-02 02:21:24 +00:00