github / ColossalAI
Mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2026-01-29 21:49:54 +00:00
Commit 7fa6be49d2ac1eae2eda60f150597f0d3998ddf7
ColossalAI / colossalai / auto_parallel / tensor_shard
Latest commit: 0385b26ebf by Boyuan Yao — [autoparallel] Patch meta information of torch.nn.LayerNorm (#2647), 2023-02-10 14:29:24 +08:00
* [autoparallel] layernorm metainfo patch
* [autoparallel] polish test
deprecated            Revert "[NFC] polish code format" (#2372)                             2023-01-06 16:01:09 +08:00
node_handler          [autoparallel] Patch meta information of torch.nn.LayerNorm (#2647)   2023-02-10 14:29:24 +08:00
solver                [autoparallel] adapt autoparallel tests with latest api (#2626)       2023-02-08 15:02:12 +08:00
utils                 Revert "[NFC] polish code format" (#2372)                             2023-01-06 16:01:09 +08:00
__init__.py           [autoparallel] init new folder structure (#1696)                      2022-10-13 14:18:55 +08:00
constants.py          [autoparallel] adapt solver with self attention (#2037)               2022-12-01 17:53:15 +08:00
initialize.py         add overlap option (#2613)                                            2023-02-08 15:02:31 +08:00
sharding_strategy.py  [autoparallel] memory estimation for shape consistency (#2144)        2022-12-21 10:39:37 +08:00