Mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2026-02-21 14:32:09 +00:00
ColossalAI / tests / test_shardformer (at commit 459a88c8063d8ef7c4cd720a4e9524adf5b5c367)
Latest commit: 1f5d2e8062 by Hongxin Liu, 2023-10-18 11:05:25 +08:00
[hotfix] fix torch 2.0 compatibility (#4936)
  * [hotfix] fix launch
  * [test] fix test gemini optim
  * [shardformer] fix vit
Name                                  Last commit message                                          Last commit date
test_hybrid_parallel_grad_clip_norm/  [feature] Add clip_grad_norm for hybrid_parallel_plugin (#4837)  2023-10-12 11:32:37 +08:00
test_layer/                           [misc] update pre-commit and run all files (#4752)               2023-09-19 14:20:26 +08:00
test_model/                           [hotfix] fix torch 2.0 compatibility (#4936)                     2023-10-18 11:05:25 +08:00
__init__.py                           [shardformer] adapted T5 and LLaMa test to use kit (#4049)       2023-07-04 16:05:01 +08:00
test_shard_utils.py                   [misc] update pre-commit and run all files (#4752)               2023-09-19 14:20:26 +08:00
test_with_torch_ddp.py                [misc] update pre-commit and run all files (#4752)               2023-09-19 14:20:26 +08:00