github / ColossalAI (mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2025-09-02 17:46:42 +00:00)
Branch: feature/2-stage
Path: ColossalAI / colossalai / testing

Latest commit: 58df720570 by Frank Lee, 2023-07-04 16:05:01 +08:00
[shardformer] adapted T5 and LLaMa test to use kit (#4049)
* [shardformer] adapted T5 and LLaMa test to use kit
* polish code
File               Last commit                                                 Date
__init__.py        [shardformer] supported T5 and its variants (#4045)         2023-07-04 16:05:01 +08:00
comparison.py      [shardformer] adapted T5 and LLaMa test to use kit (#4049)  2023-07-04 16:05:01 +08:00
pytest_wrapper.py  [testing] move pytest to be inside the function (#4087)     2023-06-27 11:02:25 +08:00
random.py          [zero] test gradient accumulation (#1964)                   2022-11-29 13:00:30 +08:00
utils.py           [booster] support torch fsdp plugin in booster (#3697)      2023-05-15 12:14:38 +08:00
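The commit title on pytest_wrapper.py ("move pytest to be inside the function", #4087) describes a lazy-import pattern: pulling the `import pytest` statement into the function body so that merely importing the wrapper module does not require pytest to be installed. The sketch below is a hypothetical illustration of that pattern, not the actual contents of ColossalAI's pytest_wrapper.py; the function name `run_pytest` and its parameters are assumptions made here for illustration.

```python
# Hypothetical sketch of the lazy-import pattern suggested by the
# pytest_wrapper.py commit title; NOT the actual ColossalAI code.

def run_pytest(test_paths=None, extra_args=None):
    """Invoke pytest programmatically.

    pytest is imported inside the function, not at module level, so
    importing this module succeeds even when pytest is absent; the
    dependency is only needed when the wrapper is actually called.
    """
    import pytest  # lazy import: deferred until the function runs

    args = list(extra_args or [])
    if test_paths:
        args.extend(str(p) for p in test_paths)
    # pytest.main() runs the collected tests and returns an exit code
    return pytest.main(args)
```

The trade-off is standard for optional test dependencies: the import cost and the ImportError, if any, are paid at call time rather than at module import time.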