ColossalAI/colossalai/zero/__init__.py

from .gemini import GeminiAdamOptimizer, GeminiDDP, GeminiOptimizer, get_static_torch_model
from .low_level import LowLevelZeroOptimizer
from .wrapper import zero_model_wrapper, zero_optim_wrapper

__all__ = [
    "GeminiDDP",
    "GeminiOptimizer",
    "GeminiAdamOptimizer",
    "zero_model_wrapper",
    "zero_optim_wrapper",
    "LowLevelZeroOptimizer",
    "get_static_torch_model",
]
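
The names exported above form the public ZeRO API of this package. Below is a minimal, hedged usage sketch of the two wrapper helpers; the argument name `zero_stage` and the setup steps reflect common ColossalAI usage and may differ between versions, so treat them as illustrative rather than authoritative.

# Minimal usage sketch (illustrative only). Assumes the distributed environment
# has already been initialised, e.g. via colossalai.launch(...), and that the
# wrapper signatures match those in colossalai/zero/wrapper.py for this version.
import torch
import torch.nn as nn

from colossalai.zero import zero_model_wrapper, zero_optim_wrapper

# Start from an ordinary PyTorch model and optimizer.
model = nn.Linear(1024, 1024).cuda()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)

# Shard the model for ZeRO; stage 3 uses the Gemini chunk-based memory manager.
model = zero_model_wrapper(model, zero_stage=3)

# Wrap the optimizer so its states are partitioned consistently with the model.
optimizer = zero_optim_wrapper(model, optimizer)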