ColossalAI/colossalai/booster/plugin
Latest commit: ed43a4be04 by YeAnbang, 2025-04-09 13:23:24 +08:00

[Distributed RLHF] Integration of PP (#6257)

* update help information
* update style
* fix
* minor fix
* support PP training
* add pp support
* remove unused code
* address conversation

Co-authored-by: Tong Li <tong.li35271158@gmail.com>
Files (latest commit and date):

__init__.py: [shardformer] fix the moe (#5883), 2024-07-03 20:02:19 +08:00
dp_plugin_base.py: [llama] fix dataloader for hybrid parallel (#5358), 2024-02-05 15:14:56 +08:00
gemini_plugin.py: [checkpointio] support load-pin overlap (#6177), 2025-01-07 16:16:04 +08:00
hybrid_parallel_plugin.py: [Distributed RLHF] Integration of PP (#6257), 2025-04-09 13:23:24 +08:00
low_level_zero_plugin.py: [shardformer] support pipeline for deepseek v3 and optimize lora save (#6188), 2025-02-14 14:48:54 +08:00
moe_hybrid_parallel_plugin.py: [shardformer] support ep for deepseek v3 (#6185), 2025-02-11 16:10:25 +08:00
plugin_base.py: [lora] add lora APIs for booster, support lora for TorchDDP (#4981), 2024-04-28 10:51:27 +08:00
pp_plugin_base.py: [misc] update pre-commit and run all files (#4752), 2023-09-19 14:20:26 +08:00
torch_ddp_plugin.py: [shardformer] support pipeline for deepseek v3 and optimize lora save (#6188), 2025-02-14 14:48:54 +08:00
torch_fsdp_plugin.py: [shardformer] support pipeline for deepseek v3 and optimize lora save (#6188), 2025-02-14 14:48:54 +08:00
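For orientation, the modules listed above implement the plugins consumed by ColossalAI's Booster API. Below is a minimal sketch of wiring HybridParallelPlugin (from hybrid_parallel_plugin.py, the file touched by the PP integration commit above) into a training setup. The model and the parallelism sizes (tp_size, pp_size, num_microbatches, zero_stage) are illustrative assumptions, not values taken from this repository, and constructor arguments can differ between releases.

import colossalai
import torch
from colossalai.booster import Booster
from colossalai.booster.plugin import HybridParallelPlugin

# Initialize the distributed environment (assumes the script is launched via torchrun).
colossalai.launch_from_torch()

# Toy model/optimizer/criterion as stand-ins for a real training job.
model = torch.nn.Linear(128, 128)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = torch.nn.MSELoss()

# Illustrative parallel layout: 2-way tensor parallel, 2-way pipeline parallel,
# plus ZeRO stage 1; num_microbatches is needed when pipeline parallelism is on.
plugin = HybridParallelPlugin(tp_size=2, pp_size=2, num_microbatches=2, zero_stage=1)
booster = Booster(plugin=plugin)

# booster.boost wraps the objects for the chosen parallelism and returns them.
model, optimizer, criterion, _, _ = booster.boost(model, optimizer, criterion)

Because every plugin in this directory follows the same plugin_base.py interface, swapping HybridParallelPlugin for, say, GeminiPlugin or TorchDDPPlugin changes the parallelization strategy without changing the surrounding training code.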