[booster] add warning for torch fsdp plugin doc (#3833)

This commit is contained in:
wukong1992
2023-05-25 14:00:02 +08:00
committed by GitHub
parent 84500b7799
commit 3229f93e30
3 changed files with 12 additions and 1 deletions


@@ -62,8 +62,11 @@ More details can be found in [Pytorch Docs](https://pytorch.org/docs/main/genera
### Torch FSDP Plugin
> ⚠ This plugin is not available when the torch version is lower than 1.12.0.
> ⚠ This plugin does not yet support saving/loading sharded model checkpoints.
> ⚠ This plugin does not yet support optimizers that use multiple parameter groups.
More details can be found in [Pytorch Docs](https://pytorch.org/docs/main/fsdp.html).
{{ autodoc:colossalai.booster.plugin.TorchFSDPPlugin }}
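
The "multiple parameter groups" limitation refers to optimizers constructed from a list of per-group dicts rather than a single iterable of parameters. A minimal sketch in plain PyTorch (the model and learning rates are illustrative, not from the plugin's API):

```python
import torch

model = torch.nn.Linear(4, 2)

# Supported: an optimizer with a single parameter group.
opt_ok = torch.optim.SGD(model.parameters(), lr=0.1)

# NOT supported by TorchFSDPPlugin: an optimizer with multiple
# parameter groups (e.g. different learning rates per group).
opt_multi = torch.optim.SGD([
    {"params": [model.weight], "lr": 0.1},
    {"params": [model.bias], "lr": 0.01},
])

print(len(opt_ok.param_groups))     # 1
print(len(opt_multi.param_groups))  # 2
```

If you need per-group hyperparameters, use a single group with this plugin or choose a different plugin.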


@@ -62,8 +62,11 @@ Zero-2 does not support local gradient accumulation. If you insist on using it, although gradients can be accumulated
### Torch FSDP Plugin
> ⚠ This plugin is not available when the torch version is lower than 1.12.0.
> ⚠ This plugin does not yet support saving/loading sharded model checkpoints.
> ⚠ This plugin does not yet support optimizers that use multiple parameter groups.
More details can be found in [Pytorch Docs](https://pytorch.org/docs/main/fsdp.html).
{{ autodoc:colossalai.booster.plugin.TorchFSDPPlugin }}