Mirror of https://github.com/hpcaitech/ColossalAI.git, synced 2025-05-31 03:15:40 +00:00.
[doc] add warning about fsdp plugin (#3813)
commit 19d153057e
parent 6b305a99d6
@@ -62,6 +62,7 @@ More details can be found in [Pytorch Docs](https://pytorch.org/docs/main/genera

### Torch FSDP Plugin

> ⚠ This plugin is not available when torch version is lower than 1.12.0.

+> ⚠ This plugin does not support save/load sharded model checkpoint now.

More details can be found in [Pytorch Docs](https://pytorch.org/docs/main/fsdp.html).
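For context, the sketch below shows one way the two warnings could be respected when using ColossalAI's `TorchFSDPPlugin` through the `Booster` API. It is a minimal sketch under assumed API shapes (the plugin constructor arguments, `launch_from_torch`, and `save_model(..., shard=False)`), not code taken from this commit; the placeholder model and checkpoint path are hypothetical.

```python
# Minimal sketch (assumed API shapes) illustrating the two doc warnings:
# guard on torch >= 1.12.0, and save only an unsharded checkpoint because
# sharded checkpoint save/load is stated as unsupported by this plugin.
import torch
from packaging import version

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchFSDPPlugin

if version.parse(torch.__version__) < version.parse("1.12.0"):
    raise RuntimeError("TorchFSDPPlugin requires torch >= 1.12.0")

colossalai.launch_from_torch(config={})  # assumes a torchrun-style launch

plugin = TorchFSDPPlugin()               # constructor arguments omitted
booster = Booster(plugin=plugin)

model = torch.nn.Linear(16, 16)          # hypothetical placeholder model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
model, optimizer, *_ = booster.boost(model, optimizer)

# Keep shard=False: the warning above says sharded model checkpoints are
# not yet supported by the FSDP plugin.
booster.save_model(model, "model.pt", shard=False)
```

If the installed ColossalAI release exposes slightly different parameters, the same idea still applies: avoid sharded checkpoint save/load with this plugin until support is added.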
@@ -62,6 +62,7 @@ Zero-2 does not support local gradient accumulation. If you insist on using it, although it is possible to accumulate

### Torch FSDP Plugin

> ⚠ This plugin is not available if the torch version is lower than 1.12.0.

+> ⚠ This plugin does not yet support saving/loading sharded model checkpoints.

For more details, please refer to the [Pytorch Docs](https://pytorch.org/docs/main/fsdp.html).