From 19d153057efebf65259fdc21f582ff2e55dbe2ec Mon Sep 17 00:00:00 2001
From: Hongxin Liu
Date: Tue, 23 May 2023 17:16:10 +0800
Subject: [PATCH] [doc] add warning about fsdp plugin (#3813)

---
 docs/source/en/basics/booster_plugins.md      | 1 +
 docs/source/zh-Hans/basics/booster_plugins.md | 1 +
 2 files changed, 2 insertions(+)

diff --git a/docs/source/en/basics/booster_plugins.md b/docs/source/en/basics/booster_plugins.md
index 0362f095a..6ed49bfa7 100644
--- a/docs/source/en/basics/booster_plugins.md
+++ b/docs/source/en/basics/booster_plugins.md
@@ -62,6 +62,7 @@ More details can be found in [Pytorch Docs](https://pytorch.org/docs/main/genera
 ### Torch FSDP Plugin
 
 > ⚠ This plugin is not available when torch version is lower than 1.12.0.
+> ⚠ This plugin does not support save/load sharded model checkpoint now.
 
 More details can be found in [Pytorch Docs](https://pytorch.org/docs/main/fsdp.html).
 
diff --git a/docs/source/zh-Hans/basics/booster_plugins.md b/docs/source/zh-Hans/basics/booster_plugins.md
index b15ceb1e3..00e7d91e3 100644
--- a/docs/source/zh-Hans/basics/booster_plugins.md
+++ b/docs/source/zh-Hans/basics/booster_plugins.md
@@ -62,6 +62,7 @@ Zero-2 不支持局部梯度累积。如果您坚持使用，虽然可以积累
 ### Torch FSDP 插件
 
 > ⚠ 如果 torch 版本低于 1.12.0，此插件将不可用。
+> ⚠ 该插件现在还不支持保存/加载分片的模型 checkpoint。
 
 更多详细信息，请参阅 [Pytorch 文档](https://pytorch.org/docs/main/fsdp.html).
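
For context on the warning this patch adds, below is a minimal sketch of how a user would work within the limitation: with the Torch FSDP plugin, checkpoints are saved unsharded rather than sharded. The `Booster`, `TorchFSDPPlugin`, `boost`, and `save_model` names, the launch call, and the file path are assumptions about the ColossalAI booster interface at this point in time, not part of the patch itself.

```python
# Hypothetical sketch only: the API names and path below are assumptions
# about the ColossalAI booster interface, not taken from this patch.
import torch
import torch.nn as nn

import colossalai
from colossalai.booster import Booster
from colossalai.booster.plugin import TorchFSDPPlugin

# Requires a torch.distributed environment, e.g. launched via `torchrun`.
colossalai.launch_from_torch(config={})

plugin = TorchFSDPPlugin()
booster = Booster(plugin=plugin)

model = nn.Linear(16, 4)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Wrap the model and optimizer with FSDP through the booster.
model, optimizer, *_ = booster.boost(model, optimizer)

# Per the warning added above, keep the checkpoint unsharded (the default);
# sharded save/load is not supported by this plugin yet.
booster.save_model(model, "model.pt")
```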