[optimizer] add div_scale for optimizers (#2117)

* [optimizer] add div_scale for optimizers

* [zero] use div_scale in zero optimizer

* fix testing error
Author: HELSON
Date: 2022-12-12 17:58:57 +08:00 (committed via GitHub)
Parent: e5aa8333e4
Commit: e7d3afc9cc
8 changed files with 41 additions and 32 deletions


@@ -71,7 +71,7 @@ def test_adam(adamw, step, p_dtype, g_dtype):
     weight_decay = 0
     multi_tensor_applier(fused_adam, dummy_overflow_buf, [[g], [p], [m], [v]], lr, beta1, beta2, eps, step, adamw,
-                         True, weight_decay)
+                         True, weight_decay, -1)
     torch_adam_update(
         step,
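The test change above passes an extra trailing argument (`-1`) to the fused Adam kernel, which this PR introduces as a `div_scale` parameter. A minimal plain-PyTorch sketch of what such a parameter plausibly does, assuming the convention that gradients are divided by `div_scale` before the moment updates (useful for unscaling loss-scaled gradients in ZeRO training) and that a negative value such as `-1` disables the scaling; the helper name and exact semantics are assumptions for illustration, not the fused kernel's actual implementation:

```python
import torch

def adam_step_with_div_scale(p, g, m, v, step, lr=1e-3, beta1=0.9,
                             beta2=0.999, eps=1e-8, div_scale=-1.0):
    """Sketch of one Adam step with an optional div_scale (hypothetical helper).

    When div_scale > 0, the gradient is divided by it before the moment
    updates, mirroring how a loss-scaled trainer would unscale gradients
    inside the kernel. div_scale = -1 (the default passed in the diff)
    means "no scaling".
    """
    if div_scale > 0:
        g = g / div_scale                       # unscale the gradient
    m.mul_(beta1).add_(g, alpha=1 - beta1)      # first-moment EMA
    v.mul_(beta2).addcmul_(g, g, value=1 - beta2)  # second-moment EMA
    bias_c1 = 1 - beta1 ** step                 # bias corrections
    bias_c2 = 1 - beta2 ** step
    denom = (v / bias_c2).sqrt().add_(eps)
    p.addcdiv_(m / bias_c1, denom, value=-lr)   # in-place parameter update
    return p
```

Under this convention, calling the step with `div_scale=s` on a gradient `g` is equivalent to calling it with `div_scale=-1` on `g / s`, which is why the pre-existing test call can simply append `-1` and keep its previous behavior.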