polish optimizer docstring (#619)

This commit is contained in:
ver217
2022-04-01 16:27:03 +08:00
committed by GitHub
parent 8432dc7080
commit e619a651fb
5 changed files with 65 additions and 82 deletions


@@ -44,8 +44,8 @@ class CPUAdam(torch.optim.Optimizer):
True for decoupled weight decay (also known as AdamW) (default: True)
simd_log (boolean, optional): whether to log whether SIMD is being used
for acceleration. (default: False)
- .. _Adam: A Method for Stochastic Optimization:
+ .. _Adam\: A Method for Stochastic Optimization:
https://arxiv.org/abs/1412.6980
.. _On the Convergence of Adam and Beyond:
https://openreview.net/forum?id=ryQu7f-RZ
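
The diff above escapes the colon inside the link name. In reStructuredText, `.. _name:` defines a hyperlink target, and the parser treats the first unescaped colon after the underscore as the end of the target name; a literal colon inside the name (as in the paper title "Adam: A Method for Stochastic Optimization") must therefore be written as `\:`. A minimal sketch of the corrected target as it would appear in the docstring:

```rst
.. _Adam\: A Method for Stochastic Optimization:
    https://arxiv.org/abs/1412.6980
```

With the backslash, Sphinx resolves references such as `` `Adam: A Method for Stochastic Optimization`_ `` to the arXiv URL instead of emitting a malformed-target warning.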