torch.optim.adam.adam
- torch.optim.adam.adam(params, grads, exp_avgs, exp_avg_sqs, max_exp_avg_sqs, state_steps, foreach=None, capturable=False, differentiable=False, fused=None, grad_scale=None, found_inf=None, has_complex=False, decoupled_weight_decay=False, *, amsgrad, beta1, beta2, lr, weight_decay, eps, maximize)
Functional API that performs the Adam algorithm computation.
See Adam for details.
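In typical use, the torch.optim.Adam optimizer class allocates these state tensors and calls this function from its step() method. The sketch below drives the functional API by hand for a single parameter, assuming CPU tensors and illustrative hyperparameter values (beta1=0.9, beta2=0.999, lr=1e-3, etc. are example choices; every keyword-only argument in the signature above is required and has no default).

```python
import torch
from torch.optim.adam import adam

# One parameter and its gradient. The functional API takes gradients
# explicitly instead of reading param.grad.
param = torch.randn(3)
grad = torch.randn(3)

# Per-parameter Adam state: first and second moment estimates and the
# step counter, which must be a singleton tensor in the functional API.
exp_avg = torch.zeros_like(param)
exp_avg_sq = torch.zeros_like(param)
step = torch.tensor(0.0)

adam(
    [param],
    [grad],
    [exp_avg],
    [exp_avg_sq],
    [],            # max_exp_avg_sqs is only consulted when amsgrad=True
    [step],
    amsgrad=False,
    beta1=0.9,     # illustrative hyperparameters; required, no defaults
    beta2=0.999,
    lr=1e-3,
    weight_decay=0.0,
    eps=1e-8,
    maximize=False,
)

print(param)  # the parameter is updated in place by one Adam step
```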