torchrl.trainers.algorithms.configs.utils.AdamConfig

class torchrl.trainers.algorithms.configs.utils.AdamConfig(lr: float = 0.001, betas: tuple[float, float] = (0.9, 0.999), eps: float = 0.0001, weight_decay: float = 0.0, amsgrad: bool = False, _target_: str = 'torch.optim.Adam', _partial_: bool = True)

Configuration class for the Adam optimizer.
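The `_target_` and `_partial_` fields follow the Hydra instantiation convention: with `_partial_=True`, instantiating the config yields a `functools.partial` wrapping `torch.optim.Adam` with the configured hyperparameters, so the model parameters can be supplied later. A minimal sketch of this pattern, assuming `hydra-core` is installed; the module and the hyperparameter values are illustrative, not from the source:

```python
import torch
from hydra.utils import instantiate

from torchrl.trainers.algorithms.configs.utils import AdamConfig

# Build the config with a few overridden hyperparameters (values are examples).
cfg = AdamConfig(lr=3e-4, weight_decay=1e-5)

# Because _partial_=True, instantiate() returns a factory
# (functools.partial of torch.optim.Adam), not an optimizer instance.
optimizer_factory = instantiate(cfg)

# Toy module used for illustration; bind its parameters to get the optimizer.
model = torch.nn.Linear(4, 2)
optimizer = optimizer_factory(model.parameters())
```

Deferring parameter binding this way lets a trainer own the config while the model is constructed elsewhere, which is the usual reason `_partial_` defaults to `True` here.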