API

trio.AdamParams

class AdamParams(BaseModel):
    learning_rate: float = 0.0001
    beta1: float = 0.9
    beta2: float = 0.95
    eps: float = 1e-12
    weight_decay: float = 0.0

AdamParams is the hyperparameter configuration for the Adam optimizer. Pass it to TrainingClient.optim_step():

training_client.optim_step(AdamParams(learning_rate=1e-4)).result()

Parameters

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| learning_rate | float | 1e-4 | Learning rate |
| beta1 | float | 0.9 | Decay rate for the first-moment estimate |
| beta2 | float | 0.95 | Decay rate for the second-moment estimate |
| eps | float | 1e-12 | Numerical-stability term that prevents division by zero |
| weight_decay | float | 0.0 | Weight-decay coefficient (L2 regularization) |
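To show how these hyperparameters enter the Adam update, here is a minimal single-parameter sketch. This is an illustration only, not the library's actual implementation; `adam_step` is a hypothetical helper, and weight decay is applied as classic L2 regularization (added to the gradient), matching the description above.

```python
# AdamParams defaults, as listed in the table above.
learning_rate, beta1, beta2, eps, weight_decay = 1e-4, 0.9, 0.95, 1e-12, 0.0

def adam_step(param, grad, m, v, t):
    """One Adam update for a scalar parameter at timestep t (1-based).

    Returns the updated (param, m, v). Hypothetical helper for illustration.
    """
    grad = grad + weight_decay * param           # L2 regularization folded into the gradient
    m = beta1 * m + (1 - beta1) * grad           # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2      # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)                 # bias correction for m
    v_hat = v / (1 - beta2 ** t)                 # bias correction for v
    param = param - learning_rate * m_hat / (v_hat ** 0.5 + eps)
    return param, m, v
```

Note how `eps` only guards the division by the square root of the second moment; with the tiny default of 1e-12 it has essentially no effect except when gradients are near zero.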
