torch (version 0.2.1)

optim_adadelta: Adadelta optimizer

Description

Implements the Adadelta algorithm, as proposed in ADADELTA: An Adaptive Learning Rate Method (Zeiler, 2012).

Usage

optim_adadelta(params, lr = 1, rho = 0.9, eps = 1e-06, weight_decay = 0)

Arguments

params

(iterable): list of parameters to optimize or list defining parameter groups

lr

(float, optional): learning rate, applied as a scaling factor on the computed update (default: 1)

rho

(float, optional): coefficient used for computing a running average of squared gradients (default: 0.9)

eps

(float, optional): term added to the denominator to improve numerical stability (default: 1e-6)

weight_decay

(float, optional): weight decay (L2 penalty) (default: 0)
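For reference, a sketch of the update the arguments above control, following the Zeiler (2012) paper (with the learning-rate scaling that torch applies to the final delta; the symbols g, x, and E[.] are the paper's notation, not package identifiers):

```latex
\begin{aligned}
E[g^2]_t &= \rho\, E[g^2]_{t-1} + (1-\rho)\, g_t^2 \\
\Delta x_t &= -\,\frac{\sqrt{E[\Delta x^2]_{t-1} + \epsilon}}{\sqrt{E[g^2]_t + \epsilon}}\; g_t \\
E[\Delta x^2]_t &= \rho\, E[\Delta x^2]_{t-1} + (1-\rho)\, \Delta x_t^2 \\
x_{t+1} &= x_t + \mathrm{lr} \cdot \Delta x_t
\end{aligned}
```

Here rho controls both running averages, eps guards the square roots against division by zero, and a nonzero weight_decay adds an L2 term to the gradient g_t before the update.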

Examples

if (torch_is_installed()) {
  # A small model and synthetic data so the example is self-contained;
  # any nn_module and loss function work the same way.
  model <- nn_linear(10, 1)
  input <- torch_randn(8, 10)
  target <- torch_randn(8, 1)

  optimizer <- optim_adadelta(model$parameters, lr = 0.1)

  optimizer$zero_grad()
  loss <- nnf_mse_loss(model(input), target)
  loss$backward()
  optimizer$step()
}