keras (version 2.2.4)

optimizer_adadelta: Adadelta optimizer.

Description

Adadelta optimizer, as described in ADADELTA: An Adaptive Learning Rate Method (Zeiler, 2012).
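As a brief sketch (notation mine, not part of the package documentation): the optimizer keeps running averages of squared gradients and squared parameter updates, governed by the decay factor rho and stabilized by epsilon, and the Keras implementation scales the resulting step by lr:

$$E[g^2]_t = \rho\, E[g^2]_{t-1} + (1 - \rho)\, g_t^2$$
$$\Delta\theta_t = -\frac{\sqrt{E[\Delta\theta^2]_{t-1} + \epsilon}}{\sqrt{E[g^2]_t + \epsilon}}\, g_t$$
$$E[\Delta\theta^2]_t = \rho\, E[\Delta\theta^2]_{t-1} + (1 - \rho)\, \Delta\theta_t^2$$
$$\theta_{t+1} = \theta_t + \mathrm{lr} \cdot \Delta\theta_t$$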

Usage

optimizer_adadelta(lr = 1, rho = 0.95, epsilon = NULL, decay = 0,
  clipnorm = NULL, clipvalue = NULL)

Arguments

lr

float >= 0. Learning rate.

rho

float >= 0. Decay factor for the running averages of squared gradients and squared parameter updates.

epsilon

float >= 0. Fuzz factor. If NULL, defaults to k_epsilon().

decay

float >= 0. Learning rate decay over each update.

clipnorm

Gradients will be clipped when their L2 norm exceeds this value.

clipvalue

Gradients will be clipped when their absolute value exceeds this value.

See Also

Other optimizers: optimizer_adagrad, optimizer_adam, optimizer_adamax, optimizer_nadam, optimizer_rmsprop, optimizer_sgd
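
Examples

A minimal usage sketch, assuming the keras R package is installed and attached; the model architecture, input shape, loss, and metric below are illustrative placeholders, not part of this help page.

library(keras)

# Small illustrative model (placeholder architecture)
model <- keras_model_sequential() %>%
  layer_dense(units = 32, activation = "relu", input_shape = c(784)) %>%
  layer_dense(units = 10, activation = "softmax")

# Compile with Adadelta at its documented defaults (lr = 1, rho = 0.95)
model %>% compile(
  loss = "categorical_crossentropy",
  optimizer = optimizer_adadelta(),
  metrics = "accuracy"
)

# Gradient clipping can be requested through clipnorm or clipvalue
opt <- optimizer_adadelta(lr = 1, rho = 0.95, clipnorm = 1)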