Conditional Gradient
optimizer_conditional_gradient(
learning_rate,
lambda_,
epsilon = 1e-07,
use_locking = FALSE,
name = "ConditionalGradient",
clipnorm = NULL,
clipvalue = NULL,
decay = NULL,
lr = NULL
)
learning_rate: A Tensor, a floating point value, or a schedule that is a tf$keras$optimizers$schedules$LearningRateSchedule. The learning rate.

lambda_: A Tensor or a floating point value. The constraint.

epsilon: A Tensor or a floating point value. A small constant for numerical stability when the norm of the gradient is zero.

use_locking: If TRUE, use locks for update operations.

name: Optional name prefix for the operations created when applying gradients. Defaults to "ConditionalGradient".

clipnorm: Gradients are clipped to this maximum norm.

clipvalue: Gradients are clipped element-wise to this maximum value.

decay: Included for backward compatibility to allow time-inverse decay of the learning rate.

lr: Included for backward compatibility; use learning_rate instead.
Optimizer for use with `keras::compile()`
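The Conditional Gradient (Frank-Wolfe style) step combines the scaled current weights with a step of length lambda_ along the negative normalized gradient. The following is a minimal NumPy sketch of that update rule, under the assumption that the optimizer applies the standard update variable <- learning_rate * variable - (1 - learning_rate) * lambda_ * gradient / (||gradient|| + epsilon); the function name `conditional_gradient_step` is illustrative, not part of the package API:

```python
import numpy as np

def conditional_gradient_step(var, grad, learning_rate, lambda_, epsilon=1e-7):
    # Illustrative sketch (not the package implementation):
    # var <- lr * var - (1 - lr) * lambda_ * grad / (||grad|| + epsilon)
    # epsilon guards against division by zero when the gradient norm is zero.
    norm = np.sqrt(np.sum(grad ** 2)) + epsilon
    return learning_rate * var - (1.0 - learning_rate) * lambda_ * grad / norm

# One update on a toy weight vector
w = np.array([1.0, -2.0, 3.0])
g = np.array([0.5, 0.5, -1.0])
w_new = conditional_gradient_step(w, g, learning_rate=0.9, lambda_=1.0)
```

Note the role of learning_rate here: it weights the previous iterate rather than scaling the raw gradient, so values close to 1 keep the weights near their current position while lambda_ bounds the size of the step direction.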