tfaddons (version 0.10.0)

extend_with_decoupled_weight_decay: Factory function returning an optimizer class with decoupled weight decay

Description

Factory function returning an optimizer class with decoupled weight decay

Usage

extend_with_decoupled_weight_decay(base_optimizer)

Arguments

base_optimizer

An optimizer class that inherits from tf$optimizers$Optimizer.

Value

A new optimizer class that inherits from DecoupledWeightDecayExtension and base_optimizer.

Details

The API of the new optimizer class slightly differs from the API of the base optimizer:

- The first argument to the constructor is the weight decay rate.
- minimize and apply_gradients accept the optional keyword argument decay_var_list, which specifies the variables that should be decayed. If NULL, all variables that are optimized are decayed.
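
For instance, SGD can be extended in the same way (a minimal sketch, assuming the tensorflow and tfaddons packages are attached; SGDW is just an illustrative name):

### create an SGD variant with decoupled weight decay
SGDW = extend_with_decoupled_weight_decay(tf$keras$optimizers$SGD)
### the weight decay rate is the first constructor argument
opt = SGDW(weight_decay = 1e-4, learning_rate = 0.01)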

Examples

### MyAdamW is a new class
MyAdamW = extend_with_decoupled_weight_decay(tf$keras$optimizers$Adam)
### Create a MyAdamW object
optimizer = MyAdamW(weight_decay = 0.001, learning_rate = 0.001)
#### update var1, var2 but only decay var1
#### (var1 and var2 are assumed to be tf$Variable objects, and loss a
#### callable returning the loss tensor)
optimizer$minimize(loss, var_list = list(var1, var2), decay_var_list = list(var1))
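
The decay_var_list argument works the same way with apply_gradients. A sketch, assuming grads is a list holding the gradients of the loss with respect to var1 and var2 (e.g. computed with tf$GradientTape):

#### apply precomputed gradients, again decaying only var1
#### (grads is a hypothetical list of gradient tensors for var1, var2)
optimizer$apply_gradients(
  list(list(grads[[1]], var1), list(grads[[2]], var2)),
  decay_var_list = list(var1)
)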
