Alpha Dropout is a dropout variant that keeps the mean and variance of its inputs at their original values, in order to ensure the self-normalizing property even after this dropout.
Usage

layer_alpha_dropout(object, rate, noise_shape = NULL, seed = NULL, ...)
Arguments

object
What to compose the new Layer instance with. Typically a Sequential model or a Tensor (e.g., as returned by layer_input()). The return value depends on object. If object is:
- missing or NULL, the Layer instance is returned.
- a Sequential model, the model with an additional layer is returned.
- a Tensor, the output tensor from layer_instance(object) is returned.
(A sketch of these three modes follows the argument list.)
rate
Float, drop probability (as with layer_dropout()). The multiplicative noise will have standard deviation sqrt(rate / (1 - rate)); a numeric check follows the argument list.
noise_shape
A 1-D integer tensor representing the shape of the randomly generated keep/drop flags.
seed
An integer to use as random seed.
...
Standard layer arguments.
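For illustration, a minimal sketch of the three composition modes described under object (the rate and shapes here are arbitrary choices, not values from this page):

library(keras)

# object missing: the Layer instance itself is returned
lyr <- layer_alpha_dropout(rate = 0.1)

# object is a Sequential model: the model, with the layer appended, is returned
model <- keras_model_sequential() %>%
  layer_dense(units = 32, input_shape = c(16)) %>%
  layer_alpha_dropout(rate = 0.1)

# object is a Tensor (as returned by layer_input()): the output tensor is returned
inputs  <- layer_input(shape = c(16))
outputs <- inputs %>% layer_alpha_dropout(rate = 0.1)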
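And a quick numeric check of the noise formula for rate (the rate value is arbitrary):

rate <- 0.25
sqrt(rate / (1 - rate))  # standard deviation of the multiplicative noise, approx. 0.577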
Input shape

Arbitrary. Use the keyword argument input_shape (a list of integers, which does not include the samples axis) when using this layer as the first layer in a model.
Output shape

Same shape as input.
Details

Alpha Dropout fits well with Scaled Exponential Linear Units (SELU) by randomly setting activations to the negative saturation value.
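As a minimal sketch of that pairing (the layer sizes and dropout rate are illustrative assumptions, not values from this page):

library(keras)

# A self-normalizing stack: selu activations with lecun_normal initialization,
# regularized by alpha dropout so activations stay roughly zero-mean, unit-variance.
model <- keras_model_sequential() %>%
  layer_dense(units = 64, activation = "selu",
              kernel_initializer = "lecun_normal",
              input_shape = c(20)) %>%
  layer_alpha_dropout(rate = 0.1) %>%
  layer_dense(units = 64, activation = "selu",
              kernel_initializer = "lecun_normal") %>%
  layer_alpha_dropout(rate = 0.1) %>%
  layer_dense(units = 1)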
See also

https://www.tensorflow.org/api_docs/python/tf/keras/layers/AlphaDropout
Other noise layers: layer_gaussian_dropout(), layer_gaussian_noise()