deeplearning (version 0.1.0)
batch_normalization: Batch normalization function that normalizes the input before applying the non-linearity
Description
This function normalizes the distribution of inputs to the hidden layers of a neural network. Each input is normalized using the batch mean mu and variance sigma_2, then scaled by gamma and shifted by beta: y = gamma * (x - mu) / sqrt(sigma_2 + epsilon) + beta.
Usage
batch_normalization(x, gamma, beta, mu = NULL, sigma_2 = NULL, epsilon = exp(-12))
Arguments
x
weighted sum of outputs from the previous layer
gamma
the gamma (scale) coefficient
beta
the beta (shift) coefficient
mu
the mean of the input neurons. If NULL, it will be calculated in the function.
sigma_2
the variance of the input neurons. If NULL, it will be calculated in the function.
epsilon
a constant added to the variance for numerical stability
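A minimal sketch of the batch normalization transform described above, assuming x is a matrix with one row per training case and one column per neuron. bn_sketch is a hypothetical helper written for illustration only and is not the package's internal implementation:

# Illustrative sketch of the batch normalization transform (Ioffe & Szegedy, 2015);
# the deeplearning package's internals may differ.
bn_sketch <- function(x, gamma, beta, mu = NULL, sigma_2 = NULL, epsilon = exp(-12)) {
  if (is.null(mu))      mu      <- colMeans(x)       # per-neuron mini-batch mean
  if (is.null(sigma_2)) sigma_2 <- apply(x, 2, var)  # per-neuron mini-batch variance
  x_hat <- sweep(sweep(x, 2, mu, "-"), 2, sqrt(sigma_2 + epsilon), "/")  # normalize
  sweep(sweep(x_hat, 2, gamma, "*"), 2, beta, "+")   # scale by gamma, shift by beta
}

# Example: a 4-case mini-batch feeding a layer of 3 neurons
x <- matrix(rnorm(12), nrow = 4, ncol = 3)
bn_sketch(x, gamma = rep(1, 3), beta = rep(0, 3))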
References
Ioffe, S. and Szegedy, C. (2015). Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. Proceedings of the 32nd International Conference on Machine Learning (ICML), PMLR 37.
See Also
http://jmlr.org/proceedings/papers/v37/ioffe15.pdf, p. 4