Empirical Evaluation of Rectified Activations in Convolutional Network
The function is defined as:
$$
\text{RReLU}(x) =
\begin{cases}
x & \text{if } x \geq 0 \\
ax & \text{otherwise}
\end{cases}
$$
where \(a\) is randomly sampled from the uniform distribution
\(\mathcal{U}(\text{lower}, \text{upper})\) during training; at test time,
the paper fixes \(a\) to the midpoint \((\text{lower} + \text{upper})/2\).
See: https://arxiv.org/pdf/1505.00853.pdf
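A minimal NumPy sketch of this behavior follows. The `rrelu` helper, its `training` flag, and the default `lower`/`upper` bounds are illustrative assumptions for this sketch, not part of the definition above:

```python
import numpy as np

def rrelu(x, lower=1/8, upper=1/3, training=True, rng=None):
    """Randomized leaky ReLU (illustrative helper, not a library API).

    During training, the negative slope `a` is drawn per element from
    U(lower, upper); at test time it is fixed to (lower + upper) / 2,
    following the referenced paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    if training:
        # Sample an independent slope for each element of x.
        a = rng.uniform(lower, upper, size=x.shape)
    else:
        # Deterministic midpoint slope for evaluation.
        a = (lower + upper) / 2.0
    # Identity for non-negative inputs, scaled by `a` otherwise.
    return np.where(x >= 0, x, a * x)

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(rrelu(x, training=False))  # negative values scaled by (1/8 + 1/3) / 2
```

Sampling `a` per element (rather than once per call) matches the element-wise randomization described in the paper; at evaluation the midpoint slope makes the function deterministic.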