Applies the Gaussian Error Linear Units function: $$\mbox{GELU}(x) = x * \Phi(x)$$ where \(\Phi(x)\) is the Cumulative Distribution Function of the Gaussian distribution.
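Since \(\Phi(x) = 0.5\,(1 + \mbox{erf}(x/\sqrt{2}))\), the module's output can be checked against the definition directly. The snippet below is a minimal sketch, assuming the torch package is installed, that compares nn_gelu() with the closed form computed via torch_erf():

if (torch_is_installed()) {
  x <- torch_randn(5)
  # Phi(x) = 0.5 * (1 + erf(x / sqrt(2))), so GELU(x) = x * Phi(x)
  manual <- x * 0.5 * (1 + torch_erf(x / sqrt(2)))
  torch_allclose(manual, nn_gelu()(x))  # should be TRUE up to floating point tolerance
}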
Usage:
nn_gelu(approximate = "none")
Arguments:
approximate: the GELU approximation algorithm to use: 'none' or 'tanh'. Default: 'none'. (A sketch comparing the two options follows below.)
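For reference, the 'tanh' option uses the tanh-based approximation \(0.5\,x\,(1 + \tanh(\sqrt{2/\pi}\,(x + 0.044715\,x^3)))\), the same approximation documented for PyTorch's GELU. The following is a rough sketch, assuming torch is installed, comparing the two settings:

if (torch_is_installed()) {
  x <- torch_randn(5)
  exact  <- nn_gelu(approximate = "none")(x)   # exact erf-based GELU
  approx <- nn_gelu(approximate = "tanh")(x)   # tanh approximation
  # manual tanh formula: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
  manual <- 0.5 * x * (1 + torch_tanh(sqrt(2 / pi) * (x + 0.044715 * x^3)))
  torch_allclose(approx, manual)  # should be TRUE up to floating point tolerance
}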
Shape:
Input: \((N, *)\), where \(*\) means any number of additional dimensions
Output: \((N, *)\), same shape as the input
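GELU is applied elementwise, so the input shape is preserved exactly. A quick sketch, assuming torch is installed, with an illustrative \((16, 3, 32)\) tensor:

if (torch_is_installed()) {
  m <- nn_gelu()
  x <- torch_randn(16, 3, 32)  # (N, *) = (16, 3, 32); sizes are arbitrary
  y <- m(x)
  dim(y)                       # 16 3 32, same shape as the input
}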
Examples:
if (torch_is_installed()) {
  m <- nn_gelu()            # GELU activation module
  input <- torch_randn(2)   # random input tensor with 2 elements
  output <- m(input)        # apply GELU elementwise
}
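As a further illustration, nn_gelu() is typically used as an activation between layers. The following is a minimal sketch, assuming torch is installed; the layer sizes are made up purely for illustration:

if (torch_is_installed()) {
  net <- nn_sequential(
    nn_linear(8, 16),
    nn_gelu(),              # GELU activation between the linear layers
    nn_linear(16, 1)
  )
  x <- torch_randn(4, 8)    # batch of 4 samples, 8 features each
  net(x)                    # a 4 x 1 output tensor
}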