tfaddons (version 0.10.0)

layer_activation_gelu: Gaussian Error Linear Unit

Description

Gaussian Error Linear Unit (GELU) activation layer.

Usage

layer_activation_gelu(object, approximate = TRUE, ...)

Arguments

object

Model or layer object

approximate

(bool) Whether to use the tanh-based approximation of GELU instead of the exact formulation. Defaults to TRUE.

...

Additional parameters to pass to the layer.

Value

A tensor

Details

A smoother version of ReLU, commonly used in BERT and BERT-based architectures. The exact form is gelu(x) = x * Φ(x), where Φ is the standard normal cumulative distribution function; with approximate = TRUE, the tanh approximation 0.5 * x * (1 + tanh(sqrt(2 / pi) * (x + 0.044715 * x^3))) is used instead. Original paper: https://arxiv.org/abs/1606.08415
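
Examples

A minimal usage sketch, assuming the keras and tfaddons R packages are installed and loaded; the surrounding model layers are illustrative only:

```r
# Illustrative sketch: insert a GELU activation between two dense layers.
library(keras)
library(tfaddons)

model <- keras_model_sequential() %>%
  layer_dense(units = 64, input_shape = 10) %>%
  # approximate = TRUE selects the tanh-based approximation (the default)
  layer_activation_gelu(approximate = TRUE) %>%
  layer_dense(units = 1)
```

Because the first argument is `object`, the layer composes with the standard keras pipe (`%>%`) style shown above.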