activation_hard_silu: Hard SiLU activation function, also known as Hard Swish.
Description
It is defined as:

- `0` if `x < -3`
- `x` if `x > 3`
- `x * (x + 3) / 6` if `-3 <= x <= 3`

It is a faster approximation of the silu activation, obtained by replacing the sigmoid with the piecewise-linear hard sigmoid.
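The piecewise definition above can be sketched numerically. This is not the keras3 implementation, just a minimal NumPy illustration; the function name `hard_silu` is chosen for this example:

```python
import numpy as np

def hard_silu(x):
    # Hard SiLU / Hard Swish: x times the hard sigmoid,
    # i.e. 0 for x < -3, x for x > 3, and x * (x + 3) / 6 in between.
    x = np.asarray(x, dtype=float)
    return x * np.clip((x + 3.0) / 6.0, 0.0, 1.0)

print(hard_silu([-4.0, -3.0, 0.0, 1.0, 3.0, 4.0]))
```

The `clip` call implements the hard sigmoid `(x + 3) / 6` saturated to `[0, 1]`, which reproduces all three pieces of the definition at once.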
Usage
activation_hard_silu(x)

activation_hard_swish(x)
Value
A tensor, the result of applying the activation to the input tensor `x`.