activations module
Activation functions
Element-wise transformations without trainable parameters
-
class BiLU(alpha=1.0, delta=0.9)[source]
Bases: torch.nn.modules.module.Module
Bijective linear unit
f(x) = (alpha + delta*sign(x)) * x = alpha*x + delta*|x|, a piecewise-linear map with slope alpha + delta for positive inputs and slope alpha - delta for negative inputs; it is strictly increasing, and hence bijective, whenever alpha > |delta|.
-
forward(input)[source]
Output of the BiLU activation function
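A minimal sketch of how BiLU might be implemented and used, assuming the piecewise-linear form f(x) = (alpha + delta*sign(x)) * x given above; the class and parameter names follow this page, but the body is illustrative rather than the library's actual source:

import torch
import torch.nn as nn

class BiLU(nn.Module):
    # Illustrative bijective linear unit, assuming the formula above:
    # slope (alpha + delta) for positive inputs, (alpha - delta) for negative ones.
    def __init__(self, alpha=1.0, delta=0.9):
        super().__init__()
        self.alpha = alpha
        self.delta = delta

    def forward(self, input):
        # Element-wise, no trainable parameters; invertible when alpha > |delta|.
        return (self.alpha + self.delta * torch.sign(input)) * input

x = torch.linspace(-2.0, 2.0, 5)
print(BiLU()(x))  # tensor([-0.2000, -0.1000, 0.0000, 1.9000, 3.8000])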
-
class NormBiTanh(alpha=0.3)[source]
Bases: torch.nn.modules.module.Module
Bijective normalized Tanh layer
f(x) = alpha*tanh(x) + (1 - alpha)*x, a convex blend of tanh and the identity; its derivative alpha*(1 - tanh(x)^2) + (1 - alpha) is strictly positive for 0 <= alpha < 1, so the activation is strictly increasing and hence bijective.
-
forward(input)[source]
Output of the NormBiTanh activation function
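A corresponding sketch for NormBiTanh, following the blend of tanh and identity stated above; again illustrative rather than the library's actual source:

import torch
import torch.nn as nn

class NormBiTanh(nn.Module):
    # Illustrative bijective normalized tanh: a convex blend of tanh(x) and x.
    def __init__(self, alpha=0.3):
        super().__init__()
        self.alpha = alpha

    def forward(self, input):
        # f'(x) = alpha*(1 - tanh(x)**2) + (1 - alpha) >= 1 - alpha > 0,
        # so the map is strictly increasing and invertible for alpha < 1.
        return self.alpha * torch.tanh(input) + (1.0 - self.alpha) * input

x = torch.linspace(-2.0, 2.0, 5)
print(NormBiTanh()(x))  # tensor([-1.6892, -0.9285, 0.0000, 0.9285, 1.6892])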