Activation functions
The activation functions listed below are provided by default; see how to add more.
Sigmoid is the default activation function. This function outputs a value in the range [0,1].
Definition: sigmoid(x) = 1 / (1 + e^(-x))
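As a rough illustration (a minimal sketch, not the library's own code; the name `sigmoid` is just for the example), the function can be written in a few lines of TypeScript:

```typescript
// Minimal sketch of the sigmoid activation (illustrative, not the library's code).
// Squashes any real input into the 0..1 range.
function sigmoid(x: number): number {
  return 1 / (1 + Math.exp(-x));
}

console.log(sigmoid(0));   // 0.5
console.log(sigmoid(4));   // ~0.982
console.log(sigmoid(-4));  // ~0.018
```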
The reLU activation function is easy to compute since it does not require heavy calculations. It outputs values in the range [0, infinity].
Definition: reLU(x) = max(0, x)
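A minimal sketch in TypeScript (illustrative, not taken from the library) shows why it is so cheap: it is a single comparison with no exponentials.

```typescript
// Minimal sketch of reLU: returns x for positive inputs, 0 otherwise.
function relu(x: number): number {
  return Math.max(0, x);
}

console.log(relu(2.5));  // 2.5
console.log(relu(-3));   // 0
```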
Similar to reLU, the leakyReLU activation function is easy to compute. It also allows the output to be negative, which solves the "dying reLU neuron" problem. It outputs values in the range [-infinity, infinity].
Definition: leakyReLU(x) = x if x ≥ 0, otherwise a·x (where a is a small positive slope)
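A minimal TypeScript sketch (illustrative; the 0.01 slope is an assumption, the library may use a different constant):

```typescript
// Minimal sketch of leakyReLU: negative inputs are scaled by a small slope
// instead of being clamped to 0, so the neuron keeps a gradient.
function leakyRelu(x: number, slope: number = 0.01): number {
  return x >= 0 ? x : slope * x;
}

console.log(leakyRelu(2));   // 2
console.log(leakyRelu(-2));  // -0.02
```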
The tanH activation function shares a lot of similarities with sigmoid. Unlike sigmoid, it outputs a value in the range [-1,1].
Definition: tanH(x) = (e^x - e^(-x)) / (e^x + e^(-x))
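A minimal TypeScript sketch (illustrative, using the built-in Math.tanh rather than the library's implementation):

```typescript
// Minimal sketch of tanH: outputs are centered around 0 and bounded by -1 and 1.
function tanH(x: number): number {
  return Math.tanh(x);
}

console.log(tanH(0));   // 0
console.log(tanH(2));   // ~0.964
console.log(tanH(-2));  // ~-0.964
```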
The SiLU activation function is the sigmoid's output multiplied by x. SiLU shares a lot of similarities with leakyReLU, except the function does not output negative values for the entirety of the negative domain.
Definition: SiLU(x) = x · sigmoid(x) = x / (1 + e^(-x))
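A minimal TypeScript sketch (illustrative, not the library's code) of x multiplied by sigmoid(x); note how the output is slightly negative for small negative inputs but approaches 0 for large negative inputs:

```typescript
// Minimal sketch of SiLU: x * sigmoid(x).
function silu(x: number): number {
  return x / (1 + Math.exp(-x));
}

console.log(silu(2));    // ~1.762
console.log(silu(-2));   // ~-0.238
console.log(silu(-10));  // ~-0.00045
```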
This is an experimental function; it is very similar to arctan(x). Unlike arctan(x), it outputs a value in the range [~0, ~1].
Definition: