ReLU (Rectified Linear Unit)

A neuron that uses the rectifier function as its activation function. The rectifier function is defined as:

$$
f(x) = \max(0, x)
$$
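A minimal sketch, assuming NumPy, of the rectifier applied element-wise to an array:

```python
import numpy as np

def relu(x):
    # Rectifier: element-wise max(0, x)
    return np.maximum(0, x)

# Negative inputs map to 0; positive inputs pass through unchanged.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
```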

Softplus

The softplus function is a smooth approximation to the rectifier function:

$$
f(x) = \ln(1 + e^x)
$$

Its derivative is the logistic function:

$$
f^\prime(x) = \frac{e^x}{e^x + 1} = \frac{1}{1 + e^{-x}}
$$
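A quick sketch of both functions, again assuming NumPy. The softplus is rewritten as $$\max(x, 0) + \ln(1 + e^{-|x|})$$, which is algebraically identical to $$\ln(1 + e^x)$$ but avoids overflow for large positive inputs:

```python
import numpy as np

def softplus(x):
    # ln(1 + e^x), rewritten as max(x, 0) + ln(1 + e^{-|x|}) to avoid overflow
    return np.maximum(x, 0) + np.log1p(np.exp(-np.abs(x)))

def logistic(x):
    # Derivative of softplus: 1 / (1 + e^{-x})
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-3.0, 0.0, 3.0])
print(softplus(x))  # smooth, always positive, approaches max(0, x) for large |x|
print(logistic(x))  # values in (0, 1)
```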

Softmax

Also called the normalized exponential function, it is a generalization of the logistic function to K dimensions. It converts a K-dimensional input vector $$z$$ of real values into a K-dimensional vector $$\sigma(z)$$ of real values in the interval (0, 1) that sum to 1.

$$
\sigma(z)_j = \frac{e^{z_j}}{\sum_{k=1}^{K} e^{z_k}} \qquad \text{for } j = 1, \dots, K
$$
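A minimal softmax sketch, assuming NumPy. Subtracting $$\max(z)$$ before exponentiating leaves the result unchanged, since the common factor cancels between numerator and denominator, but prevents overflow:

```python
import numpy as np

def softmax(z):
    # Shift by max(z) for numerical stability; the shift cancels out.
    e = np.exp(z - np.max(z))
    return e / e.sum()

z = np.array([1.0, 2.0, 3.0])
s = softmax(z)
print(s)        # each entry lies in (0, 1)
print(s.sum())  # entries sum to 1
```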
