
SoftMax

SoftMax is a different kind of activation function, used mostly for multiclass classification. Instead of acting on each value independently, it takes all of the outputs together and converts them into probabilities: every output value is non-negative, and the outputs always add up to 1.

$$f(x_i) = \frac{e^{x_i}}{\sum_{j=1}^{J} e^{x_j}}$$

Because each softmax output depends on the entire input vector rather than on a single value of x, it is quite difficult to plot on a 2D surface the way the other activation functions can be.

PYTHON
import numpy as np

class SoftMaxActivation():
    def forward(self, x):
        exps = np.exp(x)            # exponentiate each element of the input vector
        return exps / np.sum(exps)  # normalize so the outputs sum to 1
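As a quick sanity check, here is a minimal usage sketch (assuming the SoftMaxActivation class above and an example score vector chosen purely for illustration). It shows the two properties mentioned earlier: the outputs are non-negative and sum to 1.

PYTHON
# Hypothetical example: feed a small vector of raw scores through the activation
scores = np.array([2.0, 1.0, 0.1])
probs = SoftMaxActivation().forward(scores)
print(probs)          # roughly [0.659 0.242 0.099]
print(np.sum(probs))  # 1.0 (up to floating-point rounding)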