
ReLU

The Rectified Linear Unit (ReLU) activation function has recently become popular because it works very well with image datasets. It maps all negative values to 0 and acts as a linear activation function for all positive values.

f(x) = 0 if x ≤ 0 ; f(x) = f_linear(x) otherwise

Plot: the ReLU activation function.

PYTHON
import numpy as np

class ReLUActivation:
    def __init__(self, a):
        # slope applied to the linear (positive) part
        self.a = a

    def forward(self, X):
        # zero out non-positive inputs, scale the rest by a
        out = X.copy()
        out[X <= 0] = 0
        out *= self.a
        return out
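
As a quick sanity check, here is a minimal usage sketch, assuming the class above and an illustrative slope of a = 1.0 with a made-up sample array: negative entries are zeroed out and positive entries pass through unchanged.

PYTHON
relu = ReLUActivation(a=1.0)
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
# non-positive values become 0; positives keep their value since a = 1.0
print(relu.forward(x))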