ReLU
The Rectified Linear Unit (ReLU) activation function has become popular because it works very well with image datasets. It maps all negative values to 0 and acts as a linear activation function for all positive values, i.e. f(x) = max(0, x).
PYTHON
class ReLUActivation:
    def __init__(self, a):
        # a scales the positive outputs (a = 1 gives the standard ReLU)
        self.a = a

    def forward(self, X):
        # zero out negative inputs, scale the rest by a
        out = X.copy()
        out[X <= 0] = 0
        out *= self.a
        return out
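A minimal usage sketch, assuming the inputs are NumPy arrays; the array values and the choice a = 1 below are illustrative:

PYTHON
import numpy as np

# standard ReLU: scale factor a = 1
relu = ReLUActivation(a=1)

X = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu.forward(X))  # [0.  0.  0.  1.5 3. ]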