Leaky ReLU
Leaky ReLU is similar to ReLU, but instead of outputting 0 when X is negative, it lets a small fraction of the input through. In the implementation below, positive inputs are scaled by a (usually 1) and negative inputs are scaled by a small slope b (for example 0.01).
PYTHON
class LeakyReLUActivation():
    def __init__(self, a, b):
        # a scales positive inputs (usually 1), b is the small slope applied to negative inputs
        self.a = a
        self.b = b

    def forward(self, X):
        # Copy so the original input array is left unchanged
        out = X.copy()
        out[X <= 0] *= self.b  # "leak": scale negative values by b instead of zeroing them
        out[X > 0] *= self.a   # pass positive values through, scaled by a
        return out
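A minimal usage sketch, assuming X is a NumPy array and the common Leaky ReLU settings a = 1 and b = 0.01 (the values here are illustrative, not from the original text):

PYTHON
import numpy as np

# a = 1 passes positive inputs through unchanged; b = 0.01 is the usual small leak
leaky = LeakyReLUActivation(a=1.0, b=0.01)

X = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(leaky.forward(X))  # [-0.02  -0.005  0.     1.5    3.   ]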