What are the Advantages and Disadvantages of the ReLU Activation Function?

by keshav


[Figure: ReLU activation function and its derivative]

The rectified linear activation function (ReLU) is a piecewise linear function: if the input x is positive, the output is x; otherwise, the output is zero.

 

The mathematical representation of the ReLU function is

f(x) = max(0, x)

The derivative of ReLU is

f'(x) = 1 if x > 0, and 0 if x < 0 (at x = 0 the derivative is undefined; in practice it is usually taken as 0)

ReLU is widely used nowadays, but it has a problem: for any input less than or equal to zero, both the output and the gradient are zero, so the weights feeding such a neuron stop being updated and backpropagation cannot improve it. This is commonly known as the dying ReLU problem. To get rid of it, an improved version of ReLU called Leaky ReLU is often used.
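
Below is a minimal sketch of Leaky ReLU and its derivative, written in the same NumPy style as the code later in this post. The names leaky_ReLU and der_leaky_ReLU and the slope alpha=0.01 are illustrative choices (0.01 is a common default), not something fixed by ReLU itself.

import numpy as np

# Leaky ReLU: like ReLU, but negative inputs are scaled by a small
# slope alpha instead of being set to zero, so the gradient never dies.
def leaky_ReLU(x, alpha=0.01):
  return np.where(x > 0, x, alpha * x)

# Derivative of Leaky ReLU: 1 for positive inputs, alpha otherwise.
def der_leaky_ReLU(x, alpha=0.01):
  return np.where(x > 0, 1.0, alpha)

print(leaky_ReLU(np.array([-2.0, 0.0, 3.0])))      # [-0.02  0.    3.  ]
print(der_leaky_ReLU(np.array([-2.0, 0.0, 3.0])))  # [0.01  0.01  1.  ]

Because the derivative is alpha (not zero) for negative inputs, a small gradient always flows back and the neuron can keep learning.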

 

Python Code

 

import numpy as np
import matplotlib.pyplot as plt

# Rectified Linear Unit (ReLU)
def ReLU(x):
  data = [max(0,value) for value in x]
  return np.array(data, dtype=float)

# Derivative for ReLU
def der_ReLU(x):
  data = [1 if value>0 else 0 for value in x]
  return np.array(data, dtype=float)

# Generating data for Graph
x_data = np.linspace(-10,10,100)
y_data = ReLU(x_data)
dy_data = der_ReLU(x_data)

# Graph
plt.plot(x_data, y_data, x_data, dy_data)
plt.title('ReLU Activation Function & Derivative')
plt.legend(['ReLU','der_ReLU'])
plt.grid()
plt.show()

 

[Output: plot of the ReLU activation function and its derivative]

Advantages:

  • Non-linear, so a network of ReLU units can learn complex functions
  • Computationally efficient: just a comparison and a threshold at zero

Disadvantages:

  • Dying ReLU problem, i.e. for inputs that are zero or negative the gradient of ReLU is zero, so the affected neurons stop learning and backpropagation cannot update their weights (see the short sketch below).
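
To see concretely why learning stops, here is a minimal, self-contained sketch; the pre-activation and upstream-gradient values are made up purely for illustration.

import numpy as np

# Local ReLU gradient: 1 for positive pre-activations, 0 otherwise.
def der_ReLU(x):
  return np.where(x > 0, 1.0, 0.0)

# Hypothetical pre-activations of four neurons and the gradient
# flowing back to them from the next layer.
pre_activations = np.array([-3.0, -0.5, 0.0, 1.2])
upstream_gradient = np.array([0.4, -0.7, 0.9, 0.2])

# Chain rule: gradient reaching the weights = upstream * local gradient.
# Only the neuron with a positive pre-activation gets a non-zero update.
print(der_ReLU(pre_activations) * upstream_gradient)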

