The softmax function is the generalized form of the sigmoid function. The softmax activation function is a mathematical function that converts a vector of numbers into a vector of probabilities that sum to 1. It is commonly used as the output activation in multi-class classification problems in machine learning. This article explains the softmax activation function with an example.
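As a minimal sketch of the idea above (using NumPy, which the article's code examples presumably rely on), softmax exponentiates each score and normalizes by the sum, yielding a probability distribution:

```python
import numpy as np

def softmax(x):
    # Subtract the max score before exponentiating for numerical stability;
    # this does not change the result because softmax is shift-invariant.
    e = np.exp(x - np.max(x))
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1])
probs = softmax(scores)  # probabilities in (0, 1) that sum to 1
```

The largest score receives the largest probability, and the outputs always sum to 1, which is what makes softmax suitable for multi-class outputs.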
The sigmoid activation function is one of the most widely used activation functions in deep learning. It has an S-shaped curve that maps any real input to a value between 0 and 1. This article covers the sigmoid activation function with Python code.
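A minimal NumPy sketch of the sigmoid function described above, assuming the standard definition 1 / (1 + e^(-x)):

```python
import numpy as np

def sigmoid(x):
    # Maps any real input to the range (0, 1); sigmoid(0) = 0.5
    return 1.0 / (1.0 + np.exp(-x))
```

Large positive inputs approach 1 and large negative inputs approach 0, producing the characteristic S-shaped curve.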
The tanh function is similar to the sigmoid function: it also has an S-shaped curve, but its output ranges from -1 to 1 rather than 0 to 1. This article covers the tanh activation function with its derivative and Python code.
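A minimal sketch of tanh and its derivative in NumPy; the derivative follows from the identity d/dx tanh(x) = 1 - tanh(x)^2:

```python
import numpy as np

def tanh(x):
    # S-shaped curve with output in (-1, 1), zero-centered
    return np.tanh(x)

def tanh_derivative(x):
    # Derivative is 1 - tanh(x)^2; largest at x = 0, vanishes for large |x|
    return 1.0 - np.tanh(x) ** 2
```

Because tanh is zero-centered, it is often preferred over sigmoid for hidden layers.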
The ReLU activation function is one of the most widely used activation functions. This article covers the basics of the ReLU activation function, including Python code for the function and its derivative.
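A minimal NumPy sketch of ReLU and its derivative, assuming the standard definition max(0, x); the derivative at exactly 0 is taken as 0 here by convention:

```python
import numpy as np

def relu(x):
    # Passes positive inputs through unchanged, clamps negatives to 0
    return np.maximum(0, x)

def relu_derivative(x):
    # 1 for positive inputs, 0 otherwise (0 chosen at x = 0 by convention)
    return (np.asarray(x) > 0).astype(float)
```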
Leaky ReLU is an improved version of the ReLU function and one of the most common and effective ways to address the dying ReLU problem. It adds a slight slope in the negative range, so neurons with negative inputs still receive a small gradient instead of going permanently inactive.
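A minimal NumPy sketch of the idea above; the negative-range slope `alpha` is a hyperparameter, with 0.01 used here as a commonly cited default:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # Like ReLU for positive inputs, but negative inputs are scaled by
    # a small slope alpha instead of being zeroed out
    return np.where(x > 0, x, alpha * x)
```

Because the negative branch has a nonzero slope, the gradient there is `alpha` rather than 0, which is what prevents units from "dying".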