Deep Learning

Softmax Activation Function with Python

[Image: softmax formula]

The softmax function is the generalized form of the sigmoid function. The softmax activation function converts a vector of numbers into a vector of probabilities, and it is commonly used as the output activation in multi-class classification problems in machine learning. This article covers the softmax activation function with examples and Python code.
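As a rough sketch of the idea (the function name and example logits below are illustrative, not taken from the linked article), a numerically stable softmax can be written in a few lines of NumPy:

import numpy as np

def softmax(x):
    # Subtract the max before exponentiating for numerical stability;
    # this shift does not change the resulting probabilities.
    e = np.exp(x - np.max(x))
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1])   # illustrative logits
print(softmax(scores))               # probabilities that sum to 1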

Read More

Sigmoid(Logistic) Activation Function ( with python code)

[Image: sigmoid function]

The sigmoid activation function is one of the most widely used activation functions in deep learning. It has an S-shaped curve. This article covers the sigmoid activation function with Python code.
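A minimal sketch of the function itself, assuming NumPy (illustrative, not the article's code):

import numpy as np

def sigmoid(x):
    # 1 / (1 + e^(-x)): squashes any real number into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(np.array([-2.0, 0.0, 2.0])))   # outputs 0.5 at x = 0, the middle of the S-curve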

Read More

Hyperbolic Tangent (tanh) Activation Function [with python code]

[Image: tanh activation function]

The tanh function is similar to the sigmoid function and also has an S-shaped curve. This article covers the tanh activation function, its derivative, and Python code.
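A minimal sketch of tanh and its derivative, assuming NumPy (names and sample inputs are illustrative, not the article's code):

import numpy as np

def tanh(x):
    return np.tanh(x)

def tanh_derivative(x):
    # d/dx tanh(x) = 1 - tanh(x)^2
    return 1.0 - np.tanh(x) ** 2

x = np.array([-1.0, 0.0, 1.0])       # illustrative inputs
print(tanh(x), tanh_derivative(x))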

Read More

ReLU Activation Function [with python code]

[Image: ReLU activation function]

The ReLU activation function is one of the most widely used activation functions. This article covers the basics of ReLU, along with Python code for the function and its derivative.
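A minimal sketch of ReLU and its derivative, assuming NumPy (illustrative, not the article's code):

import numpy as np

def relu(x):
    # Element-wise max(0, x): negative inputs become 0, positive inputs pass through
    return np.maximum(0.0, x)

def relu_derivative(x):
    # 1 for x > 0, 0 otherwise (the value at exactly 0 is a convention)
    return (x > 0).astype(float)

x = np.array([-2.0, 0.0, 3.0])       # illustrative inputs
print(relu(x), relu_derivative(x))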

Read More

Leaky ReLU Activation Function [with python code]

[Image: leaky ReLU activation function]

Leaky ReLU is an improved version of the ReLU function and a common, effective way to address the dying ReLU problem: it adds a small slope in the negative range so that neurons with negative inputs still receive a gradient instead of becoming permanently inactive.
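A minimal sketch, assuming NumPy and a slope of 0.01 for the negative range (a common default; the article may use a different value):

import numpy as np

def leaky_relu(x, alpha=0.01):
    # Positive inputs pass through; negative inputs are scaled by a small slope alpha
    return np.where(x > 0, x, alpha * x)

x = np.array([-3.0, 0.0, 2.0])       # illustrative inputs
print(leaky_relu(x))                 # [-0.03  0.    2.  ]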

Read More
