Hyperparameters vs. Parameters

by keshav



What are Hyperparameters?

A hyperparameter is a configuration value of a learning algorithm, usually (but not always) numerical, that affects the way the algorithm works. Hyperparameters are not learned from the data by the algorithm itself; they have to be set by the data analyst before running the algorithm.

For example: the value of K in the K-NN algorithm, the number of neurons in a hidden layer of a neural network, or the filter (kernel) size in a convolutional neural network.
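To make the distinction concrete, here is a minimal K-NN sketch in plain Python (a hypothetical illustration, not a production implementation): K is passed in by us and never changes during learning, which is exactly what makes it a hyperparameter.

```python
from collections import Counter

def knn_predict(train_x, train_y, query, k=3):
    """Predict the label of `query` by majority vote among the k nearest
    training points (1-D Euclidean distance for simplicity).
    `k` is the hyperparameter: we choose it before running the algorithm,
    and the algorithm never modifies it."""
    dists = sorted(zip((abs(x - query) for x in train_x), train_y))
    top_k_labels = [label for _, label in dists[:k]]
    return Counter(top_k_labels).most_common(1)[0][0]

train_x = [0.0, 1.0, 2.0, 3.0]
train_y = [0, 0, 1, 1]
print(knn_predict(train_x, train_y, 1.4, k=3))  # → 0
```

Changing K changes the prediction behaviour, but nothing in the training data ever updates K itself; tuning it (e.g. by cross-validation) is a separate step we perform.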


What are Parameters?

Parameters are the variables that define the model, and they are learned by the learning algorithm: the algorithm modifies them directly based on the training data. The goal of learning is to find parameter values that make the model optimal in a certain sense.

For example, the weights w and biases b in linear regression and neural networks.
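By contrast, here is a minimal sketch of how w and b are *learned* rather than set by us, using the closed-form ordinary least squares solution for 1-D linear regression (an illustrative toy, with made-up data):

```python
def fit_linear(xs, ys):
    """Learn parameters w (weight) and b (bias) for the model y = w*x + b
    by ordinary least squares. Their values come from the training data,
    not from the analyst -- that is what makes them parameters."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    w = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - w * mean_x
    return w, b

xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]  # generated by y = 2x + 1
w, b = fit_linear(xs, ys)
print(w, b)  # → 2.0 1.0, recovered from the data
```

Note the contrast with K in K-NN: we never write down w and b ourselves; the algorithm extracts them from the training examples.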

