Epoch vs Batch Size vs Iterations

by keshav



Epoch, batch size, and iteration are all terms that relate to the dataset used while training a machine learning model. Let's look at what each of them means. Throughout, consider a training set containing 1,000,000 examples.

Epochs

We say one epoch has been completed when the entire dataset has been passed forward and backward through the neural network exactly once.

During the training of a neural network we use more than one epoch. This is because the optimization algorithms we most commonly use (e.g., gradient descent) are iterative, so passing the dataset through the network multiple times gives them more update steps and better optimization (i.e., the model keeps learning).
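To make this concrete, here is a minimal, self-contained sketch of full-batch gradient descent (plain Python with NumPy; the toy data and the one-weight linear model are made up for illustration). Each epoch is one full pass over the data, and the weight moves closer to its optimal value with every additional epoch:

import numpy as np

# Toy data: the true relationship is y = 2x, so the optimal weight is 2.0.
x = np.random.rand(100)
y = 2.0 * x

w = 0.0    # initial weight
lr = 0.5   # learning rate

for epoch in range(20):                     # each epoch = one full pass over the dataset
    y_pred = w * x                          # forward pass
    grad = np.mean(2.0 * (y_pred - y) * x)  # gradient of mean squared error w.r.t. w
    w -= lr * grad                          # gradient-descent update
    print(f"epoch {epoch}: w = {w:.4f}")    # w moves closer to 2.0 with every epoch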

In practice the dataset is often very large (as in our example), and passing it to the optimization algorithm all at once is computationally expensive. Instead, we divide the entire dataset into smaller batches.

Batch and Batch size

As the example above shows, our training set contains 1,000,000 examples, which is quite large. During the training of a machine learning model, we usually don't pass that much data to the optimization algorithm at once, as it can be computationally expensive. Instead, we divide the training data into smaller chunks, called batches, which are then passed to the optimization algorithm one at a time.

In other words, batches are subsets of the training set, and the number of examples in a single batch is called the batch size.

Let’s understand this with an example.

Total training examples = 1,000,000

Let batch size = 10,000

Total number of batches = 1,000,000 / 10,000 = 100

Thus we will have 100 batches, each with a batch size of 10,000.
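As a quick sanity check, here is a short sketch (using a NumPy array as a stand-in for the training set) that performs exactly this split:

import numpy as np

data = np.arange(1_000_000)  # stand-in for the 1,000,000 training examples
batch_size = 10_000

# Slice the training set into consecutive, non-overlapping batches.
batches = [data[i : i + batch_size] for i in range(0, len(data), batch_size)]

print(len(batches))     # 100 batches
print(len(batches[0]))  # each with 10,000 examples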


Iterations

An iteration is one optimization step over a single batch; therefore, the number of iterations needed to complete one epoch equals the number of batches.

For the example above, the number of iterations = the number of batches = 100. This is to say that we will need 100 iterations with a batch size of 10,000 to complete 1 epoch.
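The same arithmetic extends across epochs: assuming the batch size divides the dataset evenly, the total number of optimization steps is the iterations per epoch multiplied by the number of epochs. A tiny sketch (the epoch count here is an arbitrary choice for illustration):

num_examples = 1_000_000
batch_size = 10_000
num_epochs = 5                                        # arbitrary, for illustration

iterations_per_epoch = num_examples // batch_size     # 100
total_iterations = iterations_per_epoch * num_epochs  # 500 optimization steps in total

print(iterations_per_epoch, total_iterations)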


Pseudocode

For (i = 0 to number_batches - 1)                    // one iteration per batch
    For (j = i*batch_size to (i+1)*batch_size - 1)   // examples in batch i
        // Accumulate the gradient for example j
    // Perform one optimization step using batch i
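Putting all three terms together, the pseudocode above might look like the following runnable Python sketch (mini-batch gradient descent on the same toy problem as earlier; the variable names are illustrative):

import numpy as np

x = np.random.rand(1000)
y = 2.0 * x                          # true weight is 2.0
w, lr = 0.0, 0.5
batch_size = 100
num_batches = len(x) // batch_size   # 10 iterations per epoch

for epoch in range(5):                                 # one epoch = one full pass
    for i in range(num_batches):                       # one iteration per batch
        xb = x[i * batch_size : (i + 1) * batch_size]
        yb = y[i * batch_size : (i + 1) * batch_size]
        grad = np.mean(2.0 * (w * xb - yb) * xb)       # MSE gradient on this batch
        w -= lr * grad                                 # one optimization step
print(w)  # close to 2.0 after 5 epochs (50 iterations)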

