What Are Precision and Recall? Python Code Included

by keshav


[Figure: Precision-recall tradeoff]

Once a predictive model has been built, the most important question is: how good is it? Does it predict well? To answer this we have two important metrics called precision and recall. In this article we will learn about these two metrics in brief.

Evaluating the model is one of the most important tasks in a data science project; it tells us how good the predictions are. For classification problems we very often look at metrics called precision and recall, and to define them properly we first need to introduce the confusion matrix.

A confusion matrix is a table used to measure the performance of a machine learning classification model (typically for supervised learning; in unsupervised learning the analogous table is usually called a matching matrix) where the output can be two or more classes. Each row of the confusion matrix represents the instances of a predicted class while each column represents the instances of an actual class, or vice versa.


A confusion matrix is also known as an error matrix.

In this article, we will go through the various entries of the confusion matrix and the information we can extract from it. The structure of the confusion matrix is shown in the figure below.

[Figure: Structure of the confusion matrix]

Now let’s understand what TP, FP, FN, and TN are.

Here we have two classes, Yes and No, and we define the following (a short code sketch for extracting these counts follows the list):

  1. TP – True positive: you predicted the Yes class and the actual class is also Yes.
  2. TN – True negative: you predicted the No class and the actual class is also No.
  3. FP – False positive: you predicted the Yes class but the instance actually belongs to the No class. This is also called a type I error.
  4. FN – False negative: you predicted the No class but the instance actually belongs to the Yes class. This is also called a type II error.
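
As a minimal sketch (the label vectors below are made up purely for illustration), these four counts can be read directly from scikit-learn's confusion_matrix. Note that scikit-learn puts the actual classes in the rows and the predicted classes in the columns, so for binary labels ravel() returns the counts in the order TN, FP, FN, TP:

from sklearn.metrics import confusion_matrix

# Hypothetical ground-truth and predicted labels (1 = Yes, 0 = No)
y_true = [1, 1, 1, 0, 0, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 1, 1, 0, 1, 0, 0]

# For binary labels, ravel() flattens the 2x2 matrix into TN, FP, FN, TP
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print('TP:', tp, 'TN:', tn, 'FP:', fp, 'FN:', fn)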

So, what are the classification performance metrics that we can calculate from the confusion matrix? Let’s see.

 

From the confusion matrix we can calculate the accuracy, recall, precision, and F1-score (or F-measure) of the classification model. Let’s understand them by working through an example confusion matrix.

[Figure: Example confusion matrix]

 Information we obtain from the above confusion matrix:

  1. There are 165 data points (i.e. observations) in total, and they are classified into two classes, Yes and No.
  2. Our classification model predicted Yes 110 times and No 55 times, but according to the actual labels there are 105 Yes’s and 60 No’s.

The confusion matrix including these totals is given below.

[Figure: Example confusion matrix with row and column totals]

Once you understand the confusion matrix, calculating precision and recall is easy.

 

Precision – the ratio of correctly predicted positive observations to the total predicted positive observations; in other words, what percentage of the positive predictions were correct?

Precision = TP / (TP + FP)

 

Recall – also called sensitivity, the ratio of correctly predicted positive observations to all observations that actually belong to the Yes class; in other words, what percentage of the actual positive cases did you catch?

Recall = TP / (TP + FN)
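
For example, with hypothetical cell counts of TP = 100, FP = 10, FN = 5, and TN = 50 (one assignment that is consistent with the totals above; the values in the figure may differ):

Precision = 100 / (100 + 10) ≈ 0.91
Recall = 100 / (100 + 5) ≈ 0.95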

 

There are also two more useful metrics that come from the confusion matrix: Accuracy – the ratio of correctly predicted observations to the total observations, and the F1 score – the harmonic mean (a weighted average) of precision and recall. Although it is not as intuitive as accuracy, the F1 score is usually more useful, especially if you have an uneven class distribution.
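
In terms of the confusion matrix counts, these two metrics are:

Accuracy = (TP + TN) / (TP + TN + FP + FN)
F1 score = 2 × (Precision × Recall) / (Precision + Recall)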

Example Python Code to get Precision and Recall:

from sklearn.linear_model import LogisticRegression
from sklearn import datasets
from sklearn.model_selection import train_test_split  # replaces the removed sklearn.cross_validation module
from sklearn.metrics import precision_recall_fscore_support as score

# Load the iris dataset (three classes)
data = datasets.load_iris()
X = data['data']
y = data['target']

# Hold out 30% of the data for testing
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Fit a logistic regression classifier (max_iter raised to avoid convergence warnings)
model = LogisticRegression(max_iter=200)
model.fit(X_train, y_train)
preds = model.predict(X_test)

# Precision, recall, F-score and support are returned per class
precision, recall, fscore, support = score(y_test, preds)

print('precision:', precision)
print('recall:', recall)
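
Because the iris dataset has three classes, precision and recall are printed above as one value per class. If you want a single summary number instead, precision_recall_fscore_support accepts an average argument such as 'macro' or 'weighted'; a minimal sketch:

precision, recall, fscore, _ = score(y_test, preds, average='macro')
print('macro-averaged precision:', precision)
print('macro-averaged recall:', recall)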

