Confusion Matrix

What is the Confusion Matrix?

A confusion matrix is a performance measurement technique for machine learning classification. It is a table that systematically allocates a model's predictions against the classes the data actually belongs to, and from it you can derive metrics such as recall, precision, accuracy, and the AUC-ROC curve. If you train a classification model on a dataset, the resulting confusion matrix shows how accurately the model categorized each record and where errors occurred. By convention, the matrix rows represent the actual labels in the dataset and the columns represent the predicted outcomes.
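
As a concrete illustration, here is a minimal sketch that builds a confusion matrix with scikit-learn; the label vectors are invented for this example.

```python
from sklearn.metrics import confusion_matrix

y_true = [1, 0, 1, 1, 0, 0, 1, 0]  # actual labels
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]  # model predictions

# Rows are actual classes, columns are predicted classes.
print(confusion_matrix(y_true, y_pred, labels=[0, 1]))
# [[3 1]
#  [1 3]]
```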

 

Examples of Confusion Matrix

A confusion matrix aids in measuring performance when an algorithm's output can be classified as positive or negative (yes or no). The matrix has four cells, each of which represents a unique combination of predicted and actual values. The four possible outcomes are as follows (see the sketch after this list):

  • True Positive (TP): The model predicted positive and the actual value was positive. The true positive rate is also known as sensitivity.

  • True Negative (TN): The model predicted negative and the actual value was negative. The true negative rate is also known as specificity.

  • False Positive (FP): The model predicted positive, but the actual value was negative. This is frequently referred to as a Type-I error.

  • False Negative (FN): The model predicted negative, but the actual value was positive. This is sometimes referred to as a Type-II error.
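
To make these four outcomes concrete, the sketch below (continuing the invented labels from the earlier example) unpacks them from a scikit-learn confusion matrix.

```python
from sklearn.metrics import confusion_matrix

y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]

# For binary labels [0, 1], scikit-learn lays the matrix out as
# [[TN, FP], [FN, TP]], so ravel() yields the cells in that order.
tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
print(f"TP={tp} TN={tn} FP={fp} FN={fn}")  # TP=3 TN=3 FP=1 FN=1
```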

     

Why is the Confusion Matrix Important?

Confusion matrices reveal when a model consistently confuses two classes, making it simple to determine how reliable a model's results are likely to be. They summarize the effectiveness of a classification model, enabling business users to identify which data their model might be unable to categorize accurately. This knowledge is important when applying insights or predictions from the model to real-world business decisions.

For instance, there is a significantly different outcome when a model predicts that a credit investment will result in default when it actually does not (false positive) than when the lender unintentionally advances a loan that actually results in a default (false negative). If users can see from the confusion matrix that their model is likely to produce false negatives on the loan dataset, they should switch to an alternative model or manually tune the current one.

 

How is the Confusion Matrix Used?

A confusion matrix is employed to evaluate the performance of classification models. Let's examine the four primary metrics derived from it; all four are computed in the sketch that follows these definitions.

Accuracy: It is the most commonly used parameter for evaluating a machine learning model and is calculated as (TP+TN)/(TP+FP+FN+TN). Accuracy can be misleading when classes are imbalanced: if, for instance, 70% of the examples belong to one class, a model that always predicts that class will score 70% accuracy without having learned anything useful.

Precision: It is defined as the ratio of true positives to the total number of positives predicted by the machine learning model, expressed as TP/(TP+FP). This metric indicates the likelihood that a positive prediction is actually correct.

Recall: Also known as sensitivity, recall is the ratio of true positives to the number of actual positive outcomes. The recall formula is TP/(TP+FN). This metric measures the model's ability to identify the actual positives in the data.

F1 Score: The F1 score is the harmonic mean of precision and recall, used as a single overall indicator that combines both. Because the harmonic mean accounts for false positives and false negatives together, the F1 score is well suited to imbalanced datasets. It can be calculated using the formula 2(p*r)/(p+r), where p stands for precision and r for recall.
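
Putting the four formulas together, this minimal sketch computes each metric directly from the cell counts of the invented example above.

```python
tp, tn, fp, fn = 3, 3, 1, 1  # counts from the example confusion matrix

accuracy = (tp + tn) / (tp + tn + fp + fn)  # (TP+TN)/(TP+FP+FN+TN)
precision = tp / (tp + fp)                  # TP/(TP+FP)
recall = tp / (tp + fn)                     # TP/(TP+FN)
f1 = 2 * (precision * recall) / (precision + recall)  # 2(p*r)/(p+r)

print(accuracy, precision, recall, f1)  # 0.75 0.75 0.75 0.75
```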

 

Confusion Matrix vs Other Technologies & Methodologies

Confusion matrix vs correlation matrix

A confusion matrix is a specific table layout that allows visualization of the performance of an algorithm, typically a supervised learning algorithm. A correlation matrix, by contrast, is a table showing the correlation coefficients between variables.

Confusion matrix vs cost matrix

A confusion matrix measures accuracy, which is the ratio of correct predictions to the total number of predictions. A cost matrix is used to specify the relative importance, or cost, of different kinds of mispredictions (see the sketch below).
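
As an illustration (the cost values below are hypothetical), a cost matrix can be applied element-wise to a confusion matrix to weigh each kind of error by its business impact.

```python
import numpy as np

conf = np.array([[3, 1],   # [[TN, FP],
                 [1, 3]])  #  [FN, TP]] from the earlier example
cost = np.array([[0, 1],   # a false positive costs 1 unit,
                 [5, 0]])  # a false negative costs 5 units (hypothetical)

# Element-wise product summed over all cells gives the total error cost.
print((conf * cost).sum())  # 1*1 + 1*5 = 6
```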

Confusion matrix vs AUC

AUC shows how successful a model is at separating the positive and negative classes across all possible thresholds. A confusion matrix is not a single evaluation metric; rather, it provides insight into the predictions made at one chosen threshold.
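
A minimal sketch of the contrast: roc_auc_score consumes raw predicted scores and returns one threshold-free number, while a confusion matrix requires choosing a threshold first. The scores below are invented.

```python
from sklearn.metrics import confusion_matrix, roc_auc_score

y_true = [0, 0, 1, 1, 0, 1]
y_score = [0.1, 0.4, 0.35, 0.8, 0.2, 0.7]  # predicted probabilities

print(roc_auc_score(y_true, y_score))  # one number (~0.89), no threshold

y_pred = [1 if s >= 0.5 else 0 for s in y_score]  # threshold at 0.5
print(confusion_matrix(y_true, y_pred, labels=[0, 1]))  # full breakdown
```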

 

H2O Driverless AI includes classification metric plots, and the Confusion Matrix is one of them. In the Confusion Matrix graph, the threshold value defaults to 0.5. For binary classification experiments, users can specify a different threshold value; the threshold selector becomes available after clicking on the Confusion Matrix and opening the enlarged view. When you specify a value or change the slider value, Driverless AI automatically computes a diagnostic Confusion Matrix for that threshold.
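
Outside of Driverless AI, the same threshold behavior is easy to reproduce; the sketch below is a generic scikit-learn version, not the Driverless AI API, and the probabilities are invented.

```python
from sklearn.metrics import confusion_matrix

y_true = [0, 0, 1, 1, 0, 1]
y_score = [0.1, 0.4, 0.35, 0.8, 0.2, 0.7]  # predicted probabilities

# Recompute the confusion matrix as the decision threshold moves,
# mirroring the threshold slider described above.
for threshold in (0.3, 0.5, 0.7):
    y_pred = [1 if s >= threshold else 0 for s in y_score]
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    print(f"threshold={threshold}: TP={tp} TN={tn} FP={fp} FN={fn}")
```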