The relationship between confusion matrix and precision, recall, accuracy and F-Measure
The confusion matrix is a powerful predictive analysis tool in machine learning that is used to summarize the number of correct and incorrect predictions by a classifier in a binary classification task.
Simply put, the confusion matrix is a performance measurement for a machine learning classifier.
By visualizing the confusion matrix, we can read the diagonal entries, which count the correctly classified samples, and so judge how accurate the model is.
Structurally, the matrix is square: it has one row and one column per output class, so its size grows with the number of classes (2×2 for binary classification).
Columns represent predicted classes and rows represent actual classes, so the matrix summarizes the prediction results of the classification model.
Examining the confusion matrix helps evaluate both the accuracy of a classification model and the types of errors it makes; a short code sketch of building one follows the list below.
1. It provides information about the errors made by the classifier and the types of those errors.
2. It reveals where the classification model gets "confused" when making predictions.
3. It helps overcome the limitation of relying on classification accuracy alone.
4. It supplies the counts needed to calculate recall, precision, accuracy, and the AUC-ROC curve.
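As a minimal sketch of how such a matrix is built in practice, the snippet below uses scikit-learn's confusion_matrix on made-up toy labels (the y_true and y_pred values are purely illustrative, not from any real model):

```python
from sklearn.metrics import confusion_matrix

# Made-up toy data: 1 = positive class, 0 = negative class
y_true = [1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0]  # actual labels
y_pred = [1, 1, 1, 1, 1, 1, 0, 1, 1, 0, 0, 0]  # classifier's predictions

# As described above, rows are actual classes and columns are predicted
# classes; scikit-learn orders the 2x2 result as [[TN, FP], [FN, TP]]
cm = confusion_matrix(y_true, y_pred)
tn, fp, fn, tp = cm.ravel()
print(cm)              # [[3 2]
                       #  [1 6]]
print(tn, fp, fn, tp)  # 3 2 1 6
```

The four counts from this toy example (TP = 6, FP = 2, FN = 1, TN = 3) are reused in the worked calculations below.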
1. Precision: Precision measures how many of the values the model predicted as positive are actually positive. In other words, out of all the positive predictions the model made, it gives the fraction that were correct.
It indicates how reliable the model's positive predictions are, and the formula for calculating precision is TP / (TP + FP).
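As a worked example using the toy counts above (TP = 6, FP = 2):

```python
tp, fp = 6, 2               # counts from the toy confusion matrix above
precision = tp / (tp + fp)  # 6 / 8
print(precision)            # 0.75
```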
2. Recall: Recall measures how many of the actual positive values the model correctly predicted. The formula for calculating recall is TP / (TP + FN).
Increasing precision reduces recall and vice versa; this is called the precision/recall trade-off.
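Continuing the same toy example (TP = 6, FN = 1), a quick sketch of the recall calculation:

```python
tp, fn = 6, 1            # counts from the toy confusion matrix above
recall = tp / (tp + fn)  # 6 / 7
print(recall)            # 0.8571...
```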
3. Accuracy: Accuracy is one of the key parameters for evaluating a classification problem. It measures how often the model's predictions are correct: the ratio of the number of correct predictions made by the classifier to the total number of predictions it made. The formula is:
Accuracy = (TP + TN) / (TP + TN + FP + FN)
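With all four toy counts from the sketch above (TP = 6, TN = 3, FP = 2, FN = 1):

```python
tp, tn, fp, fn = 6, 3, 2, 1                 # toy counts from above
accuracy = (tp + tn) / (tp + tn + fp + fn)  # 9 / 12
print(accuracy)                             # 0.75
```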
4. F-measure: When one model has low precision and high recall and another has high precision and low recall, it is difficult to compare them directly. To solve this problem, we can use the F-score, which is the harmonic mean of precision and recall.
By calculating the F-score, we can evaluate recall and precision together. The F-score is highest when recall equals precision, and it is calculated with the following formula: F-measure = (2 * Precision * Recall) / (Precision + Recall)
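Plugging the toy precision (0.75) and recall (6/7 ≈ 0.857) into the formula, and cross-checking with scikit-learn's f1_score on the same made-up labels:

```python
from sklearn.metrics import f1_score

# Harmonic mean of the toy precision and recall computed above
precision, recall = 0.75, 6 / 7
f_measure = 2 * precision * recall / (precision + recall)
print(f_measure)  # 0.8

# Same result computed directly from the toy labels used earlier
y_true = [1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 1, 1, 1, 0, 1, 1, 0, 0, 0]
print(f1_score(y_true, y_pred))  # 0.8
```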