Confusion Matrix Calculator

Calculate classification metrics and visualize the confusion matrix

What is a Confusion Matrix?

A confusion matrix is a table that compares a classifier's predicted labels against the actual labels, breaking the results into four counts (a minimal counting sketch follows the list):

  • True Positive (TP): Predicted positive, actually positive
  • True Negative (TN): Predicted negative, actually negative
  • False Positive (FP): Predicted positive, actually negative (Type I error)
  • False Negative (FN): Predicted negative, actually positive (Type II error)
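
The code below is a minimal sketch of how those four cells are tallied in a single pass over paired labels. It assumes binary 0/1 labels with 1 as the positive class; the function name count_confusion is illustrative, not part of the calculator itself.

```python
def count_confusion(actual, predicted):
    """Count TP, TN, FP, FN from parallel lists of 0/1 labels."""
    tp = tn = fp = fn = 0
    for a, p in zip(actual, predicted):
        if p == 1 and a == 1:
            tp += 1  # predicted positive, actually positive
        elif p == 0 and a == 0:
            tn += 1  # predicted negative, actually negative
        elif p == 1 and a == 0:
            fp += 1  # predicted positive, actually negative (Type I error)
        else:
            fn += 1  # predicted negative, actually positive (Type II error)
    return tp, tn, fp, fn

print(count_confusion([1, 0, 1, 1, 0], [1, 0, 0, 1, 1]))  # (2, 1, 1, 1)
```
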
Metrics Explained

All of the following are derived from the four counts above (a worked sketch follows the list):

  • Accuracy: (TP + TN) / Total - Overall correctness
  • Precision: TP / (TP + FP) - Positive prediction accuracy
  • Recall (Sensitivity): TP / (TP + FN) - True positive rate
  • F1 Score: 2 × (Precision × Recall) / (Precision + Recall) - Harmonic mean
  • Specificity: TN / (TN + FP) - True negative rate
  • MCC: Matthews Correlation Coefficient - (TP × TN − FP × FN) / √((TP + FP)(TP + FN)(TN + FP)(TN + FN)) - Ranges from −1 to +1 and stays informative even when classes are imbalanced
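
As a sketch of how these formulas combine, the function below computes all six metrics from the four counts. It assumes the counts produced by the earlier count_confusion sketch, and it reports undefined ratios (e.g. precision when there are no predicted positives) as 0.0, a convention the calculator may or may not share.

```python
import math

def metrics(tp, tn, fp, fn):
    """Compute the six metrics above from the confusion-matrix counts.
    Undefined ratios are reported as 0.0 (an assumption of this sketch)."""
    total = tp + tn + fp + fn
    accuracy = (tp + tn) / total if total else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    specificity = tn / (tn + fp) if tn + fp else 0.0
    # MCC: (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN))
    mcc_denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / mcc_denom if mcc_denom else 0.0
    return dict(accuracy=accuracy, precision=precision, recall=recall,
                f1=f1, specificity=specificity, mcc=mcc)

print(metrics(2, 1, 1, 1))  # accuracy 0.6, precision/recall/F1 ~0.67, MCC ~0.17
```

Returning 0.0 for undefined ratios keeps the output numeric; another common choice is to report them as NaN so they stand out.
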
Use Cases
  • Evaluate classification model performance
  • Identify types of errors (FP vs FN)
  • Choose a decision threshold based on business needs (see the threshold sweep after this list)
  • Compare multiple classifiers
  • Report model performance to stakeholders
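
The threshold point deserves a concrete illustration: raising the threshold typically trades recall for precision. The sweep below is a self-contained sketch with made-up scores and labels; the 0.3/0.5/0.7 grid is arbitrary.

```python
# A hypothetical threshold sweep over made-up scores: raising the
# threshold trades recall (fewer positives caught) for precision
# (fewer false alarms). All values here are illustrative.

scores = [0.95, 0.80, 0.60, 0.40, 0.20, 0.10]  # model's positive-class scores
actual = [1,    1,    0,    1,    0,    0]     # ground-truth labels

for threshold in (0.3, 0.5, 0.7):
    predicted = [1 if s >= threshold else 0 for s in scores]
    tp = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 1)
    fp = sum(1 for a, p in zip(actual, predicted) if a == 0 and p == 1)
    fn = sum(1 for a, p in zip(actual, predicted) if a == 1 and p == 0)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    print(f"threshold={threshold:.1f}  "
          f"precision={precision:.2f}  recall={recall:.2f}")
```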