What Is a Confusion Matrix?
A confusion matrix is a compact tabular summary of a classification model's predictions against the ground truth. It lays the foundation for nearly every classification metric in machine learning — from accuracy and precision to F1 score and Matthews correlation coefficient (MCC). This confusion matrix calculator lets you enter your TP, FP, TN, and FN values and instantly see all derived metrics with full formula derivations.
Whether you're a data scientist validating a model, a student studying ML fundamentals, or a practitioner comparing classifier performance, this interactive tool gives you instant, accurate results with transparent math — no signup or login required.
Interactive Confusion Matrix Calculator
Enter your four confusion matrix values below. All metrics update in real time with formula derivations.
Enter Values
Derived Metrics
Formula Derivations
Visual Explainer: How the Confusion Matrix Works
The confusion matrix aligns actual classes (rows) against predicted classes (columns). Each cell counts the number of instances that fall into that combination:
- TP — Correctly predicted positive
- FP — Incorrectly predicted positive (Type I error): actually negative, but the model said positive
- FN — Missed positive (Type II error): actually positive, but the model said negative
- TN — Correctly predicted negative
These four numbers are all you need to compute every major classification metric. The confusion matrix calculator above does the heavy lifting — just enter your counts and read the results.
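To make that concrete, here is a minimal Python sketch of the kind of computation the calculator performs. The function name and structure are illustrative, not the tool's actual code:

```python
import math

def confusion_metrics(tp, fp, fn, tn):
    """Derive common classification metrics from the four confusion matrix counts."""
    total = tp + fp + fn + tn
    accuracy = (tp + tn) / total
    precision = tp / (tp + fp) if (tp + fp) else 0.0   # PPV
    recall = tp / (tp + fn) if (tp + fn) else 0.0      # sensitivity / TPR
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    # Matthews correlation coefficient: stays informative under class imbalance
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    mcc = (tp * tn - fp * fn) / denom if denom else 0.0
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1, "mcc": mcc}

# Illustrative counts: 90 TP, 10 FP, 5 FN, 95 TN
print(confusion_metrics(90, 10, 5, 95))
```

Note the zero-count guards: whenever a denominator would be zero (for example, a model that never predicts positive), the corresponding metric is reported as 0 rather than raising a division error.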
Understanding Each Metric
This confusion matrix calculator computes seven essential metrics. Here's what each one means and why it matters:
Precision (PPV)
Of all instances predicted positive, what fraction were actually positive? High precision minimizes false positives — critical in spam detection or medical screening where false alarms are costly.
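In formula terms, precision is TP / (TP + FP). A quick hand check with illustrative counts:

```python
tp, fp = 90, 10               # illustrative counts, not real model output
precision = tp / (tp + fp)    # 90 of 100 positive predictions were correct
print(precision)              # 0.9
```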