---
title: Confusion Matrix
emoji: 📉
colorFrom: yellow
colorTo: green
sdk: gradio
sdk_version: 3.17.0
app_file: app.py
pinned: false
tags:
  - evaluate
  - metric
description: >-
  Compute confusion matrix to evaluate the accuracy of a classification. By
  definition a confusion matrix C is such that C_{i, j} is equal to the number
  of observations known to be in group i and predicted to be in group j. Thus
  in binary classification, the count of true negatives is C_{0,0}, false
  negatives is C_{1,0}, true positives is C_{1,1} and false positives is
  C_{0,1}.
---

# Metric Card for Confusion Matrix

## Metric Description

The confusion matrix is used to evaluate the accuracy of a classification. By definition, a confusion matrix $C$ is such that $C_{i,j}$ equals the number of observations known to be in group $i$ and predicted to be in group $j$.

Thus, in binary classification, the count of true negatives is $C_{0,0}$, false negatives is $C_{1,0}$, true positives is $C_{1,1}$, and false positives is $C_{0,1}$.
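
Written out, the binary layout described above is:

$$
C = \begin{pmatrix} C_{0,0} & C_{0,1} \\ C_{1,0} & C_{1,1} \end{pmatrix}
  = \begin{pmatrix} \mathrm{TN} & \mathrm{FP} \\ \mathrm{FN} & \mathrm{TP} \end{pmatrix}
$$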

## How to Use

At minimum, this metric requires predictions and references as inputs.

```python
>>> import evaluate
>>> cfm_metric = evaluate.load("BucketHeadP65/confusion_matrix")
>>> results = cfm_metric.compute(references=[1, 2, 3, 2, 1, 1, 0, 2], predictions=[1, 0, 3, 2, 2, 1, 0, 3])
>>> print(results)
{'confusion_matrix': [[1, 0, 0, 0], [0, 2, 1, 0], [1, 0, 1, 1], [0, 0, 0, 1]]}
```

## Inputs

- predictions (`list` of `int`): Predicted labels.
- references (`list` of `int`): Ground truth labels.
- normalize (`str` or `None`, defaults to `None`): One of `'true'`, `'pred'`, or `'all'`. Normalizes the confusion matrix over the true (rows) or predicted (columns) conditions, or over the whole population. If `None`, the confusion matrix is not normalized (see the sketch after this list).
- sample_weight (`list` of `float`, defaults to `None`): Sample weights.
- labels (`list` of `int`, defaults to `None`): List of labels to index the matrix. This may be used to reorder or select a subset of labels. If `None`, the labels that appear at least once in the references or predictions are used in sorted order.
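
As a sketch of the optional arguments (the repository id and argument names follow the usage above; the exact formatting of the returned matrix, e.g. plain lists versus NumPy arrays, may differ):

```python
>>> import evaluate
>>> cfm_metric = evaluate.load("BucketHeadP65/confusion_matrix")
>>> # Normalize over the true classes (rows) and fix the label order explicitly.
>>> results = cfm_metric.compute(
...     references=[0, 1, 1, 2],
...     predictions=[0, 1, 2, 2],
...     labels=[0, 1, 2],
...     normalize="true",
... )
>>> # Expected rows, each summing to 1.0:
>>> # [[1.0, 0.0, 0.0],
>>> #  [0.0, 0.5, 0.5],
>>> #  [0.0, 0.0, 1.0]]
```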

## Output Values

- confusion_matrix (`list` of `list` of `int` or `float`): The confusion matrix. The minimum possible value of an entry is 0. If `normalize` is set, the maximum possible value of an entry is 1.0; otherwise it is the number of input examples. Higher counts on the diagonal (correctly classified examples) mean higher accuracy.

Output Example(s):

```python
{'confusion_matrix': [[1, 0, 0, 0], [0, 2, 1, 0], [1, 0, 1, 1], [0, 0, 0, 1]]}
```

This metric outputs a dictionary containing the confusion matrix under the key `confusion_matrix`.
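
Since correct predictions sit on the diagonal, the overall accuracy can be recovered from the returned matrix. A minimal sketch, assuming `results` holds the output of the example above and using NumPy only for convenience:

```python
>>> import numpy as np
>>> cm = np.array(results["confusion_matrix"])
>>> # Correct predictions are on the diagonal: 5 of the 8 examples -> 0.625
>>> accuracy = np.trace(cm) / cm.sum()
```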

## Examples

The following examples call the underlying scikit-learn `confusion_matrix` function directly:

```python
>>> from sklearn.metrics import confusion_matrix
>>> y_true = [2, 0, 2, 2, 0, 1]
>>> y_pred = [0, 0, 2, 2, 0, 2]
>>> confusion_matrix(y_true, y_pred)
array([[2, 0, 0],
       [0, 0, 1],
       [1, 0, 2]])
```

y_true = ["cat", "ant", "cat", "cat", "ant", "bird"] y_pred = ["ant", "ant", "cat", "cat", "ant", "cat"] confusion_matrix(y_true, y_pred, labels=["ant", "bird", "cat"]) array([[2, 0, 0], [0, 0, 1], [1, 0, 2]])

In the binary case, the counts of true negatives, false positives, false negatives, and true positives can be extracted as follows:

```python
>>> tn, fp, fn, tp = confusion_matrix([0, 1, 0, 1], [1, 1, 1, 0]).ravel()
>>> (tn, fp, fn, tp)
(0, 2, 1, 1)
```
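
The same unpacking works with this metric's output. A minimal sketch, assuming the returned value is a nested list as in the examples above:

```python
>>> import evaluate
>>> import numpy as np
>>> cfm_metric = evaluate.load("BucketHeadP65/confusion_matrix")
>>> results = cfm_metric.compute(references=[0, 1, 0, 1], predictions=[1, 1, 1, 0])
>>> # Flatten the 2x2 matrix into (tn, fp, fn, tp) -> (0, 2, 1, 1)
>>> tn, fp, fn, tp = np.array(results["confusion_matrix"]).ravel()
```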

## Citation(s)

```bibtex
@article{scikit-learn,
  title={Scikit-learn: Machine Learning in {P}ython},
  author={Pedregosa, F. and Varoquaux, G. and Gramfort, A. and Michel, V.
         and Thirion, B. and Grisel, O. and Blondel, M. and Prettenhofer, P.
         and Weiss, R. and Dubourg, V. and Vanderplas, J. and Passos, A. and
         Cournapeau, D. and Brucher, M. and Perrot, M. and Duchesnay, E.},
  journal={Journal of Machine Learning Research},
  volume={12},
  pages={2825--2830},
  year={2011}
}
```

## Further References

- [Wikipedia entry for the Confusion matrix](https://en.wikipedia.org/wiki/Confusion_matrix) (note that Wikipedia and other references may use a different convention for axes).