
## 4.4 Precision and Recall

## Slides

## Notes

Precision tells us the fraction of positive predictions that are correct. It considers only the predicted positives (TP and FP, the second column of the confusion matrix), as the following formula states:

$$P = \cfrac{TP}{TP + FP}$$

Recall measures the fraction of actual positive instances that are correctly identified. It considers all actual positives (TP and FN, the second row of the confusion matrix). The formula of this metric is presented below:

$$R = \cfrac{TP}{TP + FN}$$
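
As a quick sanity check of the two formulas, here is a minimal sketch in plain Python. The confusion-matrix counts are hypothetical, chosen only so the results match the 67% / 54% reported for this problem:

```python
# A minimal sketch of both formulas in plain Python. The confusion-matrix
# counts below are hypothetical, chosen so the results match the 67% / 54%
# reported for this problem.
tp = 210  # true positives  (predicted positive, actually positive)
fp = 103  # false positives (predicted positive, actually negative)
fn = 176  # false negatives (predicted negative, actually positive)

precision = tp / (tp + fp)  # P = TP / (TP + FP)
recall = tp / (tp + fn)     # R = TP / (TP + FN)

print(f"precision = {precision:.2f}")  # 0.67
print(f"recall    = {recall:.2f}")     # 0.54
```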

In this problem, the precision and recall values were 67% and 54% respectively. These measures reveal errors of our model that accuracy missed because of the class imbalance.
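
To see how accuracy can hide these errors, here is a toy sketch with scikit-learn. The arrays `y_val` and `y_pred` are made-up stand-ins, not the course data:

```python
# A toy illustration of why accuracy alone is misleading under class
# imbalance. The arrays y_val and y_pred are made-up stand-ins.
import numpy as np
from sklearn.metrics import accuracy_score, precision_score, recall_score

y_val = np.array([0, 0, 0, 0, 0, 0, 0, 1, 1, 1])   # mostly negative class
y_pred = np.array([0, 0, 0, 0, 0, 0, 1, 1, 0, 0])  # hard predictions

print(accuracy_score(y_val, y_pred))   # 0.70 -- inflated by the majority class
print(precision_score(y_val, y_pred))  # 0.50 -- TP / (TP + FP)
print(recall_score(y_val, y_pred))     # 0.33 -- TP / (TP + FN)
```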

![classification_metrics](classification_metrics.png)

MNEMONICS:

- **Precision**: from the *predicted* positives, how many did we predict right? See how the word *precision* is similar to the word *prediction*?
- **Recall**: from the *real* positives, how many did we predict right? See how the word *recall* is similar to the word *real*?

Add notes from the video (PRs are welcome)

⚠️ The notes are written by the community.
If you see an error here, please create a PR with a fix.

## Navigation