Object Detection Metrics in a Nutshell

Ritik Vaidande
2 min read · Dec 12, 2021

Coming to the point…

Confidence Score — Probability that a bbox (anchor box, bounding box; I know, I know, you know that) contains an object.

True Positive (TP) — model DETECTED an object and the detection is CORRECT.
False Positive (FP) — model DETECTED an object but the detection is INCORRECT.
False Negative (FN) — model DIDN'T DETECT an object that is actually there.
True Negative (TN) — model DIDN'T DETECT an object and there is none to detect. (TN serves no purpose in Object Detection.)
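The four outcomes above reduce to simple counting once each prediction has been matched (or not) against a ground-truth box. A minimal sketch, where `pred_matched` is an assumed, hypothetical input telling us which predictions found a real object:

```python
def detection_counts(pred_matched, n_ground_truth):
    # pred_matched[i] is True if prediction i was matched to a
    # ground-truth box (e.g. sufficient overlap), False otherwise.
    tp = sum(pred_matched)               # correct detections
    fp = len(pred_matched) - tp          # detections with no real object
    fn = n_ground_truth - tp             # real objects the model missed
    return tp, fp, fn
```

Note there is no TN count: "correctly predicted nothing" covers the infinitely many boxes the model never proposed, which is exactly why TN is useless in object detection.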

Precision — TP/(TP+FP) — Of all the model's bbox predictions, how many are TP.
(How accurate your predictions are.)
Recall — TP/(TP+FN) — Of all target (ground-truth) bboxes, how many are found as TP.
(How well you find all the positives.) (Disease prediction models will prefer Recall over Precision.)
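The two formulas above, as a minimal sketch with zero-division guards (the guard behavior is my assumption, not something the post specifies):

```python
def precision_recall(tp, fp, fn):
    # Precision: of all predictions, the fraction that are correct.
    precision = tp / (tp + fp) if (tp + fp) > 0 else 0.0
    # Recall: of all ground-truth objects, the fraction that were found.
    recall = tp / (tp + fn) if (tp + fn) > 0 else 0.0
    return precision, recall
```

For example, 8 TP, 2 FP and 4 FN gives precision 8/10 = 0.8 and recall 8/12 ≈ 0.67.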

mAP — Mean Average Precision (the mean of the AP of all classes, e.g. dog, cat, car, etc.)
AP — The area under the Precision-Recall curve. (The Precision-Recall curve is drawn by plotting precision values on the y axis and recall values on the x axis.)
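A sketch of that area computation, using the "all-points" scheme popularized by the Pascal VOC evaluation (one reasonable choice among several; 11-point interpolation is another):

```python
import numpy as np

def average_precision(precisions, recalls):
    # Points on the PR curve, ordered by increasing recall
    # (i.e. as the confidence threshold is lowered).
    p = np.concatenate(([0.0], precisions, [0.0]))
    r = np.concatenate(([0.0], recalls, [1.0]))
    # Replace each precision with the max precision to its right,
    # turning the zig-zag curve into a monotone envelope.
    for i in range(len(p) - 2, -1, -1):
        p[i] = max(p[i], p[i + 1])
    # Sum rectangle areas wherever recall actually changes.
    idx = np.where(r[1:] != r[:-1])[0]
    return float(np.sum((r[idx + 1] - r[idx]) * p[idx + 1]))
```

mAP is then just the mean of this value across classes (and, in COCO-style evaluation, across several IoU thresholds as well).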

IoU — Intersection over Union: the area of overlap between the predicted bbox and the actual bbox, divided by the area of their union. (A prediction is considered a True Positive if IoU ≥ threshold, and a False Positive if IoU < threshold.)
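A minimal sketch of IoU for axis-aligned boxes, assuming the common `(x1, y1, x2, y2)` corner format:

```python
def iou(box_a, box_b):
    # Boxes are (x1, y1, x2, y2) with x1 < x2 and y1 < y2.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp at 0 so non-overlapping boxes get zero intersection.
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

Identical boxes give IoU = 1, disjoint boxes give 0; a typical TP threshold is 0.5.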
