Object Detection Metrics in a Nutshell
Coming to the point…
Confidence Score — Probability that a bbox (anchor box, bounding box, I know, I know you know that) contains an object.
True Positive (TP) — model DETECTED an object and yes, it is CORRECT.
False Positive (FP) — model DETECTED an object but it is INCORRECT (nothing is there, or the bbox is badly placed).
False Negative (FN) — model DIDN'T DETECT an object that is actually there. A miss, and yes, INCORRECT.
True Negative (TN) — model DIDN'T DETECT an object and yes, it is CORRECT. (TN is of no use in Object Detection: every patch of background the model correctly ignores would count as one, so nobody counts them.)
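Here is a minimal sketch of how these counts come together for one image. The `matched` flags and `num_ground_truths` are made-up inputs for illustration; in practice a prediction counts as matched when its IoU with a ground-truth bbox clears the threshold (IoU is defined at the end of this post).

```python
# Hypothetical flags for one image: one True/False per predicted bbox,
# telling whether it matched a real object.
matched = [True, True, False, True, False]
num_ground_truths = 4  # objects actually present in the image

tp = sum(matched)              # detections that hit a real object
fp = len(matched) - tp         # detections that hit nothing
fn = num_ground_truths - tp    # real objects the model never found

print(tp, fp, fn)  # 3 2 1
```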

Precision — TP/(TP+FP) — Of all the model's bbox predictions, how many are TP.
(How accurate your predictions are.)
Recall — TP/(TP+FN) — Of all target (ground-truth) bboxes, how many are TP.
(How well you find all the positives. Disease prediction models will prefer Recall over Precision, because missing a real case costs more than a false alarm.)
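Continuing the counts from above, a tiny helper (the name `precision_recall` is mine) makes the two formulas concrete:

```python
def precision_recall(tp, fp, fn):
    # Precision = TP / (TP + FP); Recall = TP / (TP + FN).
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall

print(precision_recall(3, 2, 1))  # (0.6, 0.75)
```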


mAP — Mean Average Precision (the mean of the APs of all classes, e.g. dog, cat, car).
AP — the area under the Precision-Recall curve of one class. (The Precision-Recall curve is drawn by plotting precision on the y axis against recall on the x axis, as the confidence threshold is swept from high to low.)
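Exact AP conventions vary (Pascal VOC used 11-point interpolation, COCO uses 101 points), so take this as a rough all-point sketch rather than any library's official implementation; `average_precision` is a name I made up:

```python
import numpy as np

def average_precision(precisions, recalls):
    # All-point AP: force precision to be monotonically non-increasing
    # (the "precision envelope"), then sum rectangles where recall steps up.
    # Assumes recalls are sorted ascending, as they are when sweeping the
    # confidence threshold from high to low.
    p = np.concatenate(([0.0], precisions, [0.0]))
    r = np.concatenate(([0.0], recalls, [1.0]))
    for i in range(len(p) - 2, -1, -1):
        p[i] = max(p[i], p[i + 1])          # envelope, right to left
    steps = np.where(r[1:] != r[:-1])[0]    # indices where recall increases
    return float(np.sum((r[steps + 1] - r[steps]) * p[steps + 1]))

print(average_precision([1.0, 1.0, 0.67], [0.33, 0.67, 0.67]))  # ≈ 0.67
# mAP is then just the mean of the per-class APs, e.g.:
# m_ap = sum(ap_per_class.values()) / len(ap_per_class)
```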

IoU — Intersection over Union: the area of overlap between the predicted bbox and the actual (ground-truth) bbox, divided by the area of their union. (A prediction is considered a True Positive if IoU ≥ threshold, and a False Positive if IoU < threshold.)
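And a small sketch of IoU itself, assuming boxes in (x1, y1, x2, y2) corner format (again, the function name is mine):

```python
def iou(box_a, box_b):
    # Boxes are (x1, y1, x2, y2). IoU = overlap area / union area.
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 ≈ 0.143
```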