Intersection Over Union (IOU) measures the overlap between a predicted bounding box and a ground truth label of the same class.

By comparing the IOU to the threshold*, we can tell whether a detection is correct (True Positive) or wrong (False Positive); a ground truth object with no matching detection counts as a False Negative.
| IOU = 98% | IOU = 40% | IOU = 0% |
|---|---|---|
| ![]() | ![]() | ![]() |
| ≥ threshold* | < threshold* | |
| True Positive | False Positive | False Negative |
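
The IOU comparison above can be sketched in a few lines of Python. This is a minimal example, not a specific library's implementation; the `(x1, y1, x2, y2)` corner format for boxes is an assumption:

```python
def iou(box_a, box_b):
    """Intersection over Union for two axis-aligned boxes.

    Boxes are (x1, y1, x2, y2) corner tuples (an assumed format).
    """
    # Corners of the intersection rectangle
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap at all
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    # Union = sum of areas minus the double-counted intersection
    return inter / (area_a + area_b - inter)

# A detection is a True Positive when iou(...) >= threshold,
# a False Positive otherwise.
threshold = 0.5  # an example value; the right threshold depends on the task
print(iou((0, 0, 10, 10), (5, 0, 15, 10)) >= threshold)  # overlap of 1/3 → False
```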

Average Precision is a useful metric for model accuracy. It represents the area under the precision-recall curve and measures how well your model performs across all score thresholds. The closer this score is to 1.0, the better your model is performing on the test set; a model scoring detections at random would get an average precision roughly equal to the fraction of positive examples, which is around 0.5 only when positives and negatives are balanced.
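
A common way to compute this area is to rank detections by score and average the precision at each true positive, which is equivalent to step-wise integration of the precision-recall curve. A minimal sketch, assuming binary labels (1 for a correct detection, 0 for an incorrect one):

```python
def average_precision(scores, labels):
    """Area under the precision-recall curve with step interpolation.

    scores: confidence score per detection.
    labels: 1 if the detection is a true positive, 0 otherwise.
    """
    total_pos = sum(labels)
    # Rank detections from highest to lowest confidence
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    tp = fp = 0
    ap = 0.0
    for i in order:
        if labels[i]:
            tp += 1
            # Each true positive raises recall by 1/total_pos;
            # weight the current precision by that recall step.
            ap += (tp / (tp + fp)) / total_pos
        else:
            fp += 1
    return ap

# Perfect ranking (all positives scored above all negatives) gives 1.0
print(average_precision([0.9, 0.8, 0.3], [1, 1, 0]))  # → 1.0
```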