
Development Notes

Notes on Object Detection Models

  • Region Proposal Methods
    • Selective Search
    • Edge Boxes
    • Region Proposal Networks (RPN)
    • Superpixels

  • Dataset Preparation
  • Neural Network Architecture Selection
    • Proposal-Based (Two-Stage)
      • RCNN
      • Fast RCNN
      • Faster RCNN
      • Mask RCNN
      • RFCN
    • Proposal-Free (One-Stage)
      • YOLO
      • SSD
  • Model Training
  • Inference
  • Evaluation
  • Results

YOLO vs RCNN

  • YOLO is faster than RCNN because it predicts boxes in a single forward pass rather than classifying region proposals
  • Two-stage RCNN models are generally more accurate than YOLO, particularly on small objects
  • YOLO's speed makes it the better choice for real-time applications

Notes on Object Detection Metrics

  • IoU (Intersection over Union): area of overlap between the predicted and ground-truth boxes divided by the area of their union
  • AP (average precision): detection precision averaged over recall levels; often also averaged over a range of IoU thresholds
  • AP50: AP at an IoU threshold of 0.5
  • AP75: AP at an IoU threshold of 0.75 (stricter than AP50)
  • APs, APm, APl: AP for small, medium, and large objects
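The IoU definition above can be sketched directly for axis-aligned boxes; the `(x1, y1, x2, y2)` corner format is an assumption, since datasets also use `(x, y, w, h)`:

```python
def iou(box_a, box_b):
    """Intersection over Union for two boxes in (x1, y1, x2, y2) format."""
    # Corners of the intersection rectangle.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

For example, two 2×2 boxes offset by one pixel in each direction share a 1×1 overlap, giving IoU = 1 / (4 + 4 − 1) ≈ 0.14, well below the usual 0.5 detection threshold.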

In recent years, the most frequently used evaluation for detection is "Average Precision (AP)", which was originally introduced in VOC2007. AP is defined as the average detection precision under different recalls and is usually evaluated in a category-specific manner. The mean AP (mAP) averaged over all categories is typically used as the final metric of performance. To measure object localization accuracy, the Intersection over Union (IoU) between the predicted box and the ground truth is used to verify whether it is greater than a predefined threshold, such as 0.5. If it is, the object is identified as "detected"; otherwise, it is considered "missed". The 0.5-IoU mAP has then become the de facto metric for object detection.
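The "average precision under different recalls" idea can be sketched as the area under the precision-recall curve. This is a simplified, uninterpolated version (VOC and COCO additionally interpolate the precision curve); the inputs are hypothetical: detections pre-sorted by confidence, each flagged true/false positive at the chosen IoU threshold, plus the number of ground-truth objects:

```python
def average_precision(tp_flags, num_gt):
    """Uninterpolated AP: area under the precision-recall curve.

    tp_flags: per-detection True/False-positive flags, sorted by
              descending confidence (assumed already matched at
              some IoU threshold, e.g. 0.5).
    num_gt:   number of ground-truth objects in the category.
    """
    tp = fp = 0
    ap, prev_recall = 0.0, 0.0
    for is_tp in tp_flags:
        tp += 1 if is_tp else 0
        fp += 0 if is_tp else 1
        precision = tp / (tp + fp)
        recall = tp / num_gt
        # Accumulate precision over each step of recall gained.
        ap += precision * (recall - prev_recall)
        prev_recall = recall
    return ap
```

mAP would then be this value averaged over all categories; COCO-style AP further averages over IoU thresholds from 0.5 to 0.95.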

Reference: arXiv:1905.05055v3 [cs.CV], 18 Jan 2023 ("Object Detection in 20 Years: A Survey")

Development References