What is the IOU interval for AP for bbox eval? #8959
Unanswered
pjoyce1995 asked this question in Q&A
Replies: 1 comment
Most evaluation is driven by the dataset you are evaluating against and its corresponding dataset code. For COCO you are correct: mmdetection/mmdet/datasets/coco.py, lines 616 to 620 (at 9d3e162). For the mmdet CustomDataset it is here: mmdetection/mmdet/datasets/custom.py, line 327 (at 9d3e162). In short, it is whatever you configure your test dataset's evaluate function to use.
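For reference, the COCO-style IoU thresholds that this evaluation averages over can be reproduced with a short sketch (the variable names here are illustrative, not mmdetection's own):

```python
import numpy as np

# COCO-style AP averages over 10 IoU thresholds from 0.50 to 0.95
# in steps of 0.05; this mirrors the pycocotools default. The
# round() guards against floating-point drift in the count.
iou_thrs = np.linspace(0.5, 0.95, int(np.round((0.95 - 0.5) / 0.05)) + 1)
print(iou_thrs.round(2).tolist())
# [0.5, 0.55, 0.6, 0.65, 0.7, 0.75, 0.8, 0.85, 0.9, 0.95]
```

If you want a single threshold instead (e.g. AP@0.5 only), the dataset's evaluate call is the place to pass your own thresholds, per the code locations linked above.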
I am referring to the AP listed in the per-category breakdown table.
I cannot find the appropriate code to check this myself. It seems that the mean_ap.py code in mmdet.core.evaluation isn't used when running ./tools/test.py.
The mean_ap.py code calculates AP with the '11 points' method, averaging precision at recall levels from 0 to 1, but other sources say that COCO averages AP over IoU thresholds between 0.5 and 0.95.
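To clarify the distinction raised here: the '11 points' scheme averages precision at 11 recall levels (0.0, 0.1, ..., 1.0) for a fixed IoU threshold, while COCO additionally averages the resulting AP over IoU thresholds 0.50:0.05:0.95. A minimal sketch of the 11-point computation (the function name is mine, not mmdetection's):

```python
import numpy as np

def ap_11_point(recalls, precisions):
    """11-point interpolated AP: average the maximum precision
    attained at recall >= t for t in {0.0, 0.1, ..., 1.0}.
    Note: the 11 points are recall levels, not IoU thresholds."""
    ap = 0.0
    for t in np.arange(0.0, 1.01, 0.1):
        mask = recalls >= t
        p = precisions[mask].max() if mask.any() else 0.0
        ap += p / 11.0
    return ap

# A perfect precision-recall curve yields AP = 1.0
recalls = np.linspace(0.0, 1.0, 11)
precisions = np.ones(11)
print(round(ap_11_point(recalls, precisions), 6))
```

Under COCO evaluation this per-IoU AP would then be computed at each of the 10 IoU thresholds and the results averaged, which is why the two descriptions are not in conflict.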