Replies: 3 comments 2 replies
-
@cinjon Did you manage to find a way to do this?
-
Yes, it's right here: detectron2/detectron2/data/build.py, line 473 at commit 730ccef
-
I don't quite understand your request. If you change that value to a number larger than 1, it will increase the inference batch size when using the test dataloader function.
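To illustrate the effect of that change: detectron2's test loader collates each batch as a plain list of dataset dicts (a trivial collator), so raising the batch size simply groups more images into each inference step. The sketch below mimics that grouping in pure Python; the dataset dicts and the `batched` helper are illustrative, not detectron2's actual API.

```python
def trivial_batch_collator(batch):
    # Like detectron2's test-time collator: return the batch as a plain list,
    # leaving per-image dicts untouched.
    return batch

def batched(dataset, batch_size=1):
    # Group the dataset into inference steps of `batch_size` images each
    # (hypothetical helper, for illustration only).
    for i in range(0, len(dataset), batch_size):
        yield trivial_batch_collator(dataset[i:i + batch_size])

images = [{"file_name": f"img_{i}.jpg"} for i in range(8)]
steps_bs1 = list(batched(images, batch_size=1))  # 8 steps, 1 image each
steps_bs4 = list(batched(images, batch_size=4))  # 2 steps, 4 images each
print(len(steps_bs1), len(steps_bs4))
```

With 8 images, batch size 1 yields 8 forward passes while batch size 4 yields only 2, which is exactly why editing that one value in `build.py` changes how many images the test dataloader feeds the model per step.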
-
How do I change the batch size when running inference? No matter what I change cfg.SOLVER.IMS_PER_BATCH to, it always processes just one image: