Replies: 1 comment
When I use the original `run_on_image` method, multi-GPU inference takes longer than single-GPU inference. Does anyone know why?
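One likely cause is that each `run_on_image` call processes a single image serially, so extra GPUs only add setup and transfer overhead without doing any parallel work. A standard remedy is to shard the image list across one worker process per GPU; the sketch below shows only that sharding pattern (the function names and the GPU-pinning comment are assumptions for illustration, not detectron2 API):

```python
from multiprocessing import Pool

def shard(paths, num_workers):
    """Round-robin split of `paths` into `num_workers` shards."""
    return [paths[i::num_workers] for i in range(num_workers)]

def worker(args):
    """Process one shard. `gpu_id` identifies the device for this worker."""
    gpu_id, paths = args
    # In a real detectron2 setup, each worker would build its own predictor
    # here, pinned to its device (e.g. cfg.MODEL.DEVICE = f"cuda:{gpu_id}") --
    # that part is an assumption and is not shown. This stub just tags each
    # path with the worker's GPU id.
    return [(gpu_id, path) for path in paths]

def run_multi_gpu(paths, num_gpus):
    """Fan the shards out to one process per GPU and flatten the results."""
    with Pool(num_gpus) as pool:
        chunks = pool.map(worker, enumerate(shard(paths, num_gpus)))
    return [result for chunk in chunks for result in chunk]
```

With this layout each process keeps its model resident on one GPU, so the per-call overhead is paid once per worker rather than once per image.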
Given a directory of images, I wanted to run model inference using multiple GPUs. Using the `run_on_video` function as a template, I wrote a `run_on_images` function for `VisualisationDemo` as follows. It seems to work as expected, but at the end of the iteration, just before the function terminates, I get this warning:
Does anyone have any idea how to fix this?
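For reference, here is a minimal sketch of what a `run_on_images` generator modeled on `run_on_video` might look like. The `predictor` argument is a stand-in callable (in the real demo it would be the demo object's predictor), and the helper name and extension filter are assumptions, not the poster's actual code:

```python
import os

# Extensions treated as images; adjust to taste (assumption).
IMAGE_EXTENSIONS = {".jpg", ".jpeg", ".png", ".bmp"}

def list_images(directory):
    """Return sorted paths of image files found directly inside `directory`."""
    return sorted(
        os.path.join(directory, name)
        for name in os.listdir(directory)
        if os.path.splitext(name)[1].lower() in IMAGE_EXTENSIONS
    )

def run_on_images(paths, predictor):
    """Lazily yield (path, prediction) pairs, one image at a time."""
    for path in paths:
        yield path, predictor(path)
```

As a generator, this mirrors `run_on_video`'s streaming behavior: results are produced one at a time instead of being accumulated in memory.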