Free previous model VRAM when switching to another model? #1709
Closed · jmjoy started this conversation in Ideas / Feature requests · 2 comments
- I'll check whether we are releasing the memory when switching the model, which I intended.
- @jmjoy thanks for the report; this issue should be resolved by #1715. Please let me know if you still observe an issue 🙏
- When I switch models from `sam2:large` to `sam2:latest`, I get an error message: two models are in use at once, which causes a VRAM OOM and a fallback to CPUExecutionProvider, which is very slow. So, would it be better to free the previous model's VRAM when switching to another model?