A way to reduce VRAM usage 2.7 times, from @lmxyy at mit-han-lab #6580
frankyifei started this conversation in Ideas
This should be easy to implement and can reduce VRAM usage by up to 2.7 times. All you need to do is use convert_model to wrap all the Conv2d layers in your PyTorch model with PatchConv. For example:
link
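A minimal sketch of how this might look, assuming the package exposes convert_model as described above; the import path, the `splits` argument, and the example model are assumptions for illustration, not taken from the linked post.

```python
import torch
from patch_conv import convert_model  # assumed import path

# Any PyTorch model containing Conv2d layers, e.g. a small conv stack.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 64, kernel_size=3, padding=1),
    torch.nn.ReLU(),
    torch.nn.Conv2d(64, 3, kernel_size=3, padding=1),
)

# Wrap every Conv2d in the model with PatchConv, which processes large
# feature maps in smaller patches to lower peak VRAM usage.
model = convert_model(model, splits=4)  # "splits" is an assumed parameter name

# The forward pass is unchanged from the caller's point of view.
x = torch.randn(1, 3, 1024, 1024)
with torch.no_grad():
    y = model(x)
print(y.shape)
```

The appeal of this approach is that it only touches model construction: the rest of the pipeline (inference code, schedulers, etc.) stays the same.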