Replies: 1 comment
-
For a first estimate, you might want to check the Model Analyzer: https://github.com/openvinotoolkit/model_analyzer.
-
Hi everyone. I have a model converted from PyTorch to ONNX to OpenVINO, and I am able to run it on the NCS2 using OpenVINO.
My question: is there a good way to estimate the DRAM bandwidth required by the deployed model for inference on the NCS2?
I know the model input size and the size of the model parameters, i.e. the model size in MB.
The estimated memory required at runtime should be the sum of the input image size, the model weights, and the activations produced as the input passes through each layer to the end.
Can someone please help/guide me on how to get this estimate for an OpenVINO model on the NCS2?
Thanks.
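If a back-of-the-envelope number is enough, the sum described above (input + weights + per-layer activations) can be computed directly from the ONNX export before conversion. The sketch below is only a rough illustration under stated assumptions: the file name `model.onnx` is a placeholder, the 2-bytes-per-element figure assumes FP16 (which is what the NCS2/MYRIAD plugin runs), and it ignores any graph optimizations OpenVINO applies during conversion.

```python
import onnx
from onnx import numpy_helper, shape_inference

# Assumptions (not from this thread): path to the ONNX export, and FP16 execution
# on the NCS2, i.e. 2 bytes per tensor element.
MODEL_PATH = "model.onnx"
BYTES_PER_ELEMENT = 2

# Shape inference fills graph.value_info with the shapes of intermediate tensors.
model = shape_inference.infer_shapes(onnx.load(MODEL_PATH))
graph = model.graph

# Weights: every initializer (constant tensor) stored in the graph.
weight_elements = sum(numpy_helper.to_array(init).size for init in graph.initializer)

def num_elements(value_info):
    """Element count of a tensor; unknown/dynamic dims (e.g. batch) are counted as 1."""
    count = 1
    for dim in value_info.type.tensor_type.shape.dim:
        count *= dim.dim_value if dim.dim_value > 0 else 1
    return count

# Activations: graph inputs, intermediate tensors, and outputs.
# (Initializers can also appear in graph.input for older exports, so filter them out.)
initializer_names = {init.name for init in graph.initializer}
activations = [
    vi for vi in list(graph.input) + list(graph.value_info) + list(graph.output)
    if vi.name not in initializer_names
]
activation_elements = sum(num_elements(vi) for vi in activations)

weight_mb = weight_elements * BYTES_PER_ELEMENT / 1e6
activation_mb = activation_elements * BYTES_PER_ELEMENT / 1e6
print(f"weights            : {weight_mb:8.2f} MB")
print(f"activations (sum)  : {activation_mb:8.2f} MB")
print(f"rough upper bound  : {weight_mb + activation_mb:8.2f} MB")
```

Note that summing every activation gives an upper bound on the runtime footprint, since the runtime reuses buffers; the peak is usually closer to the largest single layer's input plus output. For a bandwidth (rather than footprint) figure, you would multiply the bytes actually moved to/from DRAM per inference by the inference rate, which depends on how much the device keeps in on-chip memory, so treat this only as a starting point alongside tools like the Model Analyzer mentioned above.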