If you are using Zsh, wrap `--input_shape` in quotes or use a tuple.
Run inference with a given model for 10 iterations:

```bash
  --model_path=model.pte \
  --num_executions=10
```
## Running the Python Example with Pybindings

You can use the `export_and_infer_openvino.py` script to run models with the OpenVINO backend through the Python bindings.
### **Usage**

#### **Command Structure**

```bash
python export_and_infer_openvino.py <ARGUMENTS>
```

#### **Arguments**
**`--suite`** (required if the `--model_path` argument is not used):
Specifies the model suite to use. Needs to be used with the `--model` argument.
Supported values:
- `timm` (e.g., VGG16, ResNet50)
- `torchvision` (e.g., resnet18, mobilenet_v2)
- `huggingface` (e.g., bert-base-uncased). NB: Quantization and validation are not supported yet.
**`--model`** (required if the `--model_path` argument is not used):
Name of the model to export. Needs to be used with the `--suite` argument.
Examples:
- For `timm`: `vgg16`, `resnet50`
- For `torchvision`: `resnet18`, `mobilenet_v2`
- For `huggingface`: `bert-base-uncased`, `distilbert-base-uncased`
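For instance, exporting and running ResNet18 from `torchvision` by name might look like the following (a sketch built from the arguments above; the value syntax assumes standard space-separated flags):

```bash
# Export resnet18 from the torchvision suite and run it with a random input
# (flag values are illustrative)
python export_and_infer_openvino.py \
  --suite torchvision \
  --model resnet18 \
  --input_shape "[1, 3, 224, 224]"
```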
**`--model_path`** (required if the `--suite` and `--model` arguments are not used):
Path to the saved model file. This argument allows you to load the compiled model from a file instead of downloading it from the model suites via the `--suite` and `--model` arguments.
Example: `<path to model folder>/resnet50_fp32.pte`
**`--input_shape`** (required for random inputs):
Input shape for the model. Provide this as a **list** or **tuple**.
Examples:
- `[1, 3, 224, 224]` (Zsh users: wrap in quotes)
- `(1, 3, 224, 224)`
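Since Zsh treats unquoted square brackets specially, quoting the shape (or using the tuple form) keeps the shell from mangling it. Two illustrative invocations:

```bash
# List form, quoted so the shell passes it through unchanged
python export_and_infer_openvino.py --suite timm --model vgg16 --input_shape "[1, 3, 224, 224]"

# Tuple form (also quoted here for safety across shells)
python export_and_infer_openvino.py --suite timm --model vgg16 --input_shape "(1, 3, 224, 224)"
```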
**`--input_tensor_path`** (optional):
Path to the raw input tensor file. If this argument is not provided, a random input tensor will be generated with the shape given by the `--input_shape` argument.
Example: `<path to the input tensor folder>/input_tensor.pt`
**`--output_tensor_path`** (optional):
Path to the file where the raw output tensor will be saved.
Example: `<path to the output tensor folder>/output_tensor.pt`
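To reuse a saved input tensor and keep the result, the two paths can be combined in one run, for example (a sketch; the file names are illustrative, and `--input_shape` is omitted because the input comes from a file):

```bash
# Run mobilenet_v2 on a pre-saved input tensor and write the raw output tensor to disk
python export_and_infer_openvino.py \
  --suite torchvision \
  --model mobilenet_v2 \
  --input_tensor_path input_tensor.pt \
  --output_tensor_path output_tensor.pt
```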
**`--device`** (optional):
Target device for the compiled model. The default is `CPU`.
Examples: `CPU`, `GPU`
**`--num_iter`** (optional):
Number of inference iterations to run for evaluation. The default value is `1`.
Examples: `100`, `1000`
**`--warmup_iter`** (optional):
Number of warmup inference iterations to run before evaluation. The default value is `0`.
Examples: `5`, `10`
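Putting the performance options together, a simple benchmark of a previously exported model could look like this (a sketch; the `.pte` path is illustrative):

```bash
# 10 warmup iterations followed by 100 timed iterations on the CPU
python export_and_infer_openvino.py \
  --model_path resnet50_fp32.pte \
  --device CPU \
  --warmup_iter 10 \
  --num_iter 100
```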
### **Examples**
#### Execute the Torchvision ResNet50 Model on the GPU with Random Inputs
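Assuming the argument syntax documented above, a command for this example might look like the following sketch (a random input is generated from `--input_shape`, and `--device GPU` targets the GPU):

```bash
# Illustrative invocation: ResNet50 from torchvision on the GPU with a random input
python export_and_infer_openvino.py \
  --suite torchvision \
  --model resnet50 \
  --input_shape "[1, 3, 224, 224]" \
  --device GPU
```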