`examples/openvino/README.md`
Below is the layout of the `examples/openvino` directory, which includes the necessary files:

```
examples/openvino
├── README.md                   # Documentation for examples (this file)
└── aot_optimize_and_infer.py   # Example script to export and execute models
```

# Build Instructions for Examples
Follow the [instructions](../../backends/openvino/README.md) of **Prerequisites** to set up the environment.

## AOT step:

The export script `aot_optimize_and_infer.py` allows users to export deep learning models from various model suites (TIMM, Torchvision, Hugging Face) to the OpenVINO backend using **ExecuTorch**. Users can dynamically specify the model, input shape, and target device.
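As a rough illustration, an invocation might look like the sketch below; the flag names (`--suite`, `--model`, `--input_shape`, `--device`) are assumptions for the sake of the example, not the script's documented interface, so consult `python aot_optimize_and_infer.py --help` for the actual options:

```
# Hypothetical usage sketch — flag names are assumptions, check --help for the real interface
python aot_optimize_and_infer.py \
    --suite timm \
    --model resnet50 \
    --input_shape "(1, 3, 224, 224)" \
    --device CPU
```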