In this article, you'll learn how to use Open Neural Network Exchange (ONNX) to make predictions on computer vision models generated from automated machine learning (AutoML) in Azure Machine Learning.
To use ONNX for predictions, you need to:
[ONNX Runtime](https://onnxruntime.ai/index.html) is an open-source project that supports cross-platform inference. ONNX Runtime provides APIs across programming languages (including Python, C++, C#, C, Java, and JavaScript). You can use these APIs to perform inference on input images. After you export the model to ONNX format, you can use these APIs in any programming language that your project needs.
In this guide, you learn how to use [Python APIs for ONNX Runtime](https://onnxruntime.ai/docs/get-started/with-python.html) to make predictions on images for popular vision tasks. You can use these ONNX exported models across languages.
After the model downloading step, you use the ONNX Runtime Python package to perform inferencing by using the *model.onnx* file. For demonstration purposes, this article uses the datasets from [How to prepare image datasets](how-to-prepare-datasets-for-automl-images.md) for each vision task.
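As a minimal sketch of that flow with the ONNX Runtime Python package: the helper below converts an image array into the NCHW batch layout the exported model expects, and the commented lines show how the *model.onnx* session call fits together. The `[0, 1]` scaling is a generic placeholder; the exact preprocessing (resizing, normalization) depends on the vision task and model, as described later in this article.

```python
import numpy as np

def to_nchw_batch(image):
    """Convert an HWC uint8 image array to a float32 NCHW batch of size 1."""
    x = image.astype(np.float32) / 255.0   # placeholder scaling to [0, 1]
    x = np.transpose(x, (2, 0, 1))         # HWC -> CHW
    return np.expand_dims(x, 0)            # add the batch dimension

# With the model.onnx file downloaded from the run (path is illustrative):
#   import onnxruntime
#   session = onnxruntime.InferenceSession("model.onnx")
#   input_name = session.get_inputs()[0].name
#   outputs = session.run(None, {input_name: to_nchw_batch(image)})
```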
We trained the models for all vision tasks with their respective datasets to demonstrate ONNX model inference.
## Load the labels and ONNX model files
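As an illustration of this step, the following sketch reads the class names from the labels file downloaded alongside the model and creates an ONNX Runtime session. The *labels.json* and *model.onnx* file names are illustrative; use the artifact paths you downloaded from your run.

```python
import json

def load_labels(path):
    """Read the class names from the labels JSON file downloaded with the model."""
    with open(path) as f:
        return json.load(f)

# labels = load_labels("labels.json")          # illustrative file name
# import onnxruntime
# session = onnxruntime.InferenceSession("model.onnx")
```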
The output is a tuple of `output_names` and predictions.
| Output name | Output shape | Output type | Description |
| -------- |----------|-----|------|
|`output_names`|`(3*batch_size)`| List of keys | For a batch size of 2, `output_names` is `['boxes_0', 'labels_0', 'scores_0', 'boxes_1', 'labels_1', 'scores_1']`|
|`predictions`|`(3*batch_size)`| List of ndarray(float) | For a batch size of 2, `predictions` takes the shape of `[(n1_boxes, 4), (n1_boxes), (n1_boxes), (n2_boxes, 4), (n2_boxes), (n2_boxes)]`. Here, values at each index correspond to the same index in `output_names`. |
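For illustration, the flat `3 * batch_size` outputs can be regrouped into one dictionary per image. The sketch below uses synthetic NumPy arrays in place of real model outputs (array contents are placeholders; only the shapes match the table above).

```python
import numpy as np

def group_detections(output_names, predictions):
    """Group the flat (3 * batch_size) outputs into one dict per image."""
    results = []
    for i in range(0, len(output_names), 3):
        boxes, labels, scores = predictions[i:i + 3]
        results.append({"boxes": boxes, "labels": labels, "scores": scores})
    return results

# Synthetic outputs for a batch of 2 (3 boxes and 2 boxes, respectively):
output_names = ["boxes_0", "labels_0", "scores_0",
                "boxes_1", "labels_1", "scores_1"]
predictions = [np.zeros((3, 4)), np.zeros(3), np.zeros(3),
               np.zeros((2, 4)), np.zeros(2), np.zeros(2)]
per_image = group_detections(output_names, predictions)
```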
The following table describes boxes, labels, and scores returned for each sample in the batch of images.
| Name | Shape | Type | Description |
| -------- |----------|-----|------|
The input is a preprocessed image, with the shape `(1, 3, 640, 640)` for a batch size of 1.

| Input name | Input shape | Input type | Description |
| -------- |----------|-----|------|
| Input |`(batch_size, num_channels, height, width)`| ndarray(float) | Input is a preprocessed image, with the shape `(1, 3, 640, 640)` for a batch size of 1, and a height of 640 and width of 640.|
### Output format
ONNX model predictions contain multiple outputs. The first output is needed to perform non-max suppression (NMS) for detections. For ease of use, automated ML displays the output format after the NMS postprocessing step. The output after NMS is a list of boxes, labels, and scores for each sample in the batch.
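The NMS step mentioned here can be sketched as a greedy, IoU-based filter: keep the highest-scoring box, discard boxes that overlap it too much, and repeat. This is a minimal illustration, not the exact postprocessing automated ML applies, and the `iou_threshold` value is illustrative.

```python
import numpy as np

def nms(boxes, scores, iou_threshold=0.5):
    """Greedy non-max suppression over (N, 4) boxes in (x1, y1, x2, y2) form."""
    order = np.argsort(scores)[::-1]        # indices sorted by descending score
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))                 # highest-scoring remaining box
        if order.size == 1:
            break
        rest = order[1:]
        # Intersection rectangle between box i and each remaining box.
        x1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        y1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        x2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        y2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_r = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        iou = inter / (area_i + area_r - inter)
        order = rest[iou <= iou_threshold]  # drop boxes overlapping box i
    return keep
```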
The output is a tuple of `output_names` and predictions.
| Output name | Output shape | Output type | Description |
| -------- |----------|-----|------|
|`output_names`|`(4*batch_size)`| List of keys | For a batch size of 2, `output_names` is `['boxes_0', 'labels_0', 'scores_0', 'masks_0', 'boxes_1', 'labels_1', 'scores_1', 'masks_1']`|
|`predictions`|`(4*batch_size)`| List of ndarray(float) | For a batch size of 2, `predictions` takes the shape of `[(n1_boxes, 4), (n1_boxes), (n1_boxes), (n1_boxes, 1, height_onnx, width_onnx), (n2_boxes, 4), (n2_boxes), (n2_boxes), (n2_boxes, 1, height_onnx, width_onnx)]`. Here, values at each index correspond to the same index in `output_names`. |
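As with object detection, the flat `4 * batch_size` outputs can be regrouped per image, here including the per-instance masks. The sketch below uses synthetic NumPy arrays whose shapes match the table above; the contents are placeholders, not real model outputs.

```python
import numpy as np

def group_instances(output_names, predictions):
    """Group the flat (4 * batch_size) outputs into one dict per image."""
    results = []
    for i in range(0, len(output_names), 4):
        boxes, labels, scores, masks = predictions[i:i + 4]
        results.append({"boxes": boxes, "labels": labels,
                        "scores": scores, "masks": masks})
    return results

# Synthetic outputs for a batch of 2; mask height/width match the ONNX input.
names = ["boxes_0", "labels_0", "scores_0", "masks_0",
         "boxes_1", "labels_1", "scores_1", "masks_1"]
predictions = [np.zeros((3, 4)), np.zeros(3), np.zeros(3), np.zeros((3, 1, 640, 640)),
               np.zeros((2, 4)), np.zeros(2), np.zeros(2), np.zeros((2, 1, 640, 640))]
per_sample = group_instances(names, predictions)
```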