articles/machine-learning/concept-prebuilt-docker-images-inference.md
ms.service: machine-learning
ms.subservice: inferencing
ms.author: sehan
author: dem108
ms.date: 04/08/2024
ms.topic: concept-article
ms.reviewer: mopeakande
reviewer: msakande
ms.custom: deploy, docker, prebuilt
---
Prebuilt Docker container images for inference are used when deploying a model with Azure Machine Learning.

## Why should I use prebuilt images?

* Reduces model deployment latency
* Improves model deployment success rate
* Avoids unnecessary image build during model deployment
* Includes only the required dependencies and access rights in the image/container
## List of prebuilt Docker images for inference

> [!IMPORTANT]
> The list provided in the following table includes only the inference Docker images that Azure Machine Learning **currently supports**.

* All the Docker images run as a non-root user.
* We recommend using the `latest` tag for Docker images. Prebuilt Docker images for inference are published to the Microsoft container registry (MCR). For information on how to query the list of available tags, see the [MCR GitHub repository](https://github.com/microsoft/ContainerRegistry#browsing-mcr-content).
* If you want to use a specific tag for any inference Docker image, Azure Machine Learning supports tags that range from `latest` to *six months* older than `latest`.
**Inference minimal base images**

NA | GPU | NA | `mcr.microsoft.com/azureml/minimal-ubuntu20.04-py38-cuda11.6.2-gpu-inference:latest`
NA | CPU | NA | `mcr.microsoft.com/azureml/minimal-ubuntu22.04-py39-cpu-inference:latest`
NA | GPU | NA | `mcr.microsoft.com/azureml/minimal-ubuntu22.04-py39-cuda11.8-gpu-inference:latest`
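To show where an image path from the table plugs in, here is a hypothetical Azure Machine Learning CLI (v2) managed online deployment spec that references a prebuilt inference image; the deployment name `blue`, endpoint `my-endpoint`, model `my-model`, and the `./src`/`score.py` code configuration are placeholders, not values from this article:

```yaml
# Hypothetical managed online deployment spec (placeholder names).
$schema: https://azuremlschemas.azureedge.net/latest/managedOnlineDeployment.schema.json
name: blue
endpoint_name: my-endpoint
model: azureml:my-model:1
code_configuration:
  code: ./src
  scoring_script: score.py
environment:
  # Prebuilt inference image from the table above.
  image: mcr.microsoft.com/azureml/minimal-ubuntu22.04-py39-cpu-inference:latest
instance_type: Standard_DS3_v2
instance_count: 1
```

A spec like this would typically be applied with `az ml online-deployment create`; see the linked deployment articles for the authoritative workflow.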
## Related content

* [GitHub examples of how to use inference prebuilt Docker images](https://github.com/Azure/azureml-examples/tree/main/cli/endpoints/online/custom-container)
* [Deploy and score a machine learning model by using an online endpoint](how-to-deploy-online-endpoints.md)