Commit c190564

freshness prebuilt docker image concept

1 parent b1218e8 commit c190564

File tree

1 file changed: +14 -16 lines changed


articles/machine-learning/concept-prebuilt-docker-images-inference.md

Lines changed: 14 additions & 16 deletions
```diff
@@ -7,9 +7,10 @@ ms.service: machine-learning
 ms.subservice: inferencing
 ms.author: sehan
 author: dem108
-ms.date: 11/04/2022
-ms.topic: conceptual
+ms.date: 04/08/2024
+ms.topic: concept-article
 ms.reviewer: mopeakande
+reviewer: msakande
 ms.custom: deploy, docker, prebuilt
 ---
 
@@ -19,19 +20,19 @@ Prebuilt Docker container images for inference are used when deploying a model w
 
 ## Why should I use prebuilt images?
 
-* Reduces model deployment latency.
-* Improves model deployment success rate.
-* Avoid unnecessary image build during model deployment.
-* Only have required dependencies and access right in the image/container.
+* Reduces model deployment latency
+* Improves model deployment success rate
+* Avoids unnecessary image build during model deployment
+* Includes only the required dependencies and access right in the image/container
 
 ## List of prebuilt Docker images for inference
 
 > [!IMPORTANT]
-> The list provided below includes only **currently supported** inference docker images by Azure Machine Learning.
+> The list provided in the following table includes only the inference Docker images that Azure Machine Learning **currently supports**.
 
-* All the docker images run as non-root user.
-* We recommend using `latest` tag for docker images. Prebuilt docker images for inference are published to Microsoft container registry (MCR), to query list of tags available, follow [instructions on the GitHub repository](https://github.com/microsoft/ContainerRegistry#browsing-mcr-content).
-* If you want to use a specific tag for any inference docker image, we support from `latest` to the tag that is *6 months* old from the `latest`.
+* All the Docker images run as non-root user.
+* We recommend using the `latest` tag for Docker images. Prebuilt Docker images for inference are published to the Microsoft container registry (MCR). For information on how to query the list of tags available, see the [MCR GitHub repository](https://github.com/microsoft/ContainerRegistry#browsing-mcr-content).
+* If you want to use a specific tag for any inference Docker image, Azure Machine Learning supports tags that range from `latest` to *six months* older than `latest`.
 
 **Inference minimal base images**
 
@@ -42,12 +43,9 @@ NA | GPU | NA | `mcr.microsoft.com/azureml/minimal-ubuntu20.04-py38-cuda11.6.2-g
 NA | CPU | NA | `mcr.microsoft.com/azureml/minimal-ubuntu22.04-py39-cpu-inference:latest`
 NA | GPU | NA | `mcr.microsoft.com/azureml/minimal-ubuntu22.04-py39-cuda11.8-gpu-inference:latest`
 
-## How to use inference prebuilt docker images?
 
-[Check examples in the Azure machine learning GitHub repository](https://github.com/Azure/azureml-examples/tree/main/cli/endpoints/online/custom-container)
-
-## Next steps
+## Related content
 
+* [GitHub examples of how to use inference prebuilt Docker images](https://github.com/Azure/azureml-examples/tree/main/cli/endpoints/online/custom-container)
 * [Deploy and score a machine learning model by using an online endpoint](how-to-deploy-online-endpoints.md)
-* [Learn more about custom containers](how-to-deploy-custom-container.md)
-* [azureml-examples GitHub repository](https://github.com/Azure/azureml-examples/tree/main/cli/endpoints/online)
+* [Use a custom container to deploy a model to an online endpoint](how-to-deploy-custom-container.md)
```
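The six-month tag-support window described in the revised bullet list (tags from `latest` back to six months older than `latest` are supported) can be sketched with a small local check. The helper function and the sample tag dates below are hypothetical and purely illustrative; real tags and publish dates must be queried from MCR as the article describes.

```python
from datetime import date, timedelta

def supported_tags(tags_with_dates, latest_date):
    """Return tags published within roughly six months (183 days) of `latest`.

    `tags_with_dates` maps tag name -> publish date. This mirrors the
    documented support policy; the data passed in here is illustrative only.
    """
    cutoff = latest_date - timedelta(days=183)
    return sorted(tag for tag, published in tags_with_dates.items()
                  if published >= cutoff)

# Hypothetical tag names and dates, not real MCR data.
tags = {
    "20240401.v1": date(2024, 4, 1),
    "20240101.v1": date(2024, 1, 1),
    "20230601.v1": date(2023, 6, 1),  # more than six months old: unsupported
}
print(supported_tags(tags, latest_date=date(2024, 4, 8)))
# → ['20240101.v1', '20240401.v1']
```

In practice, pinning to a dated tag inside this window trades the automatic updates of `latest` for reproducible deployments.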
