
Commit 348a59a

Merge pull request #7262 from s-polly/stp-freshness-9-24
ML freshness pass 9-24
2 parents fbe7078 + 8edfe9e · commit 348a59a

1 file changed: +21 -18 lines changed

articles/machine-learning/concept-prebuilt-docker-images-inference.md

Lines changed: 21 additions & 18 deletions
@@ -7,45 +7,48 @@ ms.service: azure-machine-learning
 ms.subservice: inferencing
 ms.author: scottpolly
 author: s-polly
-ms.date: 04/08/2024
+ms.date: 09/24/2025
 ms.topic: concept-article
 ms.reviewer: sehan
 ms.custom: deploy, docker, prebuilt
+ai-usage: ai-assisted
 ---
 
-# Prebuilt Docker images for inference
+# Docker images for inference
 
-Prebuilt Docker container images for inference are used when deploying a model with Azure Machine Learning. The images are prebuilt with popular machine learning frameworks and Python packages. You can also extend the packages to add other packages by using one of the following methods:
+Azure Machine Learning provides prebuilt Docker images for inference (scoring). These images include popular machine learning frameworks and commonly used Python packages. Extend an image to add more packages if needed.
 
-## Why should I use prebuilt images?
+## Why use prebuilt images
 
-* Reduces model deployment latency
-* Improves model deployment success rate
-* Avoids unnecessary image build during model deployment
-* Includes only the required dependencies and access right in the image/container
+Using prebuilt images helps in several ways:
 
-## List of prebuilt Docker images for inference
+- Reduces model deployment latency
+- Increases deployment success rate
+- Avoids building container images during deployment
+- Keeps the image small by containing only the required dependencies and minimal access rights
+
+## List of prebuilt Docker images for inference
 
 > [!IMPORTANT]
-> The list provided in the following table includes only the inference Docker images that Azure Machine Learning **currently supports**.
+> The list in the following table includes only the inference Docker images that Azure Machine Learning **currently supports**.
 
-* All the Docker images run as non-root user.
-* We recommend using the `latest` tag for Docker images. Prebuilt Docker images for inference are published to the Microsoft container registry (MCR). For information on how to query the list of tags available, see the [MCR GitHub repository](https://github.com/microsoft/ContainerRegistry#browsing-mcr-content).
-* If you want to use a specific tag for any inference Docker image, Azure Machine Learning supports tags that range from `latest` to *six months* older than `latest`.
+* All images run as non-root users.
+* Use the `latest` tag. Prebuilt images are published to the Microsoft Container Registry (MCR). To see available tags, go to the [MCR GitHub repository](https://github.com/microsoft/ContainerRegistry#browsing-mcr-content).
+* If you need a specific tag, Azure Machine Learning supports tags that are up to *six months* older than `latest`.
 
 **Inference minimal base images**
 
-Framework version | CPU/GPU | Pre-installed packages | MCR Path
---- | --- | --- | --- |
+Framework version | CPU/GPU | Pre-installed packages | MCR path
+--- | --- | --- | ---
 NA | CPU | NA | `mcr.microsoft.com/azureml/minimal-ubuntu22.04-py39-cpu-inference:latest`
 NA | GPU | NA | `mcr.microsoft.com/azureml/minimal-ubuntu22.04-py39-cuda11.8-gpu-inference:latest`
 NA | CPU | NA | `mcr.microsoft.com/azureml/minimal-py312-inference:latest`
 
 > [!NOTE]
-> Azure Machine Learning supports [Curated environments](resource-curated-environments.md). You can [browse curated environments](how-to-manage-environments-in-studio.md#browse-curated-environments) and add filter for `Tags: Inferencing`.
+> Azure Machine Learning supports [curated environments](resource-curated-environments.md). To browse curated environments in Studio, go to [Manage environments in Studio](how-to-manage-environments-in-studio.md#browse-curated-environments) and apply the filter `Tags: Inferencing`.
 
 ## Related content
 
 * [GitHub examples of how to use inference prebuilt Docker images](https://github.com/Azure/azureml-examples/tree/main/cli/endpoints/online/custom-container)
-* [Deploy and score a machine learning model by using an online endpoint](how-to-deploy-online-endpoints.md)
-* [Use a custom container to deploy a model to an online endpoint](how-to-deploy-custom-container.md)
+* Learn how to [deploy and score a machine learning model by using an online endpoint](how-to-deploy-online-endpoints.md).
+* Discover how to [use a custom container to deploy a model to an online endpoint](how-to-deploy-custom-container.md).
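For context on how a prebuilt image from the table in this file is typically consumed, here is a minimal sketch that registers one as an Azure Machine Learning environment. It assumes the Python SDK v2 (`azure-ai-ml`) and `azure-identity`; the workspace values and the environment name are placeholders, not part of this change.

```python
# Minimal sketch, assuming the Azure ML Python SDK v2 (azure-ai-ml) and azure-identity.
# Registers an environment that points at a prebuilt inference image from the table,
# so no image build happens at deployment time. Workspace values are placeholders.
from azure.ai.ml import MLClient
from azure.ai.ml.entities import Environment
from azure.identity import DefaultAzureCredential

# Connect to the workspace (substitute your own subscription, resource group, and workspace).
ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace-name>",
)

# Reference the prebuilt CPU image directly from the Microsoft Container Registry.
env = Environment(
    name="prebuilt-minimal-cpu-inference",  # hypothetical name for illustration
    image="mcr.microsoft.com/azureml/minimal-ubuntu22.04-py39-cpu-inference:latest",
)
ml_client.environments.create_or_update(env)
```

The GPU and Python 3.12 images from the table can be swapped in the same way, and the registered environment can then be referenced from an online deployment.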
