Azure Machine Learning provides prebuilt Docker images for inference (scoring). These images include popular machine learning frameworks and commonly used Python packages. Extend an image to add more packages if needed.
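
One way to add packages on top of a prebuilt image is to register an environment that pairs the image with a conda specification. The following is a minimal sketch that assumes the Azure Machine Learning Python SDK v2 (`azure-ai-ml`); the workspace details and the `conda.yaml` path are placeholders, not values from this article.

```python
# Minimal sketch: extend a prebuilt inference image by registering an Azure
# Machine Learning environment that layers a conda specification on top of it.
# Assumes the Azure ML Python SDK v2 (azure-ai-ml); the workspace details and
# the conda file path are placeholders.
from azure.ai.ml import MLClient
from azure.ai.ml.entities import Environment
from azure.identity import DefaultAzureCredential

ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace-name>",
)

# Base the environment on a prebuilt inference image; Azure Machine Learning
# builds the extended image from the conda file when the environment is used.
extended_env = Environment(
    name="prebuilt-inference-extended",
    image="mcr.microsoft.com/azureml/minimal-ubuntu22.04-py39-cpu-inference:latest",
    conda_file="conda.yaml",  # hypothetical conda spec listing the extra packages
)
ml_client.environments.create_or_update(extended_env)
```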
## Why use prebuilt images
Using prebuilt images helps in several ways:
- Reduces model deployment latency
- Increases deployment success rate
- Avoids building container images during deployment
- Includes only the required dependencies and minimal access rights in the image
## List of prebuilt Docker images for inference
> [!IMPORTANT]
> The list in the following table includes only the inference Docker images that Azure Machine Learning **currently supports**.
* All images run as non-root users.
* Use the `latest` tag. Prebuilt images are published to the Microsoft Container Registry (MCR). To see available tags, go to the [MCR GitHub repository](https://github.com/microsoft/ContainerRegistry#browsing-mcr-content).
* If you need a specific tag, Azure Machine Learning supports tags that are up to *six months* older than `latest`, as shown in the sketch after this list.
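
The following sketch shows what pinning a tag looks like in an environment definition, again with the Python SDK v2; `<specific-tag>` is a placeholder, so check MCR for the tags that are actually available.

```python
# Sketch only: reference a specific (non-latest) tag of a prebuilt inference
# image. "<specific-tag>" is a placeholder; look up real tag names in MCR.
from azure.ai.ml.entities import Environment

pinned_env = Environment(
    name="minimal-inference-pinned",
    image="mcr.microsoft.com/azureml/minimal-py312-inference:<specific-tag>",
)
```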
**Inference minimal base images**

Framework version | CPU/GPU | Pre-installed packages | MCR path
--- | --- | --- | ---
NA | CPU | NA | `mcr.microsoft.com/azureml/minimal-ubuntu22.04-py39-cpu-inference:latest`
NA | GPU | NA | `mcr.microsoft.com/azureml/minimal-ubuntu22.04-py39-cuda11.8-gpu-inference:latest`
NA | CPU | NA | `mcr.microsoft.com/azureml/minimal-py312-inference:latest`
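
As an illustration of how these images are typically consumed, the following sketch references the CPU minimal image in a managed online deployment with the Python SDK v2. The endpoint name, model reference, scoring code path, and instance size are placeholders.

```python
# Sketch: use a prebuilt minimal inference image in a managed online deployment
# (Azure ML Python SDK v2). The endpoint, model, scoring code, and instance
# size below are placeholders for your own resources.
from azure.ai.ml.entities import (
    CodeConfiguration,
    Environment,
    ManagedOnlineDeployment,
)

deployment = ManagedOnlineDeployment(
    name="blue",
    endpoint_name="<endpoint-name>",
    model="azureml:<model-name>:<version>",
    environment=Environment(
        image="mcr.microsoft.com/azureml/minimal-ubuntu22.04-py39-cpu-inference:latest",
    ),
    code_configuration=CodeConfiguration(code="./scoring", scoring_script="score.py"),
    instance_type="Standard_DS3_v2",
    instance_count=1,
)

# Submit with an MLClient created as shown earlier:
# ml_client.online_deployments.begin_create_or_update(deployment)
```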
> [!NOTE]
> Azure Machine Learning supports [curated environments](resource-curated-environments.md). To browse curated environments in Studio, go to [Manage environments in Studio](how-to-manage-environments-in-studio.md#browse-curated-environments) and apply the filter `Tags: Inferencing`.
## Related content
* Explore [GitHub examples of how to use prebuilt Docker images for inference](https://github.com/Azure/azureml-examples/tree/main/cli/endpoints/online/custom-container).
* Learn how to [deploy and score a machine learning model by using an online endpoint](how-to-deploy-online-endpoints.md).
* Discover how to [use a custom container to deploy a model to an online endpoint](how-to-deploy-custom-container.md).