
Commit 7ac3722

Merge pull request #3410 from Blackmist/402351-fresh
freshness
2 parents 17ba81b + 1428ab7 commit 7ac3722

File tree

1 file changed: 11 additions, 7 deletions

articles/machine-learning/v1/how-to-extend-prebuilt-docker-image-inference.md

Lines changed: 11 additions & 7 deletions
@@ -7,34 +7,35 @@ ms.service: azure-machine-learning
 ms.subservice: inferencing
 ms.author: larryfr
 author: Blackmist
-ms.date: 10/21/2021
+ms.date: 03/07/2025
 ms.topic: how-to
 ms.reviewer: sehan
 ms.custom: UpdateFrequency5, deploy, docker, prebuilt
 ---

 # Extend a prebuilt Docker image

-In some cases, the [prebuilt Docker images for model inference](../concept-prebuilt-docker-images-inference.md) and [extensibility](./how-to-prebuilt-docker-images-inference-python-extensibility.md) solutions for Azure Machine Learning may not meet your inference service needs.
+In some cases, the [prebuilt Docker images for model inference](../concept-prebuilt-docker-images-inference.md) and [extensibility](./how-to-prebuilt-docker-images-inference-python-extensibility.md) solutions for Azure Machine Learning might not meet your inference service needs.

 In this case, you can use a Dockerfile to create a new image, using one of the prebuilt images as the starting point. By extending from an existing prebuilt Docker image, you can use the Azure Machine Learning network stack and libraries without creating an image from scratch.

 **Benefits and tradeoffs**

 Using a Dockerfile allows for full customization of the image before deployment. It allows you to have maximum control over what dependencies or environment variables, among other things, are set in the container.

-The main tradeoff for this approach is that an extra image build will take place during deployment, which slows down the deployment process. If you can use the [Python package extensibility](./how-to-prebuilt-docker-images-inference-python-extensibility.md) method, deployment will be faster.
+The main tradeoff for this approach is that an extra image build takes place during deployment, which slows down the deployment process. If you can use the [Python package extensibility](./how-to-prebuilt-docker-images-inference-python-extensibility.md) method, deployment is faster.
 ## Prerequisites

 * An Azure Machine Learning workspace. For a tutorial on creating a workspace, see [Create resources to get started](../quickstart-create-resources.md).
 * Familiarity with authoring a [Dockerfile](https://docs.docker.com/engine/reference/builder/).
 * Either a local working installation of [Docker](https://www.docker.com/), including the `docker` CLI, **OR** an Azure Container Registry (ACR) associated with your Azure Machine Learning workspace.

 > [!WARNING]
-> The Azure Container Registry for your workspace is created the first time you train or deploy a model using the workspace. If you've created a new workspace, but not trained or created a model, no Azure Container Registry will exist for the workspace.
+> The Azure Container Registry for your workspace is created the first time you train or deploy a model using the workspace. If you created a new workspace, but not trained or created a model, no Azure Container Registry exists for the workspace.
+
 ## Create and build Dockerfile

-Below is a sample Dockerfile that uses an Azure Machine Learning prebuilt Docker image as a base image:
+The following sample is a Dockerfile that uses an Azure Machine Learning prebuilt Docker image as a base image:

 ```Dockerfile
 FROM mcr.microsoft.com/azureml/<image_name>:<tag>
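
For example, a minimal extension of a prebuilt image might only add a Python dependency on top of the base image. This is a sketch using the article's own placeholder convention; `<image_name>`, `<tag>`, and `<library>` are placeholders, not specific recommendations:

```Dockerfile
# Minimal sketch: start from a prebuilt Azure Machine Learning inference image
# and install an extra Python dependency on top of it.
# <image_name>, <tag>, and <library> are placeholders.
FROM mcr.microsoft.com/azureml/<image_name>:<tag>

RUN pip install <library>
```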
@@ -95,7 +96,7 @@ RUN pip install <library>
 If the model and code need to be built into the image, the following environment variables need to be set in the Dockerfile:

 * `AZUREML_ENTRY_SCRIPT`: The entry script of your code. This file contains the `init()` and `run()` methods.
-* `AZUREML_MODEL_DIR`: The directory that contains the model file(s). The entry script should use this directory as the root directory of the model.
+* `AZUREML_MODEL_DIR`: The directory that contains the model files. The entry script should use this directory as the root directory of the model.

 The following example demonstrates setting these environment variables in the Dockerfile:

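As a rough sketch, assuming the entry script is named `score.py` and the code and models are copied under `/var/azureml-app`, the Dockerfile lines might look like this; the copy paths and the `score.py` name are illustrative assumptions rather than required values:

```Dockerfile
# Sketch only: copy the scoring code and model files into the image,
# then point the prebuilt image at them. The source paths and the
# score.py name are illustrative assumptions.
COPY ./score.py /var/azureml-app/score.py
COPY ./models/ /var/azureml-app/azureml-models/

# Entry script that defines the init() and run() methods.
ENV AZUREML_ENTRY_SCRIPT=score.py
# Root directory that contains the model files.
ENV AZUREML_MODEL_DIR=/var/azureml-app/azureml-models
```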
@@ -115,8 +116,11 @@ ENV AZUREML_MODEL_DIR=/var/azureml-app/azureml-models

 The following example demonstrates installing `apt` packages, setting environment variables, and including code and models as part of the Dockerfile:

+> [!NOTE]
+> The following example uses the `mcr.microsoft.com/azureml/minimal-ubuntu20.04-py38-cpu-inference:latest` image as a base image. For information on the available images, see [Prebuilt Docker images for model inference](../concept-prebuilt-docker-images-inference.md#list-of-prebuilt-docker-images-for-inference).
+
 ```Dockerfile
-FROM mcr.microsoft.com/azureml/pytorch-1.6-ubuntu18.04-py37-cpu-inference:latest
+FROM mcr.microsoft.com/azureml/minimal-ubuntu20.04-py38-cpu-inference:latest

 USER root:root

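A fuller sketch along these lines, using the base image named in the note, might look like the following; the apt package, file paths, and `score.py` name are illustrative assumptions rather than the documented sample:

```Dockerfile
# Sketch only: extend the minimal prebuilt image with an apt package,
# the scoring code, the model files, and the related environment variables.
FROM mcr.microsoft.com/azureml/minimal-ubuntu20.04-py38-cpu-inference:latest

USER root:root

# Install an OS-level dependency (libgomp1 is only an example package).
RUN apt-get update && \
    apt-get install -y --no-install-recommends libgomp1 && \
    rm -rf /var/lib/apt/lists/*

# Copy the scoring code and model files into the image (illustrative paths).
COPY ./score.py /var/azureml-app/score.py
COPY ./models/ /var/azureml-app/azureml-models/

# Point the inference server at the entry script and the model directory.
ENV AZUREML_ENTRY_SCRIPT=score.py
ENV AZUREML_MODEL_DIR=/var/azureml-app/azureml-models
```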