In some cases, the [prebuilt Docker images for model inference](../concept-prebuilt-docker-images-inference.md) and [extensibility](./how-to-prebuilt-docker-images-inference-python-extensibility.md) solutions for Azure Machine Learning might not meet your inference service needs.
In this case, you can use a Dockerfile to create a new image, using one of the prebuilt images as the starting point. By extending from an existing prebuilt Docker image, you can use the Azure Machine Learning network stack and libraries without creating an image from scratch.
**Benefits and tradeoffs**
Using a Dockerfile allows for full customization of the image before deployment. It gives you maximum control over the dependencies, environment variables, and other settings in the container.
The main tradeoff for this approach is that an extra image build takes place during deployment, which slows down the deployment process. If you can use the [Python package extensibility](./how-to-prebuilt-docker-images-inference-python-extensibility.md) method, deployment is faster.
## Prerequisites
* An Azure Machine Learning workspace. For a tutorial on creating a workspace, see [Create resources to get started](../quickstart-create-resources.md).
* Familiarity with authoring a [Dockerfile](https://docs.docker.com/engine/reference/builder/).
* Either a local working installation of [Docker](https://www.docker.com/), including the `docker` CLI, **OR** an Azure Container Registry (ACR) associated with your Azure Machine Learning workspace.
> [!WARNING]
> The Azure Container Registry for your workspace is created the first time you train or deploy a model using the workspace. If you've created a new workspace but haven't trained or deployed a model, no Azure Container Registry exists for the workspace.
## Create and build Dockerfile
The following sample Dockerfile uses an Azure Machine Learning prebuilt Docker image as the base image:
```Dockerfile
FROM mcr.microsoft.com/azureml/<image_name>:<tag>
RUN pip install <library>
```
If the model and code need to be built into the image, set the following environment variables in the Dockerfile:
* `AZUREML_ENTRY_SCRIPT`: The entry script of your code. This file contains the `init()` and `run()` methods.
* `AZUREML_MODEL_DIR`: The directory that contains the model files. The entry script should use this directory as the root directory of the model.
The following example demonstrates setting these environment variables in the Dockerfile:
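(In this sketch, the script name `score.py`, the model folder, and the in-image paths are illustrative placeholders rather than required values.)

```Dockerfile
FROM mcr.microsoft.com/azureml/<image_name>:<tag>

# Copy the scoring code and model files into the image (illustrative paths).
COPY score.py /var/azureml-app/score.py
COPY model /var/azureml-app/azureml-models/model

# Point the inference server at the entry script and the model directory.
ENV AZUREML_ENTRY_SCRIPT=score.py
ENV AZUREML_MODEL_DIR=/var/azureml-app/azureml-models
```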
The following example demonstrates installing `apt` packages, setting environment variables, and including code and models as part of the Dockerfile:
> [!NOTE]
> The following example uses the `mcr.microsoft.com/azureml/minimal-ubuntu20.04-py38-cpu-inference:latest` image as a base image. For information on the available images, see [Prebuilt Docker images for model inference](../concept-prebuilt-docker-images-inference.md#list-of-prebuilt-docker-images-for-inference).
```Dockerfile
FROM mcr.microsoft.com/azureml/minimal-ubuntu20.04-py38-cpu-inference:latest
```
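A fuller sketch of such a Dockerfile might look like the following. The apt package (`libgomp1`), script name, and paths are illustrative placeholders, and whether you need to switch users before installing OS packages depends on the base image:

```Dockerfile
FROM mcr.microsoft.com/azureml/minimal-ubuntu20.04-py38-cpu-inference:latest

# Installing OS packages typically requires root; switch users only if the
# base image doesn't already run as root.
USER root
RUN apt-get update \
    && apt-get install -y --no-install-recommends libgomp1 \
    && rm -rf /var/lib/apt/lists/*

# Include the scoring code and model files, then point the inference server
# at them (names and paths are illustrative).
COPY score.py /var/azureml-app/score.py
COPY model /var/azureml-app/azureml-models/model
ENV AZUREML_ENTRY_SCRIPT=score.py
ENV AZUREML_MODEL_DIR=/var/azureml-app/azureml-models
```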