Prebuilt Docker container images for inference are used when you deploy a model with Azure Machine Learning. The images come with popular machine learning frameworks and Python packages preinstalled. You can also extend them with other packages by using one of the following methods:
* [Add Python packages to a prebuilt image](how-to-prebuilt-docker-images-inference-python-extensibility.md). With this method, you can install extra **Python packages**.
* [Use a prebuilt inference image as the base for a new Dockerfile](how-to-extend-prebuilt-docker-image-inference.md). With this method, you can install both **Python packages and apt packages**.
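As a sketch of the Dockerfile approach, the following extends one of the minimal prebuilt images; the image tag and the packages installed here are illustrative assumptions, not requirements:

```dockerfile
# Illustrative only: extend a prebuilt Azure Machine Learning inference image.
FROM mcr.microsoft.com/azureml/minimal-ubuntu20.04-py38-cpu-inference:latest

# The prebuilt images run as a non-root user, so switch to root
# before installing apt packages.
USER root
RUN apt-get update && \
    apt-get install -y --no-install-recommends libgomp1 && \
    rm -rf /var/lib/apt/lists/*

# Python packages can be installed with pip into the image's environment.
RUN pip install --no-cache-dir scikit-learn
```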
## Why should I use prebuilt images?
* Reduces model deployment latency.
## List of prebuilt Docker images for inference
> [!IMPORTANT]
> The following list includes only the inference Docker images that Azure Machine Learning **currently supports**.
[See examples in the Azure Machine Learning GitHub repository](https://github.com/Azure/azureml-examples/tree/main/cli/endpoints/online/custom-container).
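For orientation, a custom-container deployment typically starts from an Azure Machine Learning environment that references one of these images. The following is a minimal sketch using the CLI (v2) environment YAML; the environment name and image tag are assumptions:

```yaml
# Sketch of an Azure ML CLI (v2) environment definition; name and tag are examples.
$schema: https://azuremlschemas.azureedge.net/latest/environment.schema.json
name: minimal-inference-env
image: mcr.microsoft.com/azureml/minimal-ubuntu20.04-py38-cpu-inference:latest
```

An environment like this could be registered with `az ml environment create --file environment.yml` and then referenced from an online deployment.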
## Next steps
* [Add Python packages to prebuilt images](how-to-prebuilt-docker-images-inference-python-extensibility.md).
* [Use a prebuilt image as a base for a new Dockerfile](how-to-extend-prebuilt-docker-image-inference.md).
* [Deploy and score a machine learning model by using an online endpoint](how-to-deploy-managed-online-endpoints.md)
* [Learn more about custom containers](how-to-deploy-custom-container.md)
`articles/machine-learning/how-to-inference-server-http.md`
You have **Flask 2** installed in your Python environment but are running a server version that doesn't support Flask 2, which can produce an error like:

```
ImportError: cannot import name 'Markup' from 'jinja2'
```
Older versions (<= 0.4.10) of the server didn't pin Flask's dependencies to compatible versions. This issue is fixed in the latest version of the server.
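As a quick sketch of the version boundary described above (the helper function is hypothetical; only the 0.4.10 threshold comes from the text):

```python
def server_needs_upgrade(server_version: str) -> bool:
    """Return True for azureml-inference-server-http versions <= 0.4.10,
    which did not pin Flask's dependencies to compatible versions."""
    parts = tuple(int(p) for p in server_version.split("."))
    return parts <= (0, 4, 10)

print(server_needs_upgrade("0.4.10"))  # True: affected, upgrade recommended
print(server_needs_upgrade("0.5.0"))   # False: pins compatible Flask versions
```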
### 3. Do I need to reload the server when changing the score script?
### 4. Which operating systems are supported?

The Azure Machine Learning inference server runs on Windows and Linux based operating systems.
## Next steps
* For more information on creating an entry script and deploying models, see [How to deploy a model using Azure Machine Learning](how-to-deploy-managed-online-endpoints.md).
191
191
* Learn about [Prebuilt Docker images for inference](concept-prebuilt-docker-images-inference.md)
`includes/aml-inference-list-prebuilt-docker-images.md`
* All the Docker images run as a non-root user.
* We recommend using the `latest` tag for Docker images. Prebuilt Docker images for inference are published to the Microsoft Container Registry (MCR). To query the list of available tags, follow the [instructions on the GitHub repository](https://github.com/microsoft/ContainerRegistry#browsing-mcr-content).
* If you want to use a specific tag for any inference Docker image, we support tags from `latest` back to tags up to *six months* older than `latest`.
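The six-month tag window can be sketched as a date check. This is an illustration only: the 183-day cutoff is an assumption for "six months", and the dates are made-up examples:

```python
from datetime import date, timedelta

def tag_supported(tag_published: date, latest_published: date) -> bool:
    """A tag is supported if it is no more than ~six months (assumed
    here to be 183 days) older than the current `latest` tag."""
    return latest_published - tag_published <= timedelta(days=183)

print(tag_supported(date(2022, 3, 1), date(2022, 7, 14)))   # True: ~4.5 months old
print(tag_supported(date(2021, 12, 1), date(2022, 7, 14)))  # False: over six months old
```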
### Inference minimal base images
Framework version | CPU/GPU | Pre-installed packages | MCR Path
--- | --- | --- | --- |
NA | CPU | NA | `mcr.microsoft.com/azureml/minimal-ubuntu18.04-py37-cpu-inference:latest`
NA | GPU | NA | `mcr.microsoft.com/azureml/minimal-ubuntu18.04-py37-cuda11.0.3-gpu-inference:latest`
NA | CPU | NA | `mcr.microsoft.com/azureml/minimal-ubuntu20.04-py38-cpu-inference:latest`