
Commit 0f01398

Merge pull request #204766 from shivanissambare/inference-updates
Inference image and server updates
2 parents b2f1650 + 5c85bfb

File tree

* articles/machine-learning/concept-prebuilt-docker-images-inference.md
* articles/machine-learning/how-to-inference-server-http.md
* articles/machine-learning/resource-curated-environments.md
* includes/aml-inference-list-prebuilt-docker-images.md

4 files changed: +64 −55 lines

articles/machine-learning/concept-prebuilt-docker-images-inference.md

Lines changed: 12 additions & 7 deletions
@@ -7,19 +7,16 @@ ms.service: machine-learning
ms.subservice: core
ms.author: ssambare
author: shivanissambare
-ms.date: 10/21/2021
+ms.date: 07/14/2022
ms.topic: conceptual
ms.reviewer: larryfr
-ms.custom: deploy, docker, prebuilt, curated environments
+ms.custom: deploy, docker, prebuilt
---

# Prebuilt Docker images for inference

Prebuilt Docker container images for inference are used when deploying a model with Azure Machine Learning. The images are prebuilt with popular machine learning frameworks and Python packages. You can also extend the packages to add other packages by using one of the following methods:

-* [Add Python packages](how-to-prebuilt-docker-images-inference-python-extensibility.md).
-* [Use prebuilt inference image as base for a new Dockerfile](how-to-extend-prebuilt-docker-image-inference.md). Using this method, you can install both **Python packages and apt packages**.
-
## Why should I use prebuilt images?

* Reduces model deployment latency.
@@ -29,9 +26,17 @@ Prebuilt Docker container images for inference are used when deploying a model w

## List of prebuilt Docker images for inference

+> [!IMPORTANT]
+> The list provided below includes only **currently supported** inference docker images by Azure Machine Learning.
+
[!INCLUDE [list-of-inference-prebuilt-docker-images](../../includes/aml-inference-list-prebuilt-docker-images.md)]

+## How to use inference prebuilt docker images?
+
+[Check examples in the Azure machine learning GitHub repository](https://github.com/Azure/azureml-examples/tree/main/cli/endpoints/online/custom-container)
+
## Next steps

-* [Add Python packages to prebuilt images](how-to-prebuilt-docker-images-inference-python-extensibility.md).
-* [Use a prebuilt package as a base for a new Dockerfile](how-to-extend-prebuilt-docker-image-inference.md).
+* [Deploy and score a machine learning model by using an online endpoint](how-to-deploy-managed-online-endpoints.md)
+* [Learn more about custom containers](how-to-deploy-custom-container.md)
+* [azureml-examples GitHub repository](https://github.com/Azure/azureml-examples/tree/main/cli/endpoints/online)
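The new "How to use inference prebuilt docker images?" section points at the azureml-examples repository; as an illustrative sketch (not asserted by this commit), one of the supported images from the include file referenced above can be pulled directly from MCR using the recommended `latest` tag:

```bash
# Illustrative sketch: pull a supported prebuilt inference image from MCR.
# The image path comes from includes/aml-inference-list-prebuilt-docker-images.md;
# `latest` is the tag that file recommends.
docker pull mcr.microsoft.com/azureml/minimal-ubuntu18.04-py37-cpu-inference:latest
```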

articles/machine-learning/how-to-inference-server-http.md

Lines changed: 42 additions & 4 deletions
@@ -10,7 +10,7 @@ ms.service: machine-learning
ms.subservice: core
ms.topic: how-to
ms.custom: inference server, local development, local debugging, devplatv2
-ms.date: 05/14/2021
+ms.date: 07/14/2022
---

# Azure Machine Learning inference HTTP server (preview)
@@ -135,19 +135,57 @@ There are two ways to use Visual Studio Code (VSCode) and [Python Extension](htt
1. User starts the AzureML Inference Server in a command line and use VSCode + Python Extension to attach to the process.
1. User sets up the `launch.json` in the VSCode and start the AzureML Inference Server within VSCode.

+**launch.json**
+```json
+{
+    "name": "Debug score.py",
+    "type": "python",
+    "request": "launch",
+    "module": "azureml_inference_server_http.amlserver",
+    "args": [
+        "--entry_script",
+        "score.py"
+    ]
+}
+```
+
In both ways, user can set breakpoint and debug step by step.

## Frequently asked questions

-### Do I need to reload the server when changing the score script?
+### 1. I encountered the following error during server startup:
+
+```bash
+TypeError: register() takes 3 positional arguments but 4 were given
+
+  File "/var/azureml-server/aml_blueprint.py", line 251, in register
+
+    super(AMLBlueprint, self).register(app, options, first_registration)
+
+TypeError: register() takes 3 positional arguments but 4 were given
+```
+
+You have **Flask 2** installed in your python environment but are running a server (< 7.0.0) that does not support Flask 2. To resolve, please upgrade to the latest version of server.
+
+### 2. I encountered an ``ImportError`` or ``ModuleNotFoundError`` on modules ``opencensus``, ``jinja2``, ``MarkupSafe``, or ``click`` during startup like the following:
+
+```bash
+ImportError: cannot import name 'Markup' from 'jinja2'
+```
+
+Older versions (<= 0.4.10) of the server did not pin Flask's dependency to compatible versions. This is fixed in the latest version of the server.
+
+### 3. Do I need to reload the server when changing the score script?

After changing your scoring script (`score.py`), stop the server with `ctrl + c`. Then restart it with `azmlinfsrv --entry_script score.py`.

-### Which OS is supported?
+### 4. Which OS is supported?

The Azure Machine Learning inference server runs on Windows & Linux based operating systems.

## Next steps

-* For more information on creating an entry script and deploying models, see [How to deploy a model using Azure Machine Learning](how-to-deploy-and-where.md).
+* For more information on creating an entry script and deploying models, see [How to deploy a model using Azure Machine Learning](how-to-deploy-managed-online-endpoints.md).
* Learn about [Prebuilt docker images for inference](concept-prebuilt-docker-images-inference.md)
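The new FAQ entries tell the reader to upgrade to the latest server and then restart it. A hedged sketch of those two steps follows, assuming the server is installed as the `azureml-inference-server-http` pip package (the `azureml_inference_server_http.amlserver` module in the launch.json above suggests this, but the package name is an assumption, not stated in the diff):

```bash
# Illustrative sketch: pick up the Flask 2 / dependency-pinning fixes from FAQ 1 and 2
# by upgrading the server package. Package name is an assumption based on the
# azureml_inference_server_http module referenced above.
pip install --upgrade azureml-inference-server-http

# Restart the server against the scoring script, as described in FAQ 3.
azmlinfsrv --entry_script score.py
```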

articles/machine-learning/resource-curated-environments.md

Lines changed: 1 addition & 5 deletions
@@ -29,7 +29,7 @@ This article lists the curated environments with latest framework versions in Az
>[!IMPORTANT]
> To view more information about curated environment packages and versions, visit the Environments tab in the Azure Machine Learning [studio](./how-to-manage-environments-in-studio.md).

-## Training curated environments
+## Curated environments

### PyTorch

@@ -89,9 +89,5 @@ Azure ML pipeline training workflows that use AutoML automatically selects a cur

For more information on AutoML and Azure ML pipelines, see [use automated ML in an Azure Machine Learning pipeline in Python](how-to-use-automlstep-in-pipelines.md).

-## Inference curated environments and prebuilt docker images
-
-[!INCLUDE [list-of-inference-prebuilt-docker-images](../../includes/aml-inference-list-prebuilt-docker-images.md)]
-
## Support
Version updates for supported environments, including the base images they reference, are released every two weeks to address vulnerabilities no older than 30 days. Based on usage, some environments may be deprecated (hidden from the product but usable) to support more common machine learning scenarios.

includes/aml-inference-list-prebuilt-docker-images.md

Lines changed: 9 additions & 39 deletions
@@ -7,47 +7,17 @@ ms.service: machine-learning
ms.author: ssambare
ms.custom: "include file"
ms.topic: "include"
-ms.date: 10/07/2021
+ms.date: 07/14/2022
---

* All the docker images run as non-root user.
-* We recommend using `latest` tag for docker images. Prebuilt docker images for inference are published to Microsoft container registry (MCR), to query list of tags available, follow [instructions on their GitHub repository](https://github.com/microsoft/ContainerRegistry#browsing-mcr-content).
+* We recommend using `latest` tag for docker images. Prebuilt docker images for inference are published to Microsoft container registry (MCR), to query list of tags available, follow [instructions on the GitHub repository](https://github.com/microsoft/ContainerRegistry#browsing-mcr-content).
+* If you want to use a specific tag for any inference docker image, we support from `latest` to the tag that is *6 months* old from the `latest`.

-### TensorFlow
+### Inference minimal base images

-Framework version | CPU/GPU | Pre-installed packages | MCR Path | Curated environment
---- | --- | --- | --- | --- |
-1.15 | CPU | pandas==0.25.1 </br> numpy=1.20.1 | `mcr.microsoft.com/azureml/tensorflow-1.15-ubuntu18.04-py37-cpu-inference:latest` | AzureML-tensorflow-1.15-ubuntu18.04-py37-cpu-inference |
-2.4 | CPU | numpy>=1.16.0 </br> pandas~=1.1.x | `mcr.microsoft.com/azureml/tensorflow-2.4-ubuntu18.04-py37-cpu-inference:latest` | AzureML-tensorflow-2.4-ubuntu18.04-py37-cpu-inference |
-2.4 | GPU | numpy >= 1.16.0 </br> pandas~=1.1.x </br> CUDA==11.0.3 </br> CuDNN==8.0.5.39 | `mcr.microsoft.com/azureml/tensorflow-2.4-ubuntu18.04-py37-cuda11.0.3-gpu-inference:latest` | AzureML-tensorflow-2.4-ubuntu18.04-py37-cuda11.0.3-gpu-inference |
-
-### PyTorch
-
-Framework version | CPU/GPU | Pre-installed packages | MCR Path | Curated environment
---- | --- | --- | --- | --- |
-1.6 | CPU | numpy==1.20.1 </br> pandas==0.25.1 | `mcr.microsoft.com/azureml/pytorch-1.6-ubuntu18.04-py37-cpu-inference:latest` | AzureML-pytorch-1.6-ubuntu18.04-py37-cpu-inference |
-1.7 | CPU | numpy>=1.16.0 </br> pandas~=1.1.x | `mcr.microsoft.com/azureml/pytorch-1.7-ubuntu18.04-py37-cpu-inference:latest` | AzureML-pytorch-1.7-ubuntu18.04-py37-cpu-inference |
-
-### SciKit-Learn
-
-Framework version | CPU/GPU | Pre-installed packages | MCR Path | Curated environment
---- | --- | --- | --- | --- |
-0.24.1 | CPU | scikit-learn==0.24.1 </br> numpy>=1.16.0 </br> pandas~=1.1.x | `mcr.microsoft.com/azureml/sklearn-0.24.1-ubuntu18.04-py37-cpu-inference:latest` | AzureML-sklearn-0.24.1-ubuntu18.04-py37-cpu-inference |
-
-### ONNX Runtime
-
-Framework version | CPU/GPU | Pre-installed packages | MCR Path | Curated environment
---- | --- | --- | --- | --- |
-1.6 | CPU | numpy>=1.16.0 </br> pandas~=1.1.x | `mcr.microsoft.com/azureml/onnxruntime-1.6-ubuntu18.04-py37-cpu-inference:latest` | AzureML-onnxruntime-1.6-ubuntu18.04-py37-cpu-inference |
-
-### XGBoost
-
-Framework version | CPU/GPU | Pre-installed packages | MCR Path | Curated environment
---- | --- | --- | --- | --- |
-0.9 | CPU | scikit-learn==0.23.2 </br> numpy==1.20.1 </br> pandas==0.25.1 | `mcr.microsoft.com/azureml/xgboost-0.9-ubuntu18.04-py37-cpu-inference:latest` | AzureML-xgboost-0.9-ubuntu18.04-py37-cpu-inference |
-
-### No framework
-
-Framework version | CPU/GPU | Pre-installed packages | MCR Path | Curated environment
---- | --- | --- | --- | --- |
-NA | CPU | NA | `mcr.microsoft.com/azureml/minimal-ubuntu18.04-py37-cpu-inference:latest` | AzureML-minimal-ubuntu18.04-py37-cpu-inference |
+Framework version | CPU/GPU | Pre-installed packages | MCR Path
+--- | --- | --- | --- |
+NA | CPU | NA | `mcr.microsoft.com/azureml/minimal-ubuntu18.04-py37-cpu-inference:latest`
+NA | GPU | NA | `mcr.microsoft.com/azureml/minimal-ubuntu18.04-py37-cuda11.0.3-gpu-inference:latest`
+NA | CPU | NA | `mcr.microsoft.com/azureml/minimal-ubuntu20.04-py38-cpu-inference:latest`
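Because the include now supports only tags between `latest` and six months behind it, it can help to check which tags are currently published before pinning one. A sketch, assuming MCR exposes the standard registry `tags/list` endpoint described in the linked microsoft/ContainerRegistry browsing instructions:

```bash
# Illustrative sketch: list the tags currently published for the minimal CPU inference
# image. The tags/list endpoint is assumed to be available per the linked
# microsoft/ContainerRegistry instructions; verify against those instructions.
curl -s https://mcr.microsoft.com/v2/azureml/minimal-ubuntu20.04-py38-cpu-inference/tags/list
```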
