Commit 8d957b2 (2 parents: f92adee + a86fe5b)

Merge pull request #209289 from shohei1029/patch-6
Update how-to-inference-server-http.md

1 file changed (+29, -5 lines)

articles/machine-learning/how-to-inference-server-http.md

@@ -133,7 +133,7 @@ The following steps explain how the Azure Machine Learning inference HTTP server
 There are two ways to use Visual Studio Code (VSCode) and the [Python Extension](https://marketplace.visualstudio.com/items?itemName=ms-python.python) to debug with the [azureml-inference-server-http](https://pypi.org/project/azureml-inference-server-http/) package.
 
 1. User starts the AzureML Inference Server in a command line and uses VSCode + Python Extension to attach to the process.
-1. User sets up the `launch.json` in the VSCode and start the AzureML Inference Server within VSCode.
+1. User sets up the `launch.json` in the VSCode and starts the AzureML Inference Server within VSCode.
 
 **launch.json**
 ```json
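(Editor's aside, not part of the diff: the `launch.json` contents are cut off by the hunk boundary above. As an illustration only, a generic VSCode Python *attach* configuration for the first workflow, attaching to an already running server process, might look like the following. The configuration name is hypothetical and the actual file in the article may differ.)

```json
{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Attach to AzureML Inference Server",
            "type": "python",
            "request": "attach",
            "processId": "${command:pickProcess}",
            "justMyCode": false
        }
    ]
}
```

`"request": "attach"` with `${command:pickProcess}` makes VSCode prompt for the server's process ID at attach time, which matches the command-line workflow described in item 1.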
@@ -167,15 +167,39 @@ TypeError: register() takes 3 positional arguments but 4 were given
 
 ```
 
-You have **Flask 2** installed in your python environment but are running a server (< 0.7.0) that does not support Flask 2. To resolve, please upgrade to the latest version of server.
+You have **Flask 2** installed in your Python environment but are running a version of `azureml-inference-server-http` that doesn't support Flask 2. Support for Flask 2 is added in `azureml-inference-server-http>=0.7.0`, which is also included in `azureml-defaults>=1.44`.
 
-### 2. I encountered an ``ImportError`` or ``ModuleNotFoundError`` on modules ``opencensus``, ``jinja2``, ``MarkupSafe``, or ``click`` during startup like the following:
+1. If you're not using this package in an AzureML docker image, use the latest version of
+`azureml-inference-server-http` or `azureml-defaults`.
+
+2. If you're using this package with an AzureML docker image, make sure you're using an image built in or after July
+2022. The image version is available in the container logs. You should be able to find a log similar to the following:
+
+```
+2022-08-22T17:05:02,147738763+00:00 | gunicorn/run | AzureML Container Runtime Information
+2022-08-22T17:05:02,161963207+00:00 | gunicorn/run | ###############################################
+2022-08-22T17:05:02,168970479+00:00 | gunicorn/run |
+2022-08-22T17:05:02,174364834+00:00 | gunicorn/run |
+2022-08-22T17:05:02,187280665+00:00 | gunicorn/run | AzureML image information: openmpi4.1.0-ubuntu20.04, Materializaton Build:20220708.v2
+2022-08-22T17:05:02,188930082+00:00 | gunicorn/run |
+2022-08-22T17:05:02,190557998+00:00 | gunicorn/run |
+```
+
+The build date of the image appears after "Materialization Build", which in the above example is `20220708`, or July 8, 2022. This image is compatible with Flask 2. If you don't see a banner like this in your container log, your image is out-of-date and should be updated. If you're using a CUDA image and are unable to find a newer one, check whether your image is deprecated in [AzureML-Containers](https://github.com/Azure/AzureML-Containers); if it is, you should be able to find replacements there.
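(Editor's aside, not part of the diff: the build-date check described above can be automated. This is a sketch using only the Python standard library; the log line is copied verbatim from the example output, and the July 2022 cutoff comes from the paragraph above.)

```python
import re
from datetime import date

# Log line copied from the example container output above.
log_line = (
    "2022-08-22T17:05:02,187280665+00:00 | gunicorn/run | "
    "AzureML image information: openmpi4.1.0-ubuntu20.04, "
    "Materializaton Build:20220708.v2"
)

# Extract the yyyymmdd build date that follows "Build:".
match = re.search(r"Build:(\d{4})(\d{2})(\d{2})", log_line)
build_date = date(*map(int, match.groups()))

# Per the article, images built in or after July 2022 support Flask 2.
flask2_ok = build_date >= date(2022, 7, 1)
print(build_date.isoformat(), flask2_ok)  # prints: 2022-07-08 True
```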
+
+If this is an online endpoint, you can also find the logs under "Deployment logs" on the [online endpoint page in Azure Machine Learning studio](https://ml.azure.com/endpoints). If you deploy with SDK v1 and don't explicitly specify an image in your deployment configuration, the deployment defaults to a version of `openmpi4.1.0-ubuntu20.04` that matches your local SDK toolset, which may not be the latest version of the image. For example, SDK 1.43 defaults to `openmpi4.1.0-ubuntu20.04:20220616`, which is incompatible. Make sure you use the latest SDK for your deployment.
+
+If for some reason you're unable to update the image, you can temporarily avoid the issue by pinning `azureml-defaults==1.43` or `azureml-inference-server-http~=0.4.13`, which installs the older version of the server with `Flask 1.0.x`.
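(Editor's aside, not part of the diff: as a sketch of the temporary workaround above, assuming a pip requirements file, the pin would look like this. Use one of the two lines, since either pulls in the older Flask 1.0.x server.)

```
# requirements.txt (sketch): pin ONE of these to stay on the Flask 1.0.x server
azureml-defaults==1.43
# azureml-inference-server-http~=0.4.13
```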
+
+See also [Troubleshooting online endpoints deployment](how-to-troubleshoot-online-endpoints.md#error-resourcenotready).
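(Editor's aside, not part of the diff: before upgrading or pinning, it can help to check which of the packages discussed above are installed and at what versions. A minimal sketch using only the standard library:)

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(package):
    """Return the installed version string, or None if the package is absent."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None

# Flask 2 alongside a pre-0.7.0 server is the incompatible combination
# described in this troubleshooting entry.
for pkg in ("flask", "azureml-inference-server-http", "azureml-defaults"):
    print(pkg, installed_version(pkg))
```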
+
+### 2. I encountered an ``ImportError`` or ``ModuleNotFoundError`` on modules ``opencensus``, ``jinja2``, ``MarkupSafe``, or ``click`` during startup like the following message:
 
 ```bash
 ImportError: cannot import name 'Markup' from 'jinja2'
 ```
 
-Older versions (<= 0.4.10) of the server did not pin Flask's dependency to compatible versions. This is fixed in the latest version of the server.
+Older versions (<= 0.4.10) of the server didn't pin Flask's dependency to compatible versions. This problem is fixed in the latest version of the server.
 
 ### 3. Do I need to reload the server when changing the score script?
 
@@ -188,4 +212,4 @@ The Azure Machine Learning inference server runs on Windows & Linux based operat
 ## Next steps
 
 * For more information on creating an entry script and deploying models, see [How to deploy a model using Azure Machine Learning](how-to-deploy-managed-online-endpoints.md).
-* Learn about [Prebuilt docker images for inference](concept-prebuilt-docker-images-inference.md)
\ No newline at end of file
+* Learn about [Prebuilt docker images for inference](concept-prebuilt-docker-images-inference.md)
