# Troubleshooting prebuilt Docker images for inference
Learn how to troubleshoot problems you might see when using prebuilt Docker images for inference with Azure Machine Learning.
> [!IMPORTANT]
> Using [Python package extensibility for prebuilt Docker images](how-to-prebuilt-docker-images-inference-python-extensibility.md) with Azure Machine Learning is currently in preview. Preview functionality is provided "as-is," with no guarantee of support or service level agreement. For more information, see the [Supplemental terms of use for Microsoft Azure previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
## Model deployment failed
If model deployment fails, no logs are created in [Azure Machine Learning studio](https://ml.azure.com/) and `service.get_logs()` returns no logs.
If there's a problem in the `init()` function of `score.py`, `service.get_logs()` returns logs for that failure.
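As a minimal sketch (the model-loading details are illustrative stand-ins, not the service's actual API), a `score.py` whose `init()` and `run()` log enough to diagnose failures might look like:

```python
import json
import logging
import os

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

model = None  # loaded once per container in init()


def init():
    """Runs once at container start; an exception here surfaces in service.get_logs()."""
    global model
    # AZUREML_MODEL_DIR is set by the service; the "." fallback is for local debugging.
    model_dir = os.getenv("AZUREML_MODEL_DIR", ".")
    logger.info("Loading model from %s", model_dir)
    model = {"model_dir": model_dir}  # stand-in for a real model load


def run(raw_data):
    """Runs per request; return something JSON-serializable."""
    payload = json.loads(raw_data)
    logger.info("Scoring %d record(s)", len(payload.get("data", [])))
    return {"echo": payload["data"]}  # stand-in for model.predict(...)
```

Calling `init()` and `run()` directly in a local Python session is often the fastest way to surface exceptions before deploying.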
You need to run the container locally using one of the following commands and replace `<MCR-path>` with an image path. For a list of the images and paths, see [Prebuilt Docker images for inference](../concept-prebuilt-docker-images-inference.md).
The local inference server lets you quickly debug your entry script (`score.py`). If the underlying score script has a bug, the server fails to initialize or serve the model and instead throws an exception that points to the location where the issue occurred. [Learn more about the Azure Machine Learning inference HTTP server](../how-to-inference-server-http.md)
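The server ships in the `azureml-inference-server-http` Python package; a typical local run (assuming your entry script is `score.py` in the current directory) looks like:

```shell
# Install the local inference server, then point it at your entry script.
pip install azureml-inference-server-http
azmlinfsrv --entry_script score.py
```

If `score.py` has a bug, the startup output shows the exception and traceback immediately, without a full deployment cycle.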
## For common model deployment issues
For problems when deploying a model from Azure Machine Learning to Azure Container Instances (ACI) or Azure Kubernetes Service (AKS), see [Troubleshoot model deployment](how-to-troubleshoot-deployment.md).
## init() or run() failing to write a file
The HTTP server in our prebuilt Docker images runs as a *non-root user*, so it might not have access rights to all directories.
Only write to directories you have access rights to. For example, the `/tmp` directory in the container.
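For example (the file name here is illustrative), writing scratch output via `tempfile` keeps you inside a directory the non-root server user can write to:

```python
import os
import tempfile


def write_scratch_file(contents: str) -> str:
    """Write to the container's temp directory (/tmp) instead of a read-only path."""
    path = os.path.join(tempfile.gettempdir(), "scoring-scratch.txt")
    with open(path, "w") as f:
        f.write(contents)
    return path
```

Using `tempfile.gettempdir()` rather than a hard-coded path also keeps the code working when you test it outside the container.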
## Extra Python packages not installed
* Check if there's a typo in the environment variable or file name.
* Check the container log to see whether `pip install -r <your_requirements.txt>` ran successfully.
* Check whether the source directory is set correctly in the [inference config](/python/api/azureml-core/azureml.core.model.inferenceconfig#constructor) constructor.
* If the installation isn't found and the log says "file not found," check whether the file name shown in the log is correct.
* If installation started but failed or timed out, try to install the same `requirements.txt` locally with the same Python and pip versions in a clean environment (that is, no cache directory: `pip install --no-cache-dir -r requirements.txt`). See if the problem can be reproduced locally.
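A quick way to get such a clean environment is a throwaway virtual environment (the paths and placeholder requirements file below are illustrative; substitute your deployment's real `requirements.txt`):

```shell
set -e
# Stand-in requirements file; use your deployment's real one instead.
printf '# your pinned dependencies go here\n' > /tmp/requirements.txt
# Fresh interpreter with no preinstalled site-packages.
python3 -m venv /tmp/clean-env
# --no-cache-dir mirrors the image build: no local wheel cache to mask failures.
/tmp/clean-env/bin/pip install --no-cache-dir -r /tmp/requirements.txt
echo "install succeeded"
```

If the install fails the same way locally, the problem is in the requirements themselves rather than in the prebuilt image.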
## Mounting solution failed
## Building an image based on the prebuilt Docker image failed
* If the build fails during apt package installation, check whether the user is set to root before running the apt command. (Make sure to switch back to the non-root user afterward.)
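In Dockerfile terms, that pattern looks like the following sketch (the non-root user name and the apt package are illustrative; replace `<MCR-path>` with an image path from the linked list):

```dockerfile
FROM <MCR-path>

# apt needs root; switch explicitly before installing.
USER root
RUN apt-get update && \
    apt-get install -y --no-install-recommends libgomp1 && \
    rm -rf /var/lib/apt/lists/*

# Switch back so the HTTP server keeps running as a non-root user.
USER dockeruser
```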
## Run doesn't complete on GPU local deployment
GPU base images can't be used for local deployment, unless the local deployment is on Azure Machine Learning compute.
```
/var/azureml-app
```
* If the `ENTRYPOINT` is changed in the newly built image, then the HTTP server and related components need to be loaded by `runsvdir /var/runit`.
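If you do override `ENTRYPOINT`, a sketch of restoring the server startup might look like this (the script name is illustrative; replace `<MCR-path>` with an image path from the linked list):

```dockerfile
FROM <MCR-path>

# A custom entrypoint replaces the image's startup, so your script must finish by
# exec-ing the supervisor that launches the HTTP server and related components:
#   exec runsvdir /var/runit
COPY entrypoint.sh /usr/local/bin/entrypoint.sh
ENTRYPOINT ["/usr/local/bin/entrypoint.sh"]
```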