In some cases, the [prebuilt Docker images for model inference](../concept-prebuilt-docker-images-inference.md) and [extensibility](./how-to-prebuilt-docker-images-inference-python-extensibility.md) solutions for Azure Machine Learning may not meet your inference service needs.
In this case, you can use a Dockerfile to create a new image, using one of the prebuilt images as the starting point. By extending from an existing prebuilt Docker image, you can use the Azure Machine Learning network stack and libraries without creating an image from scratch.
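As a rough sketch of this approach with the v1 Python SDK, you can attach a small Dockerfile to an Azure Machine Learning environment; the environment name, base image path, and extra package below are placeholders to adapt to your own setup:

```python
from azureml.core import Environment

env = Environment(name="my-extended-prebuilt-env")  # hypothetical name

# Build on top of a prebuilt inference image by supplying a Dockerfile.
# Replace <prebuilt-image-path> with an image from the prebuilt images list.
env.docker.base_image = None
env.docker.base_dockerfile = """
FROM <prebuilt-image-path>
RUN pip install --no-cache-dir scikit-learn
"""

# The prebuilt image already contains Python and the inference stack,
# so tell Azure ML not to build a conda environment on top of it.
env.python.user_managed_dependencies = True
```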
Using a Dockerfile allows for full customization of the image before deployment.

The main tradeoff for this approach is that an extra image build will take place during deployment, which slows down the deployment process. If you can use the [Python package extensibility](./how-to-prebuilt-docker-images-inference-python-extensibility.md) method, deployment will be faster.
## Prerequisites
* An Azure Machine Learning workspace. For a tutorial on creating a workspace, see [Get started with Azure Machine Learning](../quickstart-create-resources.md).
* Familiarity with authoring a [Dockerfile](https://docs.docker.com/engine/reference/builder/).
* Either a local working installation of [Docker](https://www.docker.com/), including the `docker` CLI, **OR** an Azure Container Registry (ACR) associated with your Azure Machine Learning workspace.

The [prebuilt Docker images for model inference](concept-prebuilt-docker-images-inference.md) contain packages for popular machine learning frameworks. There are two methods that can be used to add Python packages __without rebuilding the Docker image__: dynamic installation from a `requirements.txt` file, or mounting a directory of pre-installed Python packages.
## Prerequisites
* An Azure Machine Learning workspace. For a tutorial on creating a workspace, see [Get started with Azure Machine Learning](../quickstart-create-resources.md).
* Familiarity with using Azure Machine Learning [environments](../how-to-use-environments.md).
* Familiarity with [Where and how to deploy models](how-to-deploy-and-where.md) with Azure Machine Learning.
<a id="dynamic"></a>

To extend your prebuilt Docker container image through a `requirements.txt` file, follow these steps:
1. Create a `requirements.txt` file alongside your `score.py` script.
2. Add **all** of your required packages to the `requirements.txt` file.
3. Set the `AZUREML_EXTRA_REQUIREMENTS_TXT` environment variable in your Azure Machine Learning [environment](../how-to-use-environments.md) to the location of the `requirements.txt` file.

Once deployed, the packages will automatically be restored for your score script.
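As a rough illustration of step 3 with the v1 Python SDK — the environment name, image path, and folder layout (a `my_folder` directory holding both `score.py` and `requirements.txt`) are assumptions to adapt:

```python
from azureml.core import Environment
from azureml.core.model import InferenceConfig

env = Environment(name="prebuilt-with-extra-requirements")  # hypothetical name
env.docker.base_image = "<prebuilt-image-path>"  # pick one from the prebuilt images list
env.python.user_managed_dependencies = True

# Tell the prebuilt image where to find the requirements.txt to install at startup;
# the path is assumed to be relative to the deployed source directory below.
env.environment_variables = {"AZUREML_EXTRA_REQUIREMENTS_TXT": "requirements.txt"}

inference_config = InferenceConfig(
    source_directory="my_folder",  # contains score.py and requirements.txt
    entry_script="score.py",
    environment=env,
)
```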
## Best Practices
* Refer to the [Load registered model](how-to-deploy-advanced-entry-script.md#load-registered-models) docs. When you register a model directory, don't include your scoring script, your mounted dependencies directory, or `requirements.txt` within that directory.
* For more information on how to load a registered or local model, see [Where and how to deploy](how-to-deploy-and-where.md?tabs=azcli#define-a-dummy-entry-script).
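
For instance, here's a minimal v1 SDK sketch of registering only the model artifacts (the paths and names are hypothetical), keeping `score.py` and `requirements.txt` in a separate source directory:

```python
from azureml.core import Workspace
from azureml.core.model import Model

ws = Workspace.from_config()

# The registered directory holds only model files; score.py, requirements.txt,
# and any mounted dependencies directory live elsewhere (for example, my_folder/).
model = Model.register(
    workspace=ws,
    model_path="models",              # local folder containing only model artifacts
    model_name="my-registered-model",
)
```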
## Next steps
To learn more about deploying a model, see [How to deploy a model](how-to-deploy-and-where.md).
To learn how to troubleshoot prebuilt docker image deployments, see [how to troubleshoot prebuilt Docker image deployments](how-to-troubleshoot-prebuilt-docker-image-inference.md).
`articles/machine-learning/v1/how-to-troubleshoot-prebuilt-docker-image-inference.md`

Learn how to troubleshoot problems you may see when using prebuilt docker images for inference with Azure Machine Learning.
If model deployment fails, you won't see logs in [Azure Machine Learning studio](https://ml.azure.com/) and `service.get_logs()` will return None.
If there is a problem in the `init()` function of `score.py`, `service.get_logs()` returns logs for it.
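A small sketch of pulling those logs with the v1 SDK, assuming a deployed service named `my-service`:

```python
from azureml.core import Workspace
from azureml.core.webservice import Webservice

ws = Workspace.from_config()

# Look up the existing deployment by name and print its logs; this returns None
# when the deployment itself failed before the container could start.
service = Webservice(workspace=ws, name="my-service")
print(service.get_logs())
```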
If no logs are available, you'll need to run the container locally using one of the commands shown below, replacing `<MCR-path>` with an image path. For a list of the images and paths, see [Prebuilt Docker images for inference](../concept-prebuilt-docker-images-inference.md).

The local inference server allows you to quickly debug your entry script (`score.py`). If the underlying score script has a bug, the server will fail to initialize or serve the model. Instead, it throws an exception that points to the location where the issue occurred. [Learn more about the Azure Machine Learning inference HTTP server](../how-to-inference-server-http.md).
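Once the local server is running against your `score.py` (for example, via the `azmlinfsrv` command installed by the `azureml-inference-server-http` package), you can exercise the entry script with a plain HTTP request. A sketch, assuming the server's default local port and a JSON payload shaped for your `run()` function:

```python
import json

import requests

# Hypothetical payload; shape it to whatever your score.py run() expects.
payload = {"data": [[1.0, 2.0, 3.0, 4.0]]}

response = requests.post(
    "http://127.0.0.1:5001/score",  # default local scoring endpoint
    data=json.dumps(payload),
    headers={"Content-Type": "application/json"},
)
print(response.status_code, response.text)
```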
## For common model deployment issues
For problems when deploying a model from Azure Machine Learning to Azure Container Instances (ACI) or Azure Kubernetes Service (AKS), see [Troubleshoot model deployment](how-to-troubleshoot-deployment.md).