
Commit 74eae46

Merge pull request #210947 from xanwal/alwallace/debug-endpoints-locally-vscode

Updated debug MOE with visual studio code article

2 parents 670b7de + 0bc061b

File tree: 1 file changed, +225 −0 lines

articles/machine-learning/how-to-debug-managed-online-endpoints-visual-studio-code.md (225 additions, 0 deletions)
@@ -18,6 +18,13 @@ ms.devlang: azurecli
[!INCLUDE [cli v2](../../includes/machine-learning-cli-v2.md)]

[!INCLUDE [sdk v2](../../includes/machine-learning-sdk-v2.md)]

> [!IMPORTANT]
> SDK v2 is currently in public preview.
> The preview version is provided without a service level agreement, and it's not recommended for production workloads. Certain features might not be supported or might have constrained capabilities.
> For more information, see [Supplemental Terms of Use for Microsoft Azure Previews](https://azure.microsoft.com/support/legal/preview-supplemental-terms/).
Learn how to use the Visual Studio Code (VS Code) debugger to test and debug online endpoints locally before deploying them to Azure.

Azure Machine Learning local endpoints help you test and debug your scoring script, environment configuration, code configuration, and machine learning model locally.
@@ -42,6 +49,8 @@ The following table provides an overview of scenarios to help you choose what wo

## Prerequisites

# [Azure CLI](#tab/cli)

This guide assumes you have the following items installed locally on your PC.

- [Docker](https://docs.docker.com/engine/install/)
@@ -74,8 +83,57 @@ az account set --subscription <subscription>
az configure --defaults workspace=<workspace> group=<resource-group> location=<location>
```

# [Python](#tab/python)
[!INCLUDE [sdk v2](../../includes/machine-learning-sdk-v2.md)]

This guide assumes you have the following items installed locally on your PC.

- [Docker](https://docs.docker.com/engine/install/)
- [VS Code](https://code.visualstudio.com/#alt-downloads)
- [Azure CLI](/cli/azure/install-azure-cli)
- [Azure CLI `ml` extension (v2)](how-to-configure-cli.md)
- [Azure ML Python SDK (v2)](https://aka.ms/sdk-v2-install)

For more information, see the guide on [how to prepare your system to deploy managed online endpoints](how-to-deploy-managed-online-endpoints.md#prepare-your-system).

The examples in this article can be found in the [Debug online endpoints locally in Visual Studio Code](https://github.com/Azure/azureml-examples/blob/main/sdk/endpoints/online/managed/debug-online-endpoints-locally-in-visual-studio-code.ipynb) notebook within the [azureml-examples](https://github.com/azure/azureml-examples) repository. To run the code locally, clone the repo and then change directories to the notebook's parent directory `sdk/endpoints/online/managed`.

```azurecli
git clone https://github.com/Azure/azureml-examples --depth 1
cd azureml-examples
cd sdk/endpoints/online/managed
```

Import the required modules:

```python
from azure.ai.ml import MLClient
from azure.ai.ml.entities import (
    ManagedOnlineEndpoint,
    ManagedOnlineDeployment,
    Model,
    CodeConfiguration,
    Environment,
)
from azure.identity import DefaultAzureCredential, AzureCliCredential
```

Set up variables for the workspace and endpoint:

```python
subscription_id = "<SUBSCRIPTION_ID>"
resource_group = "<RESOURCE_GROUP>"
workspace_name = "<AML_WORKSPACE_NAME>"

endpoint_name = "<ENDPOINT_NAME>"
```

---
## Launch development container

# [Azure CLI](#tab/cli)

Azure Machine Learning local endpoints use Docker and VS Code development containers (dev container) to build and configure a local debugging environment. With dev containers, you can take advantage of VS Code features from inside a Docker container. For more information on dev containers, see [Create a development container](https://code.visualstudio.com/docs/remote/create-dev-container).

To debug online endpoints locally in VS Code, use the `--vscode-debug` flag when creating or updating an Azure Machine Learning online deployment. The following command uses a deployment example from the examples repo:
@@ -104,6 +162,74 @@ You'll use a few VS Code extensions to debug your deployments in the dev contain
> [!IMPORTANT]
> Before starting your debug session, make sure that the VS Code extensions have finished installing in your dev container.

# [Python](#tab/python)

Azure Machine Learning local endpoints use Docker and VS Code development containers (dev container) to build and configure a local debugging environment. With dev containers, you can take advantage of VS Code features from inside a Docker container. For more information on dev containers, see [Create a development container](https://code.visualstudio.com/docs/remote/create-dev-container).

Get a handle to the workspace:

```python
credential = AzureCliCredential()
ml_client = MLClient(
    credential,
    subscription_id=subscription_id,
    resource_group_name=resource_group,
    workspace_name=workspace_name,
)
```

To debug online endpoints locally in VS Code, set the `vscode_debug` and `local` flags when creating or updating an Azure Machine Learning online deployment. The following code mirrors a deployment example from the examples repo:

```python
deployment = ManagedOnlineDeployment(
    name="blue",
    endpoint_name=endpoint_name,
    model=Model(path="../model-1/model"),
    code_configuration=CodeConfiguration(
        code="../model-1/onlinescoring", scoring_script="score.py"
    ),
    environment=Environment(
        conda_file="../model-1/environment/conda.yml",
        image="mcr.microsoft.com/azureml/openmpi3.1.2-ubuntu18.04:20210727.v1",
    ),
    instance_type="Standard_DS2_v2",
    instance_count=1,
)

deployment = ml_client.online_deployments.begin_create_or_update(
    deployment,
    local=True,
    vscode_debug=True,
)
```

> [!IMPORTANT]
> On Windows Subsystem for Linux (WSL), you'll need to update your PATH environment variable to include the path to the VS Code executable or use WSL interop. For more information, see [Windows interoperability with Linux](/windows/wsl/interop).

A Docker image is built locally. Any environment configuration or model file errors are surfaced at this stage of the process.

> [!NOTE]
> The first time you launch a new or updated dev container, it can take several minutes.

Once the image successfully builds, your dev container opens in a VS Code window.

You'll use a few VS Code extensions to debug your deployments in the dev container. Azure Machine Learning automatically installs these extensions in your dev container.

- Inference Debug
- [Pylance](https://marketplace.visualstudio.com/items?itemName=ms-python.vscode-pylance)
- [Jupyter](https://marketplace.visualstudio.com/items?itemName=ms-toolsai.jupyter)
- [Python](https://marketplace.visualstudio.com/items?itemName=ms-python.python)

> [!IMPORTANT]
> Before starting your debug session, make sure that the VS Code extensions have finished installing in your dev container.

---

## Start debug session

Once your environment is set up, use the VS Code debugger to test and debug your deployment locally.
@@ -136,6 +262,8 @@ For more information on the VS Code debugger, see [Debugging in VS Code](https:/

## Debug your endpoint

# [Azure CLI](#tab/cli)

Now that your application is running in the debugger, try making a prediction to debug your scoring script.

Use the `ml` extension `invoke` command to make a request to your local endpoint.
@@ -180,8 +308,66 @@ In this case, `<REQUEST-FILE>` is a JSON file that contains input data samples f
At this point, any breakpoints in your `run` function are caught. Use the debug actions to step through your code. For more information on debug actions, see the [debug actions guide](https://code.visualstudio.com/Docs/editor/debugging#_debug-actions).

# [Python](#tab/python)

Now that your application is running in the debugger, try making a prediction to debug your scoring script.

Use the `invoke` method on your `MLClient`'s `online_endpoints` operations to make a request to your local endpoint.

```python
endpoint = ml_client.online_endpoints.get(name=endpoint_name, local=True)

request_file_path = "../model-1/sample-request.json"

ml_client.online_endpoints.invoke(
    endpoint_name=endpoint_name, request_file=request_file_path, local=True
)
```

In this case, `request_file_path` points to a JSON file that contains input data samples for the model to make predictions on, similar to the following JSON:

```json
{"data": [
    [1,2,3,4,5,6,7,8,9,10],
    [10,9,8,7,6,5,4,3,2,1]
]}
```
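
If you're unsure whether a request file matches the shape your scoring script expects, a quick standard-library check like the following can catch malformed payloads before you invoke the endpoint. This is an illustrative sketch, not part of the SDK; the 10-feature row width is specific to this sample model.

```python
import json

# Sanity-check a request payload against the shape this sample expects:
# {"data": [[...numbers...], ...]}. The 10-feature row width is an
# assumption taken from this example, not a general requirement.
def validate_request(payload, n_features=10):
    rows = json.loads(payload)["data"]
    assert all(len(row) == n_features for row in rows), "unexpected row width"
    return rows

sample = '{"data": [[1,2,3,4,5,6,7,8,9,10], [10,9,8,7,6,5,4,3,2,1]]}'
rows = validate_request(sample)  # returns the two sample rows
```

Running this against a request file before invoking the endpoint separates "my JSON is malformed" errors from genuine scoring-script bugs.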
334+
335+
> [!TIP]
336+
> The scoring URI is the address where your endpoint listens for requests. The `as_dict` method of endpoint objects returns information similar to `show` in the Azure CLI. The endpoint object can be obtained through `.get`.
337+
>
338+
> ```python
339+
> endpoint = ml_client.online_endpoints.get(endpoint_name, local=True)
340+
> endpoint.as_dict()
341+
> ```
342+
>
343+
> The output should look similar to the following:
344+
>
345+
> ```json
346+
> {
347+
> "auth_mode": "aml_token",
348+
> "location": "local",
349+
> "name": "my-new-endpoint",
350+
> "properties": {},
351+
> "provisioning_state": "Succeeded",
352+
> "scoring_uri": "http://localhost:5001/score",
353+
> "tags": {},
354+
> "traffic": {},
355+
> "type": "online"
356+
>}
357+
>```
358+
>
359+
>The scoring URI can be found in the `scoring_uri` key.
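
Outside the debugger, that scoring URI can also be exercised with any HTTP client. A minimal standard-library sketch follows; the helper name `build_score_request` and the optional token handling are illustrative, not part of the SDK (local endpoints generally don't require auth, cloud endpoints do):

```python
import json
import urllib.request

# Build (but don't send) a scoring request for an endpoint. Sending it
# is then a one-liner: urllib.request.urlopen(req).
def build_score_request(scoring_uri, sample, token=None):
    headers = {"Content-Type": "application/json"}
    if token:
        # Only needed for cloud endpoints, which require a bearer token.
        headers["Authorization"] = f"Bearer {token}"
    body = json.dumps(sample).encode("utf-8")
    return urllib.request.Request(scoring_uri, data=body, headers=headers)

req = build_score_request(
    "http://localhost:5001/score",
    {"data": [[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]]},
)
```

Using the raw URI this way is handy for hitting breakpoints from tools like `curl` or a notebook while the debugger is attached.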

At this point, any breakpoints in your `run` function are caught. Use the debug actions to step through your code. For more information on debug actions, see the [debug actions guide](https://code.visualstudio.com/Docs/editor/debugging#_debug-actions).
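
The `run` function referred to here lives in your scoring script (`score.py`). As a reminder of the shape that script takes, here's a minimal self-contained sketch; the lambda is a stand-in for loading your real model from the `AZUREML_MODEL_DIR` path in `init()`:

```python
import json

# Minimal sketch of a scoring script. In a real score.py, init() loads
# the model from the AZUREML_MODEL_DIR environment variable; the stub
# below keeps the sketch runnable on its own.
model = None

def init():
    global model
    model = lambda rows: [sum(row) for row in rows]  # stub "model"

def run(raw_data):
    # raw_data is the JSON request body as a string; set breakpoints here.
    data = json.loads(raw_data)["data"]
    return model(data)

init()
print(run('{"data": [[1, 2, 3], [4, 5, 6]]}'))  # [6, 15]
```

Breakpoints placed inside `run` are the ones the debugger hits when the endpoint is invoked.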

---

## Edit your endpoint

# [Azure CLI](#tab/cli)

As you debug and troubleshoot your application, there are scenarios where you need to update your scoring script and configurations.

To apply changes to your code:
@@ -200,6 +386,45 @@ az ml online-deployment update --file <DEPLOYMENT-YAML-SPECIFICATION-FILE> --loc
Once the updated image is built and your development container launches, use the VS Code debugger to test and troubleshoot your updated endpoint.
# [Python](#tab/python)

As you debug and troubleshoot your application, there are scenarios where you need to update your scoring script and configurations.

To apply changes to your code:

1. Update your code
1. Restart your debug session using the `Developer: Reload Window` command in the command palette. For more information, see the [command palette documentation](https://code.visualstudio.com/docs/getstarted/userinterface#_command-palette).

> [!NOTE]
> Since the directory containing your code and endpoint assets is mounted onto the dev container, any changes you make in the dev container are synced with your local file system.

For more extensive changes involving updates to your environment and endpoint configuration, use your `MLClient`'s `online_deployments.update` method. Doing so triggers a full image rebuild with your changes.

```python
new_deployment = ManagedOnlineDeployment(
    name="green",
    endpoint_name=endpoint_name,
    model=Model(path="../model-2/model"),
    code_configuration=CodeConfiguration(
        code="../model-2/onlinescoring", scoring_script="score.py"
    ),
    environment=Environment(
        conda_file="../model-2/environment/conda.yml",
        image="mcr.microsoft.com/azureml/openmpi3.1.2-ubuntu18.04:20210727.v1",
    ),
    instance_type="Standard_DS2_v2",
    instance_count=2,
)

ml_client.online_deployments.update(new_deployment, local=True, vscode_debug=True)
```

Once the updated image is built and your development container launches, use the VS Code debugger to test and troubleshoot your updated endpoint.

---
## Next steps

- [Deploy and score a machine learning model by using a managed online endpoint (preview)](how-to-deploy-managed-online-endpoints.md)
