Learn how to use [NVIDIA Triton Inference Server](https://aka.ms/nvidia-triton-docs) in Azure Machine Learning with online endpoints.
Triton is multi-framework, open-source software that is optimized for inference. It supports popular machine learning frameworks like TensorFlow, ONNX Runtime, PyTorch, NVIDIA TensorRT, and more. It can be used for your CPU or GPU workloads.
In this article, you will learn how to deploy Triton and a model to a managed online endpoint. Information is provided on using the CLI (command line), Python SDK v2, and Azure Machine Learning studio.
> [!NOTE]
> * [NVIDIA Triton Inference Server](https://aka.ms/nvidia-triton-docs) is open-source, third-party software that is integrated in Azure Machine Learning.
> * While Azure Machine Learning online endpoints are generally available, _using Triton with an online endpoint deployment is still in preview_.
* You must have additional Python packages installed for scoring and may install them with the code below. They include:

    * Numpy - An array and numerical computing library
    * [Triton Inference Server Client](https://github.com/triton-inference-server/client) - Facilitates requests to the Triton Inference Server
    * Pillow - A library for image operations
    * Gevent - A networking library used when connecting to the Triton Server

    ```azurecli
    pip install numpy
    pip install tritonclient[http]
    pip install pillow
    pip install gevent
    ```
* Access to NCv3-series VMs for your Azure subscription.

    > [!IMPORTANT]
    > You may need to request a quota increase for your subscription before you can use this series of VMs. For more information, see [NCv3-series](../virtual-machines/ncv3-series.md).

NVIDIA Triton Inference Server requires a specific model repository structure, which contains a directory for each model and subdirectories for each model version.

The information in this document is based on using a model stored in ONNX format, so the directory structure of the model repository is `<model-repository>/<model-name>/1/model.onnx`. Specifically, this model performs image identification.
[!INCLUDE [clone repo & set defaults](../../includes/machine-learning-cli-prepare.md)]

The information in this article is based on the [Deploy a model to online endpoints using Triton](https://github.com/Azure/azureml-examples/blob/main/sdk/endpoints/online/triton/single-model/online-endpoints-triton.ipynb) notebook contained in the [azureml-examples](https://github.com/azure/azureml-examples) repository. To run the commands locally without having to copy/paste files, clone the repo and then change directories to `sdk/endpoints/online/triton/single-model` in the repo.
* An Azure subscription. If you don't have an Azure subscription, create a free account before you begin. Try the [free or paid version of Azure Machine Learning](https://azure.microsoft.com/free/).

* An Azure Machine Learning workspace. If you don't have one, use the steps in [Manage Azure Machine Learning workspaces in the portal or with the Python SDK](https://docs.microsoft.com/en-us/azure/machine-learning/how-to-manage-workspace?tabs=azure-portal) to create one.
This section shows how you can deploy to a managed online endpoint using the Azure CLI with the Machine Learning extension (v2).
> [!IMPORTANT]
> For Triton no-code-deployment, **[testing via local endpoints](how-to-deploy-managed-online-endpoints.md#deploy-and-debug-locally-by-using-local-endpoints)** is currently not supported.
1. Create a YAML configuration file for your endpoint. The following example configures the name and authentication mode of the endpoint. The one used in the following commands is located at `/cli/endpoints/online/triton/single-model/create-managed-endpoint.yml` in the azureml-examples repo you cloned earlier:
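    The endpoint file is small; a sketch of its shape follows (the endpoint name here is a placeholder, not the value used in the repo):

    ```yml
    $schema: https://azuremlschemas.azureedge.net/latest/managedOnlineEndpoint.schema.json
    name: my-triton-endpoint    # placeholder endpoint name
    auth_mode: aml_token
    ```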
1. Create a YAML configuration file for the deployment. The following example configures a deployment named __blue__ to the endpoint defined in the previous step. The one used in the following commands is located at `/cli/endpoints/online/triton/single-model/create-managed-deployment.yml` in the azureml-examples repo you cloned earlier:
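    A sketch of what such a deployment file can look like follows; the model name, local path, and instance type below are illustrative placeholders rather than the exact values in the repo:

    ```yml
    $schema: https://azuremlschemas.azureedge.net/latest/managedOnlineDeployment.schema.json
    name: blue
    endpoint_name: my-triton-endpoint     # placeholder endpoint name
    model:
      name: sample-densenet-onnx-model    # placeholder model name
      version: 1
      path: ./models                      # local path to the model repository
      type: triton_model                  # required for Triton no-code-deployment
    instance_type: Standard_NC6s_v3
    instance_count: 1
    ```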
> [!IMPORTANT]
> For Triton no-code-deployment (NCD) to work, setting **`type`** to **`triton_model`** (`type: triton_model`) is required. For more information, see [CLI (v2) model YAML schema](reference-yaml-model.md).
This section shows how you can define a Triton deployment to deploy to a managed online endpoint using the Azure Machine Learning Python SDK (v2).
> [!IMPORTANT]
> For Triton no-code-deployment, **[testing via local endpoints](how-to-deploy-managed-online-endpoints.md#deploy-and-debug-locally-by-using-local-endpoints)** is currently not supported.
1. To connect to a workspace, we need identifier parameters: a subscription ID, resource group, and workspace name.
    ```python
    subscription_id = "<SUBSCRIPTION_ID>"
    resource_group = "<RESOURCE_GROUP>"
    workspace_name = "<AML_WORKSPACE_NAME>"
    ```
1. Use the following command to set the name of the endpoint that will be created. In this example, a random name is created for the endpoint:
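    The original code for this step isn't shown here; a minimal sketch of generating a random endpoint name follows (the `endpoint-` prefix and suffix range are assumptions, not necessarily the notebook's exact scheme):

    ```python
    import random

    # Assumed naming scheme: a fixed prefix plus a random numeric suffix
    endpoint_name = f"endpoint-{random.randint(0, 10000)}"
    print(endpoint_name)
    ```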
1. We use these details in the `MLClient` from `azure.ai.ml` to get a handle to the required Azure Machine Learning workspace. We use the [default Azure authentication](https://docs.microsoft.com/en-us/python/api/azure-identity/azure.identity.defaultazurecredential?view=azure-python) for this tutorial. Check the [configuration notebook](../../jobs/configuration.ipynb) for more details on how to configure credentials and connect to a workspace.
    ```python
    from azure.ai.ml import MLClient
    from azure.identity import DefaultAzureCredential

    ml_client = MLClient(
        DefaultAzureCredential(),
        subscription_id,
        resource_group,
        workspace_name,
    )
    ```
1. Create a `ManagedOnlineEndpoint` object to configure the endpoint. The following example configures the name and authentication mode of the endpoint.
    ```python
    from azure.ai.ml.entities import ManagedOnlineEndpoint

    # The block was truncated in this copy; this completion follows the
    # article's earlier steps (endpoint_name variable, aml_token auth).
    endpoint = ManagedOnlineEndpoint(
        name=endpoint_name,
        auth_mode="aml_token",
    )
    ```
1. Create a `ManagedOnlineDeployment` object to configure the deployment. The following example configures a deployment named __blue__ to the endpoint defined in the previous step and defines a local model inline.
    ```python
    from azure.ai.ml.entities import ManagedOnlineDeployment, Model

    # The block was truncated in this copy; the model name, path, and
    # instance type below are illustrative placeholders.
    # type="triton_model" is required for Triton no-code-deployment.
    deployment = ManagedOnlineDeployment(
        name="blue",
        endpoint_name=endpoint_name,
        model=Model(
            name="sample-densenet-onnx-model",
            path="./models",
            type="triton_model",
        ),
        instance_type="Standard_NC6s_v3",
        instance_count=1,
    )
    ```