Commit 7946566

Merge pull request #258669 from Blackmist/patch-46

Update how-to-deploy-with-triton.md

2 parents: 40dc95f + 552aad0

File tree

1 file changed (+1, −1)


articles/machine-learning/how-to-deploy-with-triton.md

Lines changed: 1 addition & 1 deletion

@@ -25,7 +25,7 @@ Triton is multi-framework, open-source software that is optimized for inference.
 In this article, you will learn how to deploy Triton and a model to a [managed online endpoint](concept-endpoints-online.md#online-endpoints). Information is provided on using the CLI (command line), Python SDK v2, and Azure Machine Learning studio.
 
 > [!NOTE]
-> Use of the NVIDIA Triton Inference Server container is governed by the [NVIDIA AI Enterprise Software license agreement](https://www.nvidia.com/en-us/data-center/products/nvidia-ai-enterprise/eula/) and can be used for 90 days without an enterprise product subscription. For more information, see [https://www.nvidia.com/en-us/data-center/nvidia-ai-enterprise-on-azure-ml/](https://www.nvidia.com/en-us/data-center/nvidia-ai-enterprise-on-azure-ml/).
+> Use of the NVIDIA Triton Inference Server container is governed by the [NVIDIA AI Enterprise Software license agreement](https://www.nvidia.com/en-us/data-center/products/nvidia-ai-enterprise/eula/) and can be used for 90 days without an enterprise product subscription. For more information, see [NVIDIA AI Enterprise on Azure Machine Learning](https://www.nvidia.com/en-us/data-center/azure-ml).
 
 ## Prerequisites
 

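For context, the article touched by this commit covers deploying Triton and a model to a managed online endpoint using the CLI, the Python SDK v2, or the studio. As a rough sketch only, and not part of this commit or the article's own sample code, a no-code Triton deployment with the Python SDK v2 might look roughly like the following; the subscription, workspace, endpoint name, model path, and VM size are illustrative placeholders.

```python
# Illustrative sketch: deploying a Triton model to a managed online endpoint
# with the Azure ML Python SDK v2. All names, paths, and the VM size are
# placeholders, not values taken from this commit.
from azure.ai.ml import MLClient
from azure.ai.ml.entities import ManagedOnlineEndpoint, ManagedOnlineDeployment, Model
from azure.ai.ml.constants import AssetTypes
from azure.identity import DefaultAzureCredential

# Connect to a workspace (placeholder identifiers).
ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<SUBSCRIPTION_ID>",
    resource_group_name="<RESOURCE_GROUP>",
    workspace_name="<WORKSPACE>",
)

# Create the managed online endpoint.
endpoint = ManagedOnlineEndpoint(name="triton-endpoint", auth_mode="aml_token")
ml_client.online_endpoints.begin_create_or_update(endpoint).result()

# Point at a directory assumed to follow Triton's model-repository layout;
# with type=triton_model, no scoring script or environment is supplied here.
model = Model(path="./models", type=AssetTypes.TRITON_MODEL, name="triton-model")

deployment = ManagedOnlineDeployment(
    name="blue",
    endpoint_name=endpoint.name,
    model=model,
    instance_type="Standard_NC6s_v3",  # placeholder GPU SKU
    instance_count=1,
)
ml_client.online_deployments.begin_create_or_update(deployment).result()
```

The changed article also documents the equivalent CLI and studio flows; the sketch above only illustrates the SDK v2 path mentioned in the diff context.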