
Commit e71bec3

Update articles/ai-foundry/how-to/deploy-nvidia-inference-microservice.md
1 parent 1caede0 commit e71bec3

File tree

1 file changed (+1, -1 lines changed)


articles/ai-foundry/how-to/deploy-nvidia-inference-microservice.md

Lines changed: 1 addition & 1 deletion
@@ -76,7 +76,7 @@ Get improved TCO (total cost of ownership) and performance with NVIDIA NIMs offe
76  76
77  77      1. Select the checkbox to acknowledge understanding of pricing and terms of use, and then, select **Deploy**.
78  78
79      -       :::image type="content" source="../media/how-to/deploy-nvidia-inference-microservice/deploy-nvidia-inference-microservice.png" alt-text="A screenshot showing the deploy model button in the deployment wizard." lightbox="../media/how-to/deploy-nvidia-inference-microservice/deploy-nim.png":::
    79  +       :::image type="content" source="../media/how-to/deploy-nvidia-inference-microservice/deploy-nvidia-inference-microservice.png" alt-text="A screenshot showing the deploy model button in the deployment wizard." lightbox="../media/how-to/deploy-nvidia-inference-microservice/deploy-nvidia-inference-microservice.png":::
80  80
81  81
82  82      ## Consume NVIDIA NIM deployments
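The diff context ends at the article's "Consume NVIDIA NIM deployments" heading. NIM deployments typically expose an OpenAI-compatible chat completions route; as a hedged sketch only, the following shows what such a call might look like. The endpoint URL, API key, and model name are placeholders invented for illustration, not values from this commit or the article.

```python
# Hypothetical sketch: calling a NIM deployment's OpenAI-compatible
# chat completions route. ENDPOINT, API_KEY, and the model name are
# placeholders, not real values from this article.
import json
import urllib.request

ENDPOINT = "https://<your-deployment>.inference.ai.azure.com/v1/chat/completions"
API_KEY = "<your-api-key>"

def build_request(prompt: str) -> urllib.request.Request:
    """Build an authenticated POST request for the chat completions route."""
    body = json.dumps({
        "model": "<deployed-model-name>",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
    }).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_request("What is an NVIDIA NIM?")
# Sending the request requires a live deployment, so it is left commented out:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Since the route is OpenAI-compatible, an OpenAI-style client pointed at the deployment's base URL should also work in place of raw HTTP.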

0 commit comments
