
Commit 63cc64d

fix build warnings
1 parent 559dfa4 commit 63cc64d

File tree

1 file changed: +14 / -13 lines


articles/machine-learning/how-to-deploy-models-phi-3.md

Lines changed: 14 additions & 13 deletions
@@ -15,7 +15,7 @@ ms.custom: references_regions, generated
 zone_pivot_groups: azure-ai-model-catalog-samples-chat
 ---

-# How to use Phi-3 family chat models
+# How to use Phi-3 family chat models with Azure Machine Learning

 In this article, you learn about Phi-3 family chat models and how to use them.
 The Phi-3 family of small language models (SLMs) is a collection of instruction-tuned generative text models.
@@ -75,10 +75,10 @@ To use Phi-3 family chat models with Azure Machine Learning, you need the follow

 Phi-3 family chat models can be deployed to serverless API endpoints with pay-as-you-go billing. This kind of deployment provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance that organizations need.

-Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure Machine Learning studio, Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](deploy-models-serverless.md).
+Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure Machine Learning studio, Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](how-to-deploy-models-serverless.md).

 > [!div class="nextstepaction"]
-> [Deploy the model to serverless API endpoints](deploy-models-serverless.md)
+> [Deploy models as serverless API endpoints](how-to-deploy-models-serverless.md)

 **Deployment to a self-hosted managed compute**

@@ -87,7 +87,7 @@ Phi-3 family chat models can be deployed to our self-hosted managed inference so
 For deployment to a self-hosted managed compute, you must have enough quota in your subscription. If you don't have enough quota available, you can use our temporary quota access by selecting the option **I want to use shared quota and I acknowledge that this endpoint will be deleted in 168 hours.**

 > [!div class="nextstepaction"]
-> [Deploy the model to managed compute](../concepts/deployments-overview.md)
+> [Deploy the model to managed compute](concept-model-catalog.md#deploy-models-for-inference-with-managed-compute)

 ### The inference package installed

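The hunks above repoint the serverless-deployment links; the step those links document can be sketched offline. This is a minimal illustration, not part of the commit: the registry asset URI format is the standard `azureml://registries/...` form, but the model name, version, and endpoint name are placeholder assumptions, and the spec keys mirror the shape the SDK/CLI/ARM deployment paths consume.

```python
# Offline sketch: compose the model asset ID and a minimal serverless-endpoint
# spec. Model name/version and endpoint name are illustrative placeholders.
registry = "azureml"
model = "Phi-3-mini-4k-instruct"   # hypothetical model name
version = "4"                      # hypothetical version

model_id = f"azureml://registries/{registry}/models/{model}/versions/{version}"

endpoint_spec = {
    "name": "phi-3-mini-4k",  # endpoint name (assumption, not from the commit)
    "model_id": model_id,
}
print(model_id)
```

An actual deployment would pass a spec like this through the SDK for Python, the Azure CLI, or an ARM template, as the linked article describes.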
@@ -398,10 +398,11 @@ To use Phi-3 family chat models with Azure Machine Learning studio, you need the

 Phi-3 family chat models can be deployed to serverless API endpoints with pay-as-you-go billing. This kind of deployment provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance that organizations need.

-Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure Machine Learning studio, Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](deploy-models-serverless.md).
+Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure Machine Learning studio, Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](how-to-deploy-models-serverless.md).

 > [!div class="nextstepaction"]
-> [Deploy the model to serverless API endpoints](deploy-models-serverless.md)
+> [Deploy models as serverless API endpoints](how-to-deploy-models-serverless.md)
+

 **Deployment to a self-hosted managed compute**

@@ -410,7 +411,7 @@ Phi-3 family chat models can be deployed to our self-hosted managed inference so
 For deployment to a self-hosted managed compute, you must have enough quota in your subscription. If you don't have enough quota available, you can use our temporary quota access by selecting the option **I want to use shared quota and I acknowledge that this endpoint will be deleted in 168 hours.**

 > [!div class="nextstepaction"]
-> [Deploy the model to managed compute](../concepts/deployments-overview.md)
+> [Deploy the model to managed compute](concept-model-catalog.md#deploy-models-for-inference-with-managed-compute)

 ### The inference package installed

@@ -744,10 +745,10 @@ To use Phi-3 family chat models with Azure Machine Learning studio, you need the

 Phi-3 family chat models can be deployed to serverless API endpoints with pay-as-you-go billing. This kind of deployment provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance that organizations need.

-Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure Machine Learning studio, Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](deploy-models-serverless.md).
+Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure Machine Learning studio, Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](how-to-deploy-models-serverless.md).

 > [!div class="nextstepaction"]
-> [Deploy the model to serverless API endpoints](deploy-models-serverless.md)
+> [Deploy models as serverless API endpoints](how-to-deploy-models-serverless.md)

 **Deployment to a self-hosted managed compute**
@@ -756,7 +757,7 @@ Phi-3 family chat models can be deployed to our self-hosted managed inference so
 For deployment to a self-hosted managed compute, you must have enough quota in your subscription. If you don't have enough quota available, you can use our temporary quota access by selecting the option **I want to use shared quota and I acknowledge that this endpoint will be deleted in 168 hours.**

 > [!div class="nextstepaction"]
-> [Deploy the model to managed compute](../concepts/deployments-overview.md)
+> [Deploy the model to managed compute](concept-model-catalog.md#deploy-models-for-inference-with-managed-compute)

 ### The inference package installed
@@ -1102,10 +1103,10 @@ To use Phi-3 family chat models with Azure Machine Learning studio, you need the

 Phi-3 family chat models can be deployed to serverless API endpoints with pay-as-you-go billing. This kind of deployment provides a way to consume models as an API without hosting them on your subscription, while keeping the enterprise security and compliance that organizations need.

-Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure Machine Learning studio, Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](deploy-models-serverless.md).
+Deployment to a serverless API endpoint doesn't require quota from your subscription. If your model isn't deployed already, use the Azure Machine Learning studio, Azure Machine Learning SDK for Python, the Azure CLI, or ARM templates to [deploy the model as a serverless API](how-to-deploy-models-serverless.md).

 > [!div class="nextstepaction"]
-> [Deploy the model to serverless API endpoints](deploy-models-serverless.md)
+> [Deploy models as serverless API endpoints](how-to-deploy-models-serverless.md)

 **Deployment to a self-hosted managed compute**
@@ -1114,7 +1115,7 @@ Phi-3 family chat models can be deployed to our self-hosted managed inference so
 For deployment to a self-hosted managed compute, you must have enough quota in your subscription. If you don't have enough quota available, you can use our temporary quota access by selecting the option **I want to use shared quota and I acknowledge that this endpoint will be deleted in 168 hours.**

 > [!div class="nextstepaction"]
-> [Deploy the model to managed compute](../concepts/deployments-overview.md)
+> [Deploy the model to managed compute](concept-model-catalog.md#deploy-models-for-inference-with-managed-compute)

 ### A REST client

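The final hunk sits in the article's REST-client pivot. For that path, the request body sent to a deployed endpoint is a chat-completions payload; the sketch below only builds and serializes such a body. The parameter names follow the common chat-completions shape, and the message contents and sampling values are examples, not anything taken from this commit.

```python
import json

# Build a chat-completions request body of the kind a REST client would POST
# to a deployed endpoint. All values here are illustrative examples.
payload = {
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "How many languages are in the world?"},
    ],
    "temperature": 0.7,   # example sampling parameter
    "max_tokens": 256,    # example completion-length cap
}
body = json.dumps(payload)
print(body)
```

A REST client would send this serialized body with the endpoint's URL and key, which the article's later sections cover.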