articles/ai-foundry/model-inference/concepts/endpoints.md
+2 −2 (2 additions & 2 deletions)
@@ -41,8 +41,8 @@ To learn more about how to create deployments see [Add and configure model deplo
 Azure AI Foundry Services (formerly known as Azure AI Services) expose multiple endpoints depending on the type of work you're looking for:
 
 > [!div class="checklist"]
- > * Azure OpenAI endpoint (usually with the form `https://<resource-name>.services.ai.azure.com/models`)
- > * Azure AI inference endpoint (usually with the form `https://<resource-name>.openai.azure.com`)
+ > * Azure AI inference endpoint (usually with the form `https://<resource-name>.services.ai.azure.com/models`)
+ > * Azure OpenAI endpoint (usually with the form `https://<resource-name>.openai.azure.com`)
 
The **Azure AI inference endpoint** allows customers to use a single endpoint with the same authentication and schema to generate inference for the deployed models in the resource. All the models support this capability. This endpoint follows the [Azure AI Model Inference API](.././reference/reference-model-inference-api.md).
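The correction above boils down to which URL pattern belongs to which endpoint. As a minimal sketch (the helper names and the `contoso` resource name are illustrative, not from the article), the two endpoint URLs can be derived from a resource name like this:

```python
def inference_endpoint(resource_name: str) -> str:
    # Azure AI inference endpoint: a single endpoint, with shared
    # authentication and schema, for all models deployed in the resource.
    return f"https://{resource_name}.services.ai.azure.com/models"


def openai_endpoint(resource_name: str) -> str:
    # Azure OpenAI endpoint for the same resource.
    return f"https://{resource_name}.openai.azure.com"


print(inference_endpoint("contoso"))  # https://contoso.services.ai.azure.com/models
print(openai_endpoint("contoso"))     # https://contoso.openai.azure.com
```

Note the inference endpoint ends in `/models`, which is the detail the swapped bullets had attached to the wrong endpoint.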