articles/api-management/azure-ai-foundry-api.md (5 additions, 5 deletions)
@@ -27,16 +27,16 @@ API Management supports two client compatibility options for AI APIs. Choose the
 * **Azure AI** - Manage model endpoints in Azure AI Foundry that are exposed through the [Azure AI Model Inference API](/azure/ai-studio/reference/reference-model-inference-api).

-    Clients call the deployment at a `/models` endpoint such as `/my-model/models/chat/completions`. Deployment name is passed in the request body. Use this option if you want flexibility to switch between models exposed through the Azure AI Model Inference API and those deployed in Azure OpenAI Service.
+    Clients call the deployment at a `/models` endpoint such as `/my-model/models/chat/completions`. Deployment name is passed in the request body. Use this option if you want flexibility to switch between models exposed through the Azure AI Model Inference API and those deployed in Azure OpenAI in Foundry Models.

-* **Azure OpenAI Service** - Manage model endpoints deployed in Azure OpenAI Service.
+* **Azure OpenAI** - Manage model endpoints deployed in Azure OpenAI.

-    Clients call the deployment at an `/openai` endpoint such as `/openai/deployments/my-deployment/chat/completions`. Deployment name is passed in the request path. Use this option if your AI service only includes Azure OpenAI Service model deployments.
+    Clients call the deployment at an `/openai` endpoint such as `/openai/deployments/my-deployment/chat/completions`. Deployment name is passed in the request path. Use this option if your AI service only includes Azure OpenAI model deployments.

 ## Prerequisites

 - An existing API Management instance. [Create one if you haven't already](get-started-create-service-instance.md).
-- An Azure AI service in your subscription with one or more models deployed. Examples include models deployed in Azure AI Foundry or Azure OpenAI Service.
+- An Azure AI service in your subscription with one or more models deployed. Examples include Azure OpenAI or other models deployed in Azure AI Foundry.

 ## Import AI Foundry API using the portal
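The two client compatibility options in the hunk above translate into different request shapes at the API Management gateway. The following minimal sketch illustrates both; the gateway hostname, base paths, subscription key, and `api-version` values are placeholders and assumptions, not values taken from the article.

```python
import requests

GATEWAY = "https://contoso-apim.azure-api.net"  # hypothetical API Management gateway hostname
HEADERS = {"Ocp-Apim-Subscription-Key": "<your-subscription-key>"}  # placeholder subscription key

# Azure AI option: the call targets a /models route and the deployment (model)
# name travels in the request body.
azure_ai = requests.post(
    f"{GATEWAY}/my-model/models/chat/completions",
    headers=HEADERS,
    params={"api-version": "2024-05-01-preview"},  # assumed Model Inference API version
    json={
        "model": "my-model",  # deployment name in the body
        "messages": [{"role": "user", "content": "Hello"}],
    },
)

# Azure OpenAI option: the call targets an /openai route and the deployment
# name travels in the request path.
azure_openai = requests.post(
    f"{GATEWAY}/openai/deployments/my-deployment/chat/completions",
    headers=HEADERS,
    params={"api-version": "2024-02-01"},  # assumed Azure OpenAI API version
    json={"messages": [{"role": "user", "content": "Hello"}]},
)

print(azure_ai.status_code, azure_openai.status_code)
```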
@@ -67,7 +67,7 @@ To import an AI Foundry API to API Management:
 1. In **Base path**, enter a path that your API Management instance uses to access the deployment endpoint.
 1. Optionally select one or more **Products** to associate with the API.
 1. In **Client compatibility**, select either of the following based on the types of client you intend to support. See [Client compatibility options](#client-compatibility-options) for more information.
-    * **Azure OpenAI** - Select this option if your clients only need to access Azure OpenAI Service model deployments.
+    * **Azure OpenAI** - Select this option if your clients only need to access Azure OpenAI model deployments.
     * **Azure AI** - Select this option if your clients need to access other models in Azure AI Foundry.
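Once the import completes, existing Azure OpenAI SDK clients can point at the gateway instead of the service endpoint. A minimal sketch, assuming the API was imported with the **Azure OpenAI** client compatibility option, a hypothetical base path of `my-openai-api`, and an instance that accepts the subscription key in the `Ocp-Apim-Subscription-Key` header:

```python
from openai import AzureOpenAI  # openai>=1.0

# Hypothetical gateway hostname plus the base path configured during import.
client = AzureOpenAI(
    azure_endpoint="https://contoso-apim.azure-api.net/my-openai-api",
    api_key="<your-apim-subscription-key>",  # sent as the api-key header
    api_version="2024-02-01",  # assumed API version
    default_headers={"Ocp-Apim-Subscription-Key": "<your-apim-subscription-key>"},
)

# The SDK builds the /openai/deployments/my-deployment/chat/completions path itself,
# which matches the Azure OpenAI client compatibility option.
response = client.chat.completions.create(
    model="my-deployment",  # deployment name in Azure OpenAI
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```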