You can import AI model endpoints deployed in Azure AI Foundry to your API Management instance as APIs. Use AI gateway policies and other capabilities in API Management to simplify integration, improve observability, and enhance control over the model endpoints.
Learn more about managing AI APIs in API Management:
API Management supports two client compatibility options for AI APIs.
* **Azure AI** - Manage model endpoints in Azure AI Foundry that are exposed through the [Azure AI Model Inference API](/azure/ai-studio/reference/reference-model-inference-api).
    Clients call the deployment at a `/models` endpoint such as `/my-model/models/chat/completions`. The deployment name is passed in the request body (see the request sketch after this list). Use this option if you want flexibility to switch between models exposed through the Azure AI Model Inference API and those deployed in Azure OpenAI Service.
* **Azure OpenAI Service** - Manage model endpoints deployed in Azure OpenAI Service.
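For orientation, the following sketch contrasts how a client call differs between the two options once the API is exposed through API Management. The gateway hostname, API path, subscription key header, deployment names, and `api-version` values are illustrative assumptions, not values prescribed by this article.

```python
# Minimal sketch (not from this article): request shapes for the two client
# compatibility options. Hostname, paths, key header, deployment names, and
# api-version values are placeholders -- substitute your own.
import requests

GATEWAY = "https://contoso.azure-api.net/my-ai-api"            # hypothetical API URL in API Management
HEADERS = {"Ocp-Apim-Subscription-Key": "<subscription-key>"}  # default key header; your API may expect api-key instead
MESSAGES = [{"role": "user", "content": "Hello!"}]

# Azure AI option: a /models endpoint; the model (deployment) name goes in the body.
azure_ai = requests.post(
    f"{GATEWAY}/models/chat/completions",
    params={"api-version": "2024-05-01-preview"},              # placeholder version
    headers=HEADERS,
    json={"model": "my-model", "messages": MESSAGES},
)

# Azure OpenAI option: the deployment name is part of the URL path, not the body.
azure_openai = requests.post(
    f"{GATEWAY}/deployments/my-deployment/chat/completions",
    params={"api-version": "2024-10-21"},                      # placeholder version
    headers=HEADERS,
    json={"messages": MESSAGES},
)
print(azure_ai.status_code, azure_openai.status_code)
```

The practical difference is where the deployment name lives: in the request body for the Azure AI option, and in the URL path for the Azure OpenAI option.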
To import an AI Foundry API to API Management:
:::image type="content" source="media/azure-ai-foundry-api/ai-foundry-api.png" alt-text="Screenshot of creating an OpenAI-compatible API in the portal." :::
1. On the **Select AI service** tab:
    1. Select the **Subscription** in which to search for AI services. To get information about the model deployments in a service, select the **deployments** link next to the service name.

        :::image type="content" source="media/azure-ai-foundry-api/deployments.png" alt-text="Screenshot of deployments for an AI service in the portal.":::

    1. Select an AI service.
    1. Select **Next**.
1. Optionally select one or more **Products** to associate with the API.
1. In **Client compatibility**, select either of the following based on the types of client you intend to support. See [Client compatibility options](#client-compatibility-options) for more information.
    * **Azure OpenAI** - Select this option if your clients only need to access Azure OpenAI Service model deployments.
    * **Azure AI** - Select this option if your clients need to access other models in Azure AI Foundry.
1. Select **Next**.
:::image type="content" source="media/azure-ai-foundry-api/client-compatibility.png" alt-text="Screenshot of AI Foundry API configuration in the portal.":::
1. On the **Apply semantic caching** tab, optionally enter settings or accept defaults that define the policies to help optimize performance and reduce latency for the API:
    * [Enable semantic caching of responses](azure-openai-enable-semantic-caching.md)
1. On the **AI content safety** tab, optionally enter settings or accept defaults to configure the Azure AI Content Safety service to block prompts with unsafe content:
    * [Enforce content safety checks on LLM requests](llm-content-safety-policy.md)
1. Select **Review**.
1. After settings are validated, select **Create**.
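After the API is created, you might verify it from a client. The following is a minimal sketch using the OpenAI Python SDK pointed at the API Management gateway; the endpoint URL, deployment name, `api-version`, and the assumption that the subscription key is accepted in the `api-key` header are placeholders to adapt to your instance.

```python
# Post-import smoke test (a sketch, not from this article).
# Assumes the API uses the Azure OpenAI client compatibility option and that
# the API Management subscription key is accepted in the api-key header.
from openai import AzureOpenAI  # pip install openai

client = AzureOpenAI(
    azure_endpoint="https://contoso.azure-api.net/my-foundry-api",  # hypothetical API URL in API Management
    api_key="<apim-subscription-key>",                              # sent by the SDK as the api-key header
    api_version="2024-10-21",                                       # placeholder version
)

response = client.chat.completions.create(
    model="my-deployment",  # hypothetical deployment name in Azure AI Foundry
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```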
You can import self-hosted AI model endpoints to your API Management instance as APIs. Use AI gateway policies and other capabilities in API Management to simplify integration, improve observability, and enhance control over the model endpoints.
Learn more about managing AI APIs in API Management:
API Management supports two types of self-hosted language model APIs.
## Import language model API using the portal
To import a self-hosted language model API to API Management:
1. In the [Azure portal](https://portal.azure.com), navigate to your API Management instance.
1. In the left menu, under **APIs**, select **APIs** > **+ Add API**.
1. On the **Apply semantic caching** tab, optionally enter settings or accept defaults that define the policies to help optimize performance and reduce latency for the API:
    * [Enable semantic caching of responses](azure-openai-enable-semantic-caching.md)
1. On the **AI content safety** tab, optionally enter settings or accept defaults to configure the Azure AI Content Safety service to block prompts with unsafe content:
    * [Enforce content safety checks on LLM requests](llm-content-safety-policy.md)
1. Select **Review**.
1. After settings are validated, select **Create**.
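Once the self-hosted API is created, a quick way to exercise it is with an OpenAI-compatible client pointed at the gateway. The sketch below assumes you imported an OpenAI-compatible endpoint; the base URL, model name, and subscription key handling are hypothetical and should be adjusted to your configuration.

```python
# Minimal sketch (not from this article) for an imported OpenAI-compatible,
# self-hosted model endpoint. Base URL, model name, and key handling are placeholders.
from openai import OpenAI  # pip install openai

client = OpenAI(
    base_url="https://contoso.azure-api.net/my-llm-api/v1",               # hypothetical API URL in API Management
    api_key="not-used-by-backend",                                        # self-hosted backend may not require a key
    default_headers={"Ocp-Apim-Subscription-Key": "<subscription-key>"},  # API Management subscription key
)

response = client.chat.completions.create(
    model="my-local-model",  # model name served by the self-hosted runtime
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```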
Related import topics in `includes/api-management-define-api-topics.md`:
* [Import an Azure Function App API](../articles/api-management/import-function-app-as-api.md)
* [Import an Azure Logic App API](../articles/api-management/import-logic-app-as-api.md)
* [Import a Service Fabric service](/azure/service-fabric/service-fabric-tutorial-deploy-api-management)
* [Import an Azure AI Foundry API](../articles/api-management/azure-ai-foundry-api.md)
* [Import an Azure OpenAI API](../articles/api-management/azure-openai-api-from-specification.md)
* [Import an LLM API](../articles/api-management/openai-compatible-llm-api.md)
* [Import an OData API](../articles/api-management/import-api-from-odata.md)
* [Import SAP OData metadata](../articles/api-management/sap-api.md)
* [Import a gRPC API](../articles/api-management/grpc-api.md)