
Commit 71f4b11

Merge pull request #301535 from MeeraDi/439236
Rebrand - AI Foundry terms
2 parents 64fb190 + 0e263a4 commit 71f4b11

7 files changed (+27 −27 lines)

articles/api-management/api-management-authenticate-authorize-azure-openai.md

Lines changed: 1 addition & 1 deletion
@@ -103,7 +103,7 @@ Following are steps to configure your API Management instance to use a managed i
 ```

 > [!TIP]
-> An alternative to using the `authentication-managed-identity` and `set-header` policies shown in this example is to configure a [backend](backends.md) resource that directs API requests to the Azure OpenAI Service endpoint. In the backend configuration, enable managed identity authentication to the Azure OpenAI Service. Azure API Management automates these steps when importing an API directly from Azure OpenAI Service. For more information, see [Import API from Azure OpenAI Service](azure-openai-api-from-specification.md#option-1-import-api-from-azure-openai-service).
+> An alternative to using the `authentication-managed-identity` and `set-header` policies shown in this example is to configure a [backend](backends.md) resource that directs API requests to the Azure OpenAI Service endpoint. In the backend configuration, enable managed identity authentication to the Azure OpenAI Service. Azure API Management automates these steps when importing an API directly from Azure OpenAI Service. For more information, see [Import API from Azure OpenAI Service](azure-openai-api-from-specification.md#option-1-import-api-from-azure-openai).

 ## OAuth 2.0 authorization using identity provider

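The TIP in this hunk describes the `authentication-managed-identity` and `set-header` policies, which together attach a managed identity token to the upstream call. A rough Python sketch of that gateway behavior (the token value and function name here are illustrative stand-ins, not part of any real SDK; a real gateway acquires the token from its system-assigned managed identity):

```python
def apply_managed_identity_auth(headers: dict, token: str) -> dict:
    """Mimic what `authentication-managed-identity` plus `set-header` do:
    place the acquired token in the Authorization header the
    Azure OpenAI backend expects. Returns a new header dict."""
    out = dict(headers)
    out["Authorization"] = f"Bearer {token}"
    return out

# Hypothetical token string; in API Management the token is acquired
# for the https://cognitiveservices.azure.com resource.
headers = apply_managed_identity_auth({"Content-Type": "application/json"}, "eyJ...")
print(headers["Authorization"])  # Bearer eyJ...
```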

articles/api-management/azure-ai-foundry-api.md

Lines changed: 3 additions & 2 deletions
@@ -26,7 +26,7 @@ Learn more about managing AI APIs in API Management:

 API Management supports two client compatibility options for AI APIs. Choose the option suitable for your model deployment. The option determines how clients call the API and how the API Management instance routes requests to the AI service.

-* **Azure AI** - Manage model endpoints in Azure AI Foundry that are exposed through the [Azure AI Model Inference API](/azure/ai-studio/reference/reference-model-inference-api).
+* **Azure AI** - Manage model endpoints in Azure AI Foundry that are exposed through the [ Azure AI Model Inference API](/azure/ai-studio/reference/reference-model-inference-api).

 Clients call the deployment at a `/models` endpoint such as `/my-model/models/chat/completions`. Deployment name is passed in the request body. Use this option if you want flexibility to switch between models exposed through the Azure AI Model Inference API and those deployed in Azure OpenAI in Foundry Models.

@@ -37,7 +37,8 @@ API Management supports two client compatibility options for AI APIs. Choose the
 ## Prerequisites

 - An existing API Management instance. [Create one if you haven't already](get-started-create-service-instance.md).
-- An Azure AI service in your subscription with one or more models deployed. Examples include Azure OpenAI or other models deployed in Azure AI Foundry.
+
+- An Azure AI service in your subscription with one or more models deployed. Examples include models deployed in Azure AI Foundry or Azure OpenAI.

 ## Import AI Foundry API using the portal
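The article this hunk edits contrasts two client call shapes: the Azure AI option, where the deployment name travels in the request body to a `/models` route, and the Azure OpenAI option, where the deployment name is part of the URL path. A minimal sketch of the two shapes (the gateway host, API path, and model names are hypothetical examples):

```python
def azure_ai_request(gateway: str, api_path: str, model: str, messages: list):
    """Azure AI option: call a /models endpoint; the deployment (model)
    name is passed in the request body."""
    url = f"{gateway}/{api_path}/models/chat/completions"
    body = {"model": model, "messages": messages}
    return url, body

def azure_openai_request(gateway: str, api_path: str, deployment: str, messages: list):
    """Azure OpenAI option: the deployment name is part of the URL path."""
    url = f"{gateway}/{api_path}/deployments/{deployment}/chat/completions"
    body = {"messages": messages}
    return url, body

url, body = azure_ai_request("https://apim.example.net", "my-model", "gpt-4o",
                             [{"role": "user", "content": "hi"}])
print(url)  # https://apim.example.net/my-model/models/chat/completions
```

Switching between models exposed through the Azure AI Model Inference API then only requires changing the `model` value in the body, not the route.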

articles/api-management/azure-openai-api-from-specification.md

Lines changed: 13 additions & 13 deletions
@@ -1,6 +1,6 @@
 ---
 title: Import an Azure OpenAI API as REST API - Azure API Management
-description: How to import an Azure OpenAI API as a REST API from the Azure OpenAI Service or from an OpenAPI specification.
+description: How to import an Azure OpenAI API as a REST API from the Azure OpenAI in Foundry Models or from an OpenAPI specification.
 ms.service: azure-api-management
 author: dlepow
 ms.author: danlep
@@ -15,12 +15,12 @@ ms.custom: template-how-to, build-2024

 [!INCLUDE [api-management-availability-all-tiers](../../includes/api-management-availability-all-tiers.md)]

-You can import AI model endpoints deployed in [Azure OpenAI Service](/azure/ai-services/openai/overview) to your API Management instance as a REST API. Use AI gateway policies and other capabilities in API Management to simplify integration, improve observability, and enhance control over the model endpoints.
+You can import AI model endpoints deployed in [Azure OpenAI in Foundry Models](/azure/ai-services/openai/overview) to your API Management instance as a REST API. Use AI gateway policies and other capabilities in API Management to simplify integration, improve observability, and enhance control over the model endpoints.


 This article shows two options to import an Azure OpenAI API into an Azure API Management instance as a REST API:

-- [Import an Azure OpenAI API directly from Azure OpenAI Service](#option-1-import-api-from-azure-openai-service) (recommended)
+- [Import an Azure OpenAI API directly from Azure OpenAI in Foundry Models](#option-1-import-api-from-azure-openai) (recommended)
 - [Download and add the OpenAPI specification](#option-2-add-an-openapi-specification-to-api-management) for Azure OpenAI and add it to API Management as an OpenAPI API.

 Learn more about managing AI APIs in API Management:
@@ -30,36 +30,36 @@ Learn more about managing AI APIs in API Management:
 ## Prerequisites

 - An existing API Management instance. [Create one if you haven't already](get-started-create-service-instance.md).
-- An Azure OpenAI resource with a model deployed. For more information about model deployment in Azure OpenAI service, see the [resource deployment guide](/azure/ai-services/openai/how-to/create-resource).
+- An Azure OpenAI resource with a model deployed. For more information about model deployment in Azure OpenAI, see the [resource deployment guide](/azure/ai-services/openai/how-to/create-resource).

 Make a note of the ID (name) of the deployment. You'll need it when you test the imported API in API Management.

 > [!NOTE]
-> API Management policies such as [azure-openai-token-limit](azure-openai-token-limit-policy.md) and [azure-openai-emit-token-metric](azure-openai-emit-token-metric-policy.md) are supported for certain API endpoints exposed through specific Azure OpenAI Service models. For more information, see [Supported Azure OpenAI Service models](azure-openai-token-limit-policy.md#supported-azure-openai-service-models).
+> API Management policies such as [azure-openai-token-limit](azure-openai-token-limit-policy.md) and [azure-openai-emit-token-metric](azure-openai-emit-token-metric-policy.md) are supported for certain API endpoints exposed through specific Azure OpenAI models. For more information, see [Supported Azure OpenAI in Foundry Models](azure-openai-token-limit-policy.md).

 - Permissions to grant access to the Azure OpenAI resource from the API Management instance.

-## Option 1. Import API from Azure OpenAI Service
+## Option 1. Import API from Azure OpenAI

-You can import an Azure OpenAI API directly from Azure OpenAI Service to API Management.
+You can import an Azure OpenAI API directly from Azure OpenAI to API Management.

 [!INCLUDE [api-management-workspace-availability](../../includes/api-management-workspace-availability.md)]

 When you import the API, API Management automatically configures:

 * Operations for each of the Azure OpenAI [REST API endpoints](/azure/ai-services/openai/reference)
 * A system-assigned identity with the necessary permissions to access the Azure OpenAI resource.
-* A [backend](backends.md) resource and a [set-backend-service](set-backend-service-policy.md) policy that direct API requests to the Azure OpenAI Service endpoint.
+* A [backend](backends.md) resource and a [set-backend-service](set-backend-service-policy.md) policy that direct API requests to the Azure OpenAI endpoint.
 * Authentication to the Azure OpenAI backend using the instance's system-assigned managed identity.
 * (optionally) Policies to help you monitor and manage the Azure OpenAI API.

 To import an Azure OpenAI API to API Management:

 1. In the [Azure portal](https://portal.azure.com), navigate to your API Management instance.
 1. In the left menu, under **APIs**, select **APIs** > **+ Add API**.
-1. Under **Create from Azure resource**, select **Azure OpenAI Service**.
+1. Under **Create from Azure resource**, select **Azure OpenAI**.

-   :::image type="content" source="media/azure-openai-api-from-specification/azure-openai-api.png" alt-text="Screenshot of creating an API from Azure OpenAI Service in the portal." :::
+   :::image type="content" source="media/azure-openai-api-from-specification/azure-openai-api.png" alt-text="Screenshot of creating an API from Azure OpenAI in the portal." :::

 1. On the **Basics** tab:
     1. Select the Azure OpenAI resource that you want to import.
@@ -89,7 +89,7 @@ Alternatively, manually download the OpenAPI specification for the Azure OpenAI
 Download the OpenAPI specification for the Azure OpenAI REST API, such as the [2024-10-21 GA version](https://github.com/Azure/azure-rest-api-specs/blob/main/specification/cognitiveservices/data-plane/AzureOpenAI/inference/stable/2024-10-21/inference.json).

 1. In a text editor, open the specification file that you downloaded.
-1. In the `servers` element in the specification, substitute the name of your Azure OpenAI Service endpoint in the placeholder values of `url` and `default` endpoint in the specification. For example, if your Azure OpenAI Service endpoint is `contoso.openai.azure.com`, update the `servers` element with the following values:
+1. In the `servers` element in the specification, substitute the name of your Azure OpenAI endpoint in the placeholder values of `url` and `default` endpoint in the specification. For example, if your Azure OpenAI endpoint is `contoso.openai.azure.com`, update the `servers` element with the following values:

    * **url**: `https://contoso.openai.azure.com/openai`
    * **default** endpoint: `contoso.openai.azure.com`
@@ -135,9 +135,9 @@ To ensure that your Azure OpenAI API is working as expected, test it in the API
 1. Select an operation that's compatible with the model you deployed in the Azure OpenAI resource.
    The page displays fields for parameters and headers.
 1. In **Template parameters**, enter the following values:
-   * `deployment-id` - the ID of a deployment in the Azure OpenAI service
+   * `deployment-id` - the ID of a deployment in the Azure OpenAI
    * `api-version` - a valid Azure OpenAI API version, such as the API version you selected when you imported the API.
-   :::image type="content" source="media/azure-openai-api-from-specification/test-azure-openai-api.png" alt-text="Screenshot of testing an Azure OpenAI Service API in the portal." lightbox="media/azure-openai-api-from-specification/test-azure-openai-api.png" :::
+   :::image type="content" source="media/azure-openai-api-from-specification/test-azure-openai-api.png" alt-text="Screenshot of testing an Azure OpenAI API in the portal." lightbox="media/azure-openai-api-from-specification/test-azure-openai-api.png" :::
 1. Enter other parameters and headers as needed. Depending on the operation, you might need to configure or update a **Request body**.
 > [!NOTE]
 > In the test console, API Management automatically populates an **Ocp-Apim-Subscription-Key** header, and configures the subscription key of the built-in [all-access subscription](api-management-subscriptions.md#all-access-subscription). This key enables access to every API in the API Management instance. Optionally display the **Ocp-Apim-Subscription-Key** header by selecting the "eye" icon next to the **HTTP Request**.
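One hunk in this file describes editing the downloaded spec's `servers` element to substitute your endpoint into the `url` and `default` values. That edit can be scripted; a sketch that assumes the spec's `servers` block has the `{endpoint}`-placeholder shape shown here (a stand-in, not the full real spec):

```python
import copy

def set_server_endpoint(spec: dict, endpoint_host: str) -> dict:
    """Substitute an Azure OpenAI endpoint host into the spec's
    `servers` element: fix the url and the endpoint variable default."""
    out = copy.deepcopy(spec)
    for server in out.get("servers", []):
        server["url"] = f"https://{endpoint_host}/openai"
        variables = server.get("variables", {})
        if "endpoint" in variables:
            variables["endpoint"]["default"] = endpoint_host
    return out

# Minimal stand-in for the real inference spec's servers block:
spec = {"servers": [{"url": "https://{endpoint}/openai",
                     "variables": {"endpoint":
                                   {"default": "your-resource-name.openai.azure.com"}}}]}
updated = set_server_endpoint(spec, "contoso.openai.azure.com")
print(updated["servers"][0]["url"])  # https://contoso.openai.azure.com/openai
```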

articles/api-management/azure-openai-enable-semantic-caching.md

Lines changed: 3 additions & 4 deletions
@@ -25,6 +25,7 @@ Enable semantic caching of responses to Azure OpenAI API requests to reduce band

 * One or more Azure OpenAI in Foundry Models APIs must be added to your API Management instance. For more information, see [Add an Azure OpenAI API to Azure API Management](azure-openai-api-from-specification.md).
 * The Azure OpenAI instance must have deployments for the following:
+
     * Chat Completion API - Deployment used for API consumer calls
     * Embeddings API - Deployment used for semantic caching
 * The API Management instance must be configured to use managed identity authentication to the Azure OpenAI APIs. For more information, see [Authenticate and authorize access to Azure OpenAI APIs using Azure API Management ](api-management-authenticate-authorize-azure-openai.md#authenticate-with-managed-identity).
@@ -58,10 +59,8 @@ Configure a [backend](backends.md) resource for the embeddings API deployment wi

 * **Name** - A name of your choice, such as `embeddings-backend`. You use this name to reference the backend in policies.
 * **Type** - Select **Custom URL**.
-* **Runtime URL** - The URL of the embeddings API deployment in the Azure OpenAI instance, similar to:
-    ```
-    https://my-aoai.openai.azure.com/openai/deployments/embeddings-deployment/embeddings
-    ```
+* **Runtime URL** - The URL of the embeddings API deployment in Azure OpenAI, similar to: `https://my-aoai.openai.azure.com/openai/deployments/embeddings-deployment/embeddings`
+
 * **Authorization credentials** - Go to **Managed Identity** tab.
 * **Client identity** - Select *System assigned identity* or type in a User assigned managed identity client ID.
 * **Resource ID** - Enter `https://cognitiveservices.azure.com/` for Azure OpenAI.
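The embeddings deployment configured above powers the semantic cache lookup: incoming prompts are embedded, and a cached response is reused when its prompt's embedding is close enough to the new one. A toy illustration of that decision (cosine similarity against a threshold; the vectors, cache structure, and threshold are made up for the example, not API Management's actual implementation):

```python
import math

def cosine(a, b):
    # Cosine similarity of two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def lookup(cache, query_embedding, threshold=0.9):
    """Return the cached completion whose prompt embedding is most
    similar to the query embedding, if any clears the threshold;
    otherwise None (a cache miss)."""
    best, best_score = None, threshold
    for embedding, completion in cache:
        score = cosine(embedding, query_embedding)
        if score >= best_score:
            best, best_score = completion, score
    return best

cache = [([1.0, 0.0], "cached answer")]
print(lookup(cache, [0.99, 0.05]))  # nearly parallel vector -> cache hit
print(lookup(cache, [0.0, 1.0]))    # orthogonal vector -> None (miss)
```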

articles/api-management/azure-openai-token-limit-policy.md

Lines changed: 3 additions & 3 deletions
@@ -18,7 +18,7 @@ ms.author: danlep

 [!INCLUDE [api-management-availability-premium-dev-standard-basic-premiumv2-standardv2-basicv2](../../includes/api-management-availability-premium-dev-standard-basic-premiumv2-standardv2-basicv2.md)]

-The `azure-openai-token-limit` policy prevents Azure OpenAI Service API usage spikes on a per key basis by limiting consumption of language model tokens to a specified rate (number per minute), a quota over a specified period, or both. When a specified token rate limit is exceeded, the caller receives a `429 Too Many Requests` response status code. When a specified quota is exceeded, the caller receives a `403 Forbidden` response status code.
+The `azure-openai-token-limit` policy prevents Azure OpenAI in Foundry Models API usage spikes on a per key basis by limiting consumption of language model tokens to a specified rate (number per minute), a quota over a specified period, or both. When a specified token rate limit is exceeded, the caller receives a `429 Too Many Requests` response status code. When a specified quota is exceeded, the caller receives a `403 Forbidden` response status code.

 By relying on token usage metrics returned from the OpenAI endpoint, the policy can accurately monitor and enforce limits in real time. The policy also enables precalculation of prompt tokens by API Management, minimizing unnecessary requests to the OpenAI backend if the limit is already exceeded.

@@ -70,8 +70,8 @@ By relying on token usage metrics returned from the OpenAI endpoint, the policy
 ### Usage notes

 * This policy can be used multiple times per policy definition.
-* This policy can optionally be configured when adding an API from the Azure OpenAI Service using the portal.
-* Where available when `estimate-prompt-tokens` is set to `false`, values in the usage section of the response from the Azure OpenAI Service API are used to determine token usage.
+* This policy can optionally be configured when adding an API from the Azure OpenAI using the portal.
+* Where available when `estimate-prompt-tokens` is set to `false`, values in the usage section of the response from the Azure OpenAI API are used to determine token usage.
 * Certain Azure OpenAI endpoints support streaming of responses. When `stream` is set to `true` in the API request to enable streaming, prompt tokens are always estimated, regardless of the value of the `estimate-prompt-tokens` attribute. Completion tokens are also estimated when responses are streamed.
 * For models that accept image input, image tokens are generally counted by the backend language model and included in limit and quota calculations. However, when streaming is used or `estimate-prompt-tokens` is set to `true`, the policy currently over-counts each image as a maximum count of 1200 tokens.
 * [!INCLUDE [api-management-rate-limit-key-scope](../../includes/api-management-rate-limit-key-scope.md)]
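The policy text above distinguishes two failure modes: exceeding the tokens-per-minute rate yields `429 Too Many Requests`, while exhausting the longer-period quota yields `403 Forbidden`. A toy model of that per-key accounting (this is an illustrative sketch of the semantics only, with no time-window reset, not the policy's real implementation):

```python
class TokenLimiter:
    """Per-key token accounting: a tokens-per-minute rate and a
    longer-period quota. Rate overrun -> 429; quota overrun -> 403."""
    def __init__(self, tokens_per_minute: int, quota: int):
        self.tpm = tokens_per_minute
        self.quota = quota
        self.minute_used = {}   # would reset each minute in a real gateway
        self.quota_used = {}    # would reset each quota period

    def check(self, key: str, tokens: int) -> int:
        if self.quota_used.get(key, 0) + tokens > self.quota:
            return 403  # quota exceeded for the period
        if self.minute_used.get(key, 0) + tokens > self.tpm:
            return 429  # rate exceeded; caller should retry later
        self.minute_used[key] = self.minute_used.get(key, 0) + tokens
        self.quota_used[key] = self.quota_used.get(key, 0) + tokens
        return 200

limiter = TokenLimiter(tokens_per_minute=100, quota=500)
print(limiter.check("caller-1", 80))  # 200
print(limiter.check("caller-1", 80))  # 429 (would exceed 100 tokens/minute)
```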

articles/azure-functions/functions-bindings-openai.md

Lines changed: 1 addition & 1 deletion
@@ -105,7 +105,7 @@ The OpenAI bindings have an `AIConnectionName` property that you can use to spec

 | Setting name | Description |
 |---|---|
-| `<CONNECTION_NAME_PREFIX>__endpoint` | Sets the URI endpoint of Azure OpenAI. This setting is always required. |
+| `<CONNECTION_NAME_PREFIX>__endpoint` | Sets the URI endpoint of the Azure OpenAI in Foundry Models. This setting is always required. |
 | `<CONNECTION_NAME_PREFIX>__clientId` | Sets the specific user-assigned identity to use when obtaining an access token. Requires that `<CONNECTION_NAME_PREFIX>__credential` is set to `managedidentity`. The property accepts a client ID corresponding to a user-assigned identity assigned to the application. It's invalid to specify both a Resource ID and a client ID. If not specified, the system-assigned identity is used. This property is used differently in [local development scenarios](functions-reference.md#local-development-with-identity-based-connections), when `credential` shouldn't be set. |
 | `<CONNECTION_NAME_PREFIX>__credential` | Defines how an access token is obtained for the connection. Use `managedidentity` for managed identity authentication. This value is only valid when a managed identity is available in the hosting environment. |
 | `<CONNECTION_NAME_PREFIX>__managedIdentityResourceId` | When `credential` is set to `managedidentity`, this property can be set to specify the resource Identifier to be used when obtaining a token. The property accepts a resource identifier corresponding to the resource ID of the user-defined managed identity. It's invalid to specify both a resource ID and a client ID. If neither are specified, the system-assigned identity is used. This property is used differently in [local development scenarios](functions-reference.md#local-development-with-identity-based-connections), when `credential` shouldn't be set. |
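The settings table above follows the `<CONNECTION_NAME_PREFIX>__<setting>` naming convention, with `endpoint` required and `clientId`/`managedIdentityResourceId` mutually exclusive. A sketch of resolving such a connection from app settings (the prefix `MY_AOAI` and the endpoint value are hypothetical; the Functions host does this resolution internally):

```python
def resolve_openai_connection(prefix: str, env: dict) -> dict:
    """Collect `<prefix>__*` settings per the table's rules:
    endpoint is always required; clientId and managedIdentityResourceId
    must not both be specified."""
    conn = {}
    for key in ("endpoint", "clientId", "credential", "managedIdentityResourceId"):
        value = env.get(f"{prefix}__{key}")
        if value is not None:
            conn[key] = value
    if "endpoint" not in conn:
        raise ValueError(f"{prefix}__endpoint is required")
    if "clientId" in conn and "managedIdentityResourceId" in conn:
        raise ValueError("specify a client ID or a resource ID, not both")
    return conn

env = {"MY_AOAI__endpoint": "https://contoso.openai.azure.com/",
       "MY_AOAI__credential": "managedidentity"}
print(resolve_openai_connection("MY_AOAI", env)["credential"])  # managedidentity
```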

includes/api-management-azure-openai-models.md

Lines changed: 3 additions & 3 deletions
@@ -8,9 +8,9 @@ ms.date: 07/09/2024
 ms.author: danlep
 ---

-## Supported Azure OpenAI Service models
+## Supported Azure OpenAI in Foundry Models

-The policy is used with APIs [added to API Management from the Azure OpenAI Service](../articles/api-management/azure-openai-api-from-specification.md) of the following types:
+The policy is used with APIs [added to API Management from the Azure OpenAI in Foundry Models](../articles/api-management/azure-openai-api-from-specification.md) of the following types:

 | API type | Supported models |
 |-------|-------------|
@@ -22,5 +22,5 @@ The policy is used with APIs [added to API Management from the Azure OpenAI Serv
 > [!NOTE]
 > Traditional completion APIs are only available with legacy model versions and support is limited.

-For current information about the models and their capabilities, see [Azure OpenAI Service models](/azure/ai-services/openai/concepts/models).
+For current information about the models and their capabilities, see [Azure OpenAI in Foundry Models](/azure/ai-services/openai/concepts/models).
