articles/api-management/openai-compatible-google-gemini-api.md
17 additions & 13 deletions
@@ -5,7 +5,7 @@ ms.service: azure-api-management
 author: dlepow
 ms.author: danlep
 ms.topic: how-to
-ms.date: 07/03/2025
+ms.date: 07/06/2025
 ms.collection: ce-skilling-ai-copilot
 ms.custom: template-how-to
 ---
@@ -18,16 +18,16 @@ This article shows you how to import an OpenAI-compatible Google Gemini API to a
 
 Learn more about managing AI APIs in API Management:
 
-* [Generative AI gateway capabilities in Azure API Management](genai-gateway-capabilities.md)
-* [Import an OpenAI-compatible language model API](openai-compatible-llm-api.md)
+* [AI gateway capabilities in Azure API Management](genai-gateway-capabilities.md)
+* [Import a language model API](openai-compatible-llm-api.md)
 
 ## Prerequisites
 
 - An existing API Management instance. [Create one if you haven't already](get-started-create-service-instance.md).
 - An API key for the Gemini API. If you don't have one, create it at [Google AI Studio](https://aistudio.google.com/apikey) and store it in a safe location.
 
 
-## Import an OpenAI-compatible Gemini model using the portal
+## Import an OpenAI-compatible Gemini API using the portal
 
 1. In the [Azure portal](https://portal.azure.com), navigate to your API Management instance.
 1. In the left menu, under **APIs**, select **APIs** > **+ Add API**.
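For reference, a quick way to confirm that the prerequisite API key works before importing it is to call Google's OpenAI-compatible endpoint directly. This is a minimal sketch: the base URL and the `gemini-2.0-flash` model name are assumptions taken from the Gemini OpenAI compatibility documentation, not from this diff, so verify both against the linked docs.

```python
# Sketch: verify the Gemini API key from Google AI Studio before importing
# the API into API Management. The base URL and model name are assumptions
# from the Gemini OpenAI compatibility docs.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://generativelanguage.googleapis.com/v1beta/openai/",  # verify in the Gemini docs
    api_key=os.environ["GEMINI_API_KEY"],  # key created in Google AI Studio
)

response = client.chat.completions.create(
    model="gemini-2.0-flash",  # illustrative model name
    messages=[{"role": "user", "content": "Reply with one short sentence."}],
)
print(response.choices[0].message.content)
```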
@@ -37,15 +37,17 @@ Learn more about managing AI APIs in API Management:
 1. In **URL**, enter the following base URL from the [Gemini OpenAI compatibility documentation](https://ai.google.dev/gemini-api/docs/openai):
-1. In **Path**, append a path that your API Management instance uses to route requests to the Gemini API endpoints.
-1. In **Type**, select **Create OpenAI API**.
-1. In **Access key**, enter the following:
-    1. **Header name**: *Authorization*.
-    1. **Header value (key)**: `Bearer` followed by your API key for the Gemini API.
-1. On the remaining tabs, optionally configure policies to manage token consumption, semantic caching, and AI content safety. For details, see [Import an OpenAI-compatible language model API](openai-compatible-llm-api.md).
-
+1. In **Path**, append a path that your API Management instance uses to route requests to the Gemini API endpoints.
+1. In **Type**, select **Create OpenAI API**.
+1. In **Access key**, enter the following:
+    1. **Header name**: *Authorization*.
+    1. **Header value (key)**: `Bearer` followed by your API key for the Gemini API.
+
 :::image type="content" source="media/openai-compatible-google-gemini-api/gemini-import.png" alt-text="Screenshot of importing a Gemini LLM API in the portal.":::
-1. Select **Create**.
+
+1. On the remaining tabs, optionally configure policies to manage token consumption, semantic caching, and AI content safety. For details, see [Import a language model API](openai-compatible-llm-api.md).
+1. Select **Review**
+1. After settings are validated, select **Create**.
 
 API Management creates the API and configures the following:
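To make the effect of the **Access key** step concrete: API Management attaches the `Authorization: Bearer <Gemini key>` header when it forwards requests, so clients typically call the gateway without handling the Gemini key themselves. The sketch below is illustrative only; the gateway hostname, the `gemini` path segment, the `/chat/completions` route, the model name, and the subscription key header are assumptions to adapt to your instance.

```python
# Sketch: call the imported API through the API Management gateway.
# API Management adds the Authorization: Bearer header configured in the
# Access key step, so the Gemini key never appears in client code.
import os

import requests

GATEWAY = "https://<your-apim-instance>.azure-api.net"  # placeholder gateway hostname
API_PATH = "gemini"  # placeholder for the path you appended during import

response = requests.post(
    f"{GATEWAY}/{API_PATH}/chat/completions",  # route shape depends on the operations created
    headers={
        "Ocp-Apim-Subscription-Key": os.environ["APIM_SUBSCRIPTION_KEY"],  # only if a subscription is required
        "Content-Type": "application/json",
    },
    json={
        "model": "gemini-2.0-flash",  # illustrative model name
        "messages": [{"role": "user", "content": "Hello from API Management."}],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```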
@@ -83,4 +85,6 @@ After importing the API, you can test the chat completions endpoint for the API.
 
 :::image type="content" source="media/openai-compatible-google-gemini-api/gemini-test.png" alt-text="Screenshot of testing a Gemini LLM API in the portal.":::
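Outside the portal test console, the same chat completions test can be run from code. A minimal sketch using the OpenAI Python SDK pointed at the gateway; the hostname, path, model name, and subscription key header are assumptions, not values from this article.

```python
# Sketch: test the chat completions endpoint with the OpenAI SDK pointed at
# the API Management gateway instead of the portal test console.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://<your-apim-instance>.azure-api.net/gemini",  # placeholder gateway URL + path
    api_key="not-used",  # the Gemini key is attached by API Management, not by the client
    default_headers={"Ocp-Apim-Subscription-Key": os.environ["APIM_SUBSCRIPTION_KEY"]},
)

completion = client.chat.completions.create(
    model="gemini-2.0-flash",  # illustrative model name
    messages=[{"role": "user", "content": "Return a one-line status message."}],
)
print(completion.choices[0].message.content)
```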
articles/api-management/openai-compatible-llm-api.md

-You can import OpenAI-compatible language model endpoints to your API Management instance as APIs. For example, you might want to manage an LLM that you self-host, or that's hosted on an inference provider other than Azure AI services. Use AI gateway policies and other capabilities in API Management to simplify integration, improve observability, and enhance control over the model endpoints.
+You can import OpenAI-compatible language model endpoints to your API Management instance as APIs. You can also import language models that aren't compatible with OpenAI as passthrough APIs, which forward requests to the backend. For example, you might want to manage an LLM that you self-host, or that's hosted on an inference provider other than Azure AI services. Use AI gateway policies and other capabilities in API Management to simplify integration, improve observability, and enhance control over the model endpoints.
 
 Learn more about managing AI APIs in API Management:
 
-* [Generative AI gateway capabilities in Azure API Management](genai-gateway-capabilities.md)
+* [AI gateway capabilities in Azure API Management](genai-gateway-capabilities.md)
 
 ## Language model API types
 
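To illustrate the passthrough case added above: the client sends the provider's native request format and API Management forwards it to the backend unchanged. The sketch assumes a Gemini-style `generateContent` request; the gateway hostname, path, and request/response shapes are assumptions to check against the provider's documentation.

```python
# Sketch: calling a passthrough language model API. API Management forwards
# the provider's native payload; the request/response shapes below follow the
# Gemini generateContent format and are assumptions for illustration.
import os

import requests

GATEWAY = "https://<your-apim-instance>.azure-api.net"  # placeholder gateway hostname
API_PATH = "gemini-native"  # placeholder path for a passthrough API

response = requests.post(
    f"{GATEWAY}/{API_PATH}/models/gemini-2.0-flash:generateContent",
    headers={"Ocp-Apim-Subscription-Key": os.environ["APIM_SUBSCRIPTION_KEY"]},
    json={"contents": [{"parts": [{"text": "Explain passthrough APIs in one sentence."}]}]},
    timeout=60,
)
response.raise_for_status()
print(response.json()["candidates"][0]["content"]["parts"][0]["text"])
```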
@@ -92,4 +92,6 @@ To ensure that your LLM API is working as expected, test it in the API Managemen
 
 When the test is successful, the backend responds with a successful HTTP response code and some data. Appended to the response is token usage data to help you monitor and manage your language model token consumption.
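A short sketch of reading that appended token usage data from a successful test response. The field names follow the OpenAI-compatible chat completions shape; the gateway URL, path, model name, and subscription key header are placeholders.

```python
# Sketch: inspect the token usage data appended to a successful response.
import os

import requests

response = requests.post(
    "https://<your-apim-instance>.azure-api.net/my-llm/chat/completions",  # placeholder URL
    headers={"Ocp-Apim-Subscription-Key": os.environ["APIM_SUBSCRIPTION_KEY"]},
    json={"model": "my-model", "messages": [{"role": "user", "content": "ping"}]},  # placeholder payload
    timeout=60,
)
response.raise_for_status()

usage = response.json().get("usage", {})
print("prompt tokens:    ", usage.get("prompt_tokens"))
print("completion tokens:", usage.get("completion_tokens"))
print("total tokens:     ", usage.get("total_tokens"))
```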