
Commit ceca506

Author: gitName
Commit message: line edits
Parent: 7b1195e

2 files changed: +3 −3 lines changed

articles/api-management/openai-compatible-google-gemini-api.md

Lines changed: 1 addition & 1 deletion
@@ -48,7 +48,7 @@ Learn more about managing AI APIs in API Management:
 :::image type="content" source="media/openai-compatible-google-gemini-api/gemini-import.png" alt-text="Screenshot of importing a Gemini LLM API in the portal.":::

 1. On the remaining tabs, optionally configure policies to manage token consumption, semantic caching, and AI content safety. For details, see [Import a language model API](openai-compatible-llm-api.md).
-1. Select **Review**
+1. Select **Review**.
 1. After settings are validated, select **Create**.

 API Management creates the API and configures the following:
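For context (not part of this commit): once the Gemini API is imported as an OpenAI-compatible API, a client can call it through the API Management gateway with a standard OpenAI SDK. The sketch below assumes a gateway hostname, a `gemini` API path suffix, a model name, and the `Ocp-Apim-Subscription-Key` header; none of these values come from the article.

```python
# Minimal sketch: call an imported OpenAI-compatible Gemini API through the
# API Management gateway. The gateway host, "gemini" API suffix, model name,
# and subscription key header below are illustrative assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="https://contoso-apim.azure-api.net/gemini",  # assumed gateway URL + API suffix
    api_key="unused",  # the SDK requires a value; auth happens via the header below
    default_headers={"Ocp-Apim-Subscription-Key": "<your-apim-subscription-key>"},
)

response = client.chat.completions.create(
    model="gemini-2.0-flash",  # assumed model name exposed by the backend
    messages=[{"role": "user", "content": "Summarize what an AI gateway does."}],
)
print(response.choices[0].message.content)
```

The same pattern should work for any OpenAI-compatible backend imported this way, since the client only sees the gateway URL and the familiar chat completions route.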

articles/api-management/openai-compatible-llm-api.md

Lines changed: 2 additions & 2 deletions
@@ -15,7 +15,7 @@ ms.custom: template-how-to

 [!INCLUDE [api-management-availability-all-tiers](../../includes/api-management-availability-all-tiers.md)]

-You can import OpenAI-compatible language model endpoints to your API Management instance as APIs. You can also import language models that aren't compatible with OpenAI as passthrough APIs, which forward requests to the backend. For example, you might want to manage an LLM that you self-host, or that's hosted on an inference provider other than Azure AI services. Use AI gateway policies and other capabilities in API Management to simplify integration, improve observability, and enhance control over the model endpoints.
+You can import OpenAI-compatible language model endpoints to your API Management instance as APIs. You can also import language models that aren't compatible with OpenAI as passthrough APIs, which forward requests directly to the backend endpoints. For example, you might want to manage an LLM that you self-host, or that's hosted on an inference provider other than Azure AI services. Use AI gateway policies and other capabilities in API Management to simplify integration, improve observability, and enhance control over the model endpoints.

 Learn more about managing AI APIs in API Management:
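An aside, not part of the commit: to illustrate the passthrough case described in the paragraph above, the sketch below sends a backend-native (non-OpenAI) payload through the gateway, which forwards it unchanged to the backend endpoint. The gateway host, API path, header name, and request schema are all assumptions for illustration.

```python
# Sketch of calling a non-OpenAI-compatible model through a passthrough API.
# API Management forwards the request body to the backend as-is, so the payload
# uses the backend provider's native schema. URLs, paths, header names, and the
# payload fields are illustrative assumptions.
import requests

APIM_GATEWAY = "https://contoso-apim.azure-api.net"   # assumed gateway host
PASSTHROUGH_PATH = "/my-selfhosted-llm/generate"      # assumed API suffix + backend route

headers = {
    "Ocp-Apim-Subscription-Key": "<your-apim-subscription-key>",
    "Content-Type": "application/json",
}

# Native payload expected by the self-hosted backend (assumed schema).
payload = {"prompt": "Explain semantic caching in one sentence.", "max_tokens": 128}

resp = requests.post(APIM_GATEWAY + PASSTHROUGH_PATH, json=payload, headers=headers, timeout=30)
resp.raise_for_status()
print(resp.json())
```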

@@ -39,7 +39,7 @@ API Management supports two types of language model APIs for this scenario. Choo
 - A self-hosted or non-Azure-provided language model deployment with an API endpoint.


-## Import language model API using the portalF
+## Import language model API using the portal

 When you import the LLM API in the portal, API Management automatically configures:
