Commit d8ff8ba

Merge pull request #2547 from santiagxf/santiagxf-patch-2
Update deployment-types.md
2 parents 82c8bad + dfb0dcb commit d8ff8ba

4 files changed
+6 −5 lines changed

articles/ai-foundry/model-inference/concepts/deployment-types.md

Lines changed: 3 additions & 2 deletions
@@ -31,12 +31,13 @@ To learn more about deployment options for Azure OpenAI models see [Azure OpenAI
 
 Models from third-party model providers with pay-as-you-go billing (collectively called Models-as-a-Service), makes models available in Azure AI model inference under **standard** deployments with a Global processing option (`Global-Standard`).
 
-Models-as-a-Service offers regional deployment options under [Serverless API endpoints](../../../ai-studio/how-to/deploy-models-serverless.md) in Azure AI Foundry. Prompts and outputs are processed within the geography specified during deployment. However, those deployments can't be accessed using the Azure AI model inference endpoint in Azure AI Services.
-
 ### Global-Standard
 
 Global deployments leverage Azure's global infrastructure to dynamically route traffic to the data center with best availability for each request. Global standard provides the highest default quota and eliminates the need to load balance across multiple resources. Data stored at rest remains in the designated Azure geography, while data may be processed for inferencing in any Azure location. Learn more about [data residency](https://azure.microsoft.com/explore/global-infrastructure/data-residency/).
 
+> [!NOTE]
+> Models-as-a-Service offers regional deployment options under [Serverless API endpoints](../../../ai-studio/how-to/deploy-models-serverless.md) in Azure AI Foundry. Prompts and outputs are processed within the geography specified during deployment. However, those deployments can't be accessed using the Azure AI model inference endpoint in Azure AI Services.
+
 ## Control deployment options
 
 Administrators can control which model deployment types are available to their users by using Azure Policies. Learn more about [How to control AI model deployment with custom policies](../../../ai-studio/how-to/custom-policy-model-deployment.md).

articles/ai-foundry/model-inference/faq.yml

Lines changed: 1 addition & 1 deletion
@@ -5,7 +5,7 @@ metadata:
 description: Get answers to the most popular questions about Azure AI model inference
 #services: cognitive-services
 manager: nitinme
-ms.service: azure-ai-models
+ms.service: azure-ai-model-inference
 ms.topic: faq
 ms.date: 1/21/2025
 ms.author: fasantia

articles/ai-foundry/model-inference/index.yml

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@ summary: Azure AI model inference provides access to the most powerful models av
 metadata:
 title: Azure AI model inference documentation - Quickstarts, How-to's, API Reference - Azure AI Foundry | Microsoft Docs
 description: Learn how to use flagship models available in the Azure AI model catalog from the key model providers in the industry, including OpenAI, Microsoft, Meta, Mistral, Cohere, G42, and AI21 Labs.
-ms.service: azure-ai-models
+ms.service: azure-ai-model-inference
 ms.custom:
 ms.topic: landing-page
 author: mrbullwinkle

articles/search/index.yml

Lines changed: 1 addition & 1 deletion
@@ -5,7 +5,7 @@ summary: Information retrieval at scale for vector and text content in tradition
 metadata:
 title: Azure AI Search documentation
 description: Information retrieval at scale for vector and text content in traditional or generative search scenarios.
-ms.service: service
+ms.service: azure-ai-search
 ms.custom:
 - ignite-2023
 - ignite-2024

0 commit comments
