# Content filtering for Azure AI Foundry Models

> [!IMPORTANT]
> The content filtering system isn't applied to prompts and completions processed by audio models such as Whisper in Azure OpenAI in Azure AI Foundry Models. Learn more about the [Audio models in Azure OpenAI](../../../ai-services/openai/concepts/models.md?tabs=standard-audio#standard-deployment-regional-models-by-endpoint).
articles/ai-foundry/model-inference/faq.yml
@@ -12,18 +12,16 @@ metadata:
  author: santiagxf
title: Foundry Models frequently asked questions
summary: |
  If you can't find answers to your questions in this document, and still need help, check the [Azure AI Foundry services (formerly known as Azure AI Services) support options guide](../../ai-services/cognitive-services-support-options.md?context=/azure/ai-services/openai/context/context).
sections:
  - name: General
    questions:
      - question: |
          What's the difference between Azure OpenAI and Foundry Models?
        answer: |
          Azure OpenAI gives customers access to advanced language models from OpenAI. Foundry Models extends that capability by giving customers access to all the flagship models in Azure AI Foundry under the same service, endpoint, and credentials. It includes Azure OpenAI, Cohere, Mistral AI, Meta Llama, AI21 Labs, and more. Customers can seamlessly switch between models without changing their code.

          Azure OpenAI is an Azure Direct model family in Foundry Models.
      - question: |
          What's the difference between Azure AI services and Azure AI Foundry?
        answer: |
@@ -90,13 +88,13 @@ sections:
      - question: |
          Where can I see the bill details?
        answer: |
          Billing and costs are displayed in [Microsoft Cost Management + Billing](/azure/cost-management-billing/understand/download-azure-daily-usage). You can see the usage details in the [Azure portal](https://portal.azure.com).

          Billing isn't shown in the Azure AI Foundry portal.
      - question: |
          How can I place a spending limit on my bill?
        answer: |
          You can set up a spending limit in the [Azure portal](https://portal.azure.com) under **Microsoft Cost Management + Billing**. This limit prevents you from spending more than the amount you set. Once the spending limit is reached, the subscription is disabled and you won't be able to use the endpoint until the next billing cycle.
description: Learn how to configure access to Azure Ecosystem Models.
author: santiagxf
ms.author: fasantia
ms.service: azure-ai-model-inference
ms.topic: how-to
ms.date: 5/11/2025
---

# Configure access to Azure Ecosystem Models

Certain models in AI Foundry Models are offered directly by the model provider through Azure Marketplace. This article explains the requirements for using Azure Marketplace if you plan to use such models in your workloads. Azure Direct Models, like DeepSeek or Phi, and Azure OpenAI models, like GPT, don't have this requirement.

> [!TIP]
> All models offered in AI Foundry Models are hosted in Microsoft's Azure environment, and the service does NOT interact with any external services or model providers.

:::image type="content" source="../media/configure-marketplace/azure-marketplace-3p.png" alt-text="A diagram with the overall architecture of Azure Marketplace integration with AI Foundry Models." lightbox="../media/configure-marketplace/azure-marketplace-3p.png":::
Azure Ecosystem Models with pay-as-you-go billing are available only to users whose Azure subscription belongs to a billing account in a country/region where the model offer is available. Availability varies per model provider and model SKU. Read [Region availability for models](../../how-to/deploy-models-serverless-availability.md).

## Troubleshooting

Use the following troubleshooting guide to find and solve errors when deploying third-party models in AI Foundry Models:

| Error | Description |
|-------|-------------|
| This offer is not made available by the provider in the country where your account and Azure Subscription are registered. | The model provider didn't make the specific model SKU available in the country where the subscription is registered. Each model provider can decide to make an offer available only in specific countries, and availability can vary by model SKU. You need to deploy the model to a subscription whose billing account is in a supported country. See the list of countries at [Region availability for models](../../how-to/deploy-models-serverless-availability.md). |
| Marketplace Subscription purchase eligibility check failed. | The model provider didn't make the specific model SKU available in the country where the subscription is registered, or it isn't available in the region where you deployed the Azure AI Services resource. See [Region availability for models](../../how-to/deploy-models-serverless-availability.md). |
| Unable to create a model deployment for model "model-name". If the error persists, please contact HIT (Human Intelligence Team) via this link: https://go.microsoft.com/fwlink/?linkid=2101400&clcid=0x409 and request to allowlist the Azure subscription. | Azure Marketplace rejected the request to create a model subscription. This can happen for multiple reasons, including subscribing to the model offering too often or from multiple subscriptions at the same time. Contact support using the provided link and include your subscription ID. |
| This offer is not available for purchasing by subscriptions belonging to Microsoft Azure Cloud Solution Providers. | Cloud Solution Provider (CSP) subscriptions can't purchase third-party model offerings. Consider models offered as a first-party consumption service instead. |
articles/ai-foundry/model-inference/how-to/github/create-model-deployments.md
@@ -12,26 +12,28 @@ ms.author: fasantia
recommendations: false
---

# Add and configure models from Azure AI Foundry Models

You can decide and configure which models are available for inference in the model inference endpoint of your Azure AI services resource. When a given model is configured, you can then generate predictions from it by indicating its model name or deployment name in your requests. No further changes are required in your code to use it.

In this article, you learn how to add a new model from Azure AI Foundry Models.

## Prerequisites

To complete this article, you need:

* An Azure subscription. If you're using [GitHub Models](https://docs.github.com/en/github-models/), you can upgrade your experience and create an Azure subscription in the process. Read [Upgrade from GitHub Models to Azure AI Foundry Models](../quickstart-github-models.md) if that's your case.
* An Azure AI services resource. For more information, see [Create an Azure AI Foundry resource](../quickstart-create-resources.md).
Deployed models in Azure AI Foundry Models can be consumed using the [Azure AI model's inference endpoint](../../concepts/endpoints.md) for the resource.

To use it:

@@ -52,6 +54,7 @@ When creating model deployments, you can configure additional settings including

> [!NOTE]
> Configurations may vary depending on the model you're deploying.
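
Once a deployment is in place, a request only needs to reference it by name. The following is a minimal sketch of that idea using the `azure-ai-inference` Python package; the endpoint URL, the `AZURE_AI_API_KEY` environment variable, and the `mistral-large` deployment name are placeholders for illustration, not values taken from this article.

```python
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

# Connect to the resource's model inference endpoint (placeholder URL).
client = ChatCompletionsClient(
    endpoint="https://<resource-name>.services.ai.azure.com/models",
    credential=AzureKeyCredential(os.environ["AZURE_AI_API_KEY"]),
)

# Route the request to a specific deployment by passing its name as `model`.
response = client.complete(
    model="mistral-large",  # hypothetical deployment name
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Summarize what Azure AI Foundry Models is."),
    ],
)

print(response.choices[0].message.content)
```

Switching to a different deployed model only requires changing the `model` value; the rest of the request stays the same.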
## Related content

* [Develop applications using Azure AI Foundry Models](../../supported-languages.md)
articles/ai-foundry/model-inference/how-to/manage-costs.md
@@ -1,6 +1,6 @@
---
title: Plan to manage costs for Azure AI Foundry Models
description: Learn how to plan for and manage costs for Azure AI Foundry Models by using cost analysis in the Azure portal.
author: santiagxf
ms.author: fasantia
ms.custom: subject-cost-optimization
@@ -9,12 +9,12 @@ ms.topic: how-to
ms.date: 1/21/2025
---

# Plan to manage costs for Azure AI Foundry Models

This article describes how you can view, plan for, and manage costs for Azure AI Foundry Models.

Although this article is about planning for and managing costs for Azure AI Foundry Models, you're billed for all Azure services and resources used in your Azure subscription.

## Prerequisites
@@ -24,17 +24,18 @@ Although this article is about planning for and managing costs for Foundry Models

## Understand the Foundry Models billing model

Language models understand and process inputs by breaking them down into tokens. For reference, each token is roughly four characters for typical English text. Models that can process images or audio break them down into tokens too for billing purposes. The number of tokens per image or audio input depends on the model and the resolution of the input.

Costs per token vary depending on which model series you choose, but in all cases models deployed in Azure AI Foundry are charged per 1,000 tokens. Token costs apply to both input and output. For example, suppose you have a 1,000-token JavaScript code sample that you ask a model to convert to Python. You would be charged approximately 1,000 tokens for the initial input request sent, and 1,000 more tokens for the output received in response, for a total of 2,000 tokens.
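
As a rough illustration of that arithmetic, the following sketch estimates the charge for a single request from its token counts. The per-1,000-token prices are made-up placeholders, not actual rates; substitute the prices published for the model you deploy.

```python
# Rough cost estimate for a single request, using placeholder prices.
# Real prices vary by model and region; check the pricing page for your model.
PRICE_PER_1K_INPUT_TOKENS = 0.001   # placeholder, in USD
PRICE_PER_1K_OUTPUT_TOKENS = 0.002  # placeholder, in USD

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the approximate charge for one request, in USD."""
    input_cost = (input_tokens / 1000) * PRICE_PER_1K_INPUT_TOKENS
    output_cost = (output_tokens / 1000) * PRICE_PER_1K_OUTPUT_TOKENS
    return input_cost + output_cost

# The article's example: a 1,000-token prompt that produces a 1,000-token
# response is billed for roughly 2,000 tokens in total.
print(estimate_cost(input_tokens=1000, output_tokens=1000))
```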
### Cost breakdown

To understand the breakdown of what makes up the cost, it can be helpful to use the **Cost Analysis** tool in the Azure portal. Follow these steps to understand the cost of inference:

1. Go to the [Azure AI Foundry portal](https://ai.azure.com).

2. In the upper right corner of the screen, select the name of your Azure AI Foundry resource (formerly known as Azure AI Services) or, if you're working on an AI project, the name of the project.

3. Select the name of the project. The Azure portal opens in a new window.

@@ -45,32 +46,32 @@ To understand the breakdown of what makes up the cost, it can be helpful to use

5. By default, cost analysis is scoped to the selected resource group.

> [!IMPORTANT]
> It's important to scope *Cost Analysis* to the resource group where the Azure AI Foundry resource is deployed. Cost meters associated with [Azure Ecosystem Models](#azure-ecosystem-models) are displayed under the resource group instead of under the Azure AI Foundry resource.

6. Modify **Group by** to **Meter**. You can now see that for this particular resource group, the costs come from different model series.

:::image type="content" source="../media/manage-cost/cost-by-meter.png" alt-text="Screenshot of how to see the cost by each meter in the resource group." lightbox="../media/manage-cost/cost-by-meter.png":::

The following sections explain the entries in detail.
### Azure Direct Models

[Azure Direct Models](../concepts/models.md#azure-direct-models) (including Azure OpenAI) are charged directly, and they show up as billing meters under each Azure AI Foundry resource (formerly known as Azure AI Services). This billing happens directly through Microsoft. When you inspect your bill, you notice billing meters accounting for inputs and outputs for each consumed model.

:::image type="content" source="../media/manage-cost/cost-by-meter-1p.png" alt-text="Screenshot of the cost analysis dashboard scoped to the resource group where the Azure AI Foundry resource is deployed, highlighting the meters for Azure OpenAI and Phi models. Cost is grouped by meter." lightbox="../media/manage-cost/cost-by-meter-1p.png":::

### Azure Ecosystem Models

Models provided by third-party providers, like Cohere, are billed through Azure Marketplace. Unlike Microsoft billing meters, those entries are associated with the resource group where your Azure AI Foundry resource (formerly known as Azure AI Services) is deployed, instead of with the Azure AI Foundry resource itself. Because model providers charge you directly, you see entries under the category **Marketplace** and the **Service Name** *SaaS* accounting for inputs and outputs for each consumed model.

:::image type="content" source="../media/manage-cost/cost-by-meter-saas.png" alt-text="Screenshot of the cost analysis dashboard scoped to the resource group where the Azure AI Foundry resource is deployed, highlighting the meters for models billed through Azure Marketplace. Cost is grouped by meter." lightbox="../media/manage-cost/cost-by-meter-saas.png":::

> [!IMPORTANT]
> This distinction between [Azure Direct Models](../concepts/models.md#azure-direct-models) (including Azure OpenAI) and [Azure Ecosystem Models](../concepts/models.md#azure-ecosystem-models) only affects how the model is made available to you and how you are charged. In all cases, models are hosted within the Azure cloud, and there is no interaction with external services or providers.

### Using Azure Prepayment

You can pay for Azure Direct Models charges with your Azure Prepayment credit. However, you can't use Azure Prepayment credit to pay for charges for other provider models, given they're billed through Azure Marketplace.
0 commit comments