You signed in with another tab or window. Reload to refresh your session.You signed out in another tab or window. Reload to refresh your session.You switched accounts on another tab or window. Reload to refresh your session.Dismiss alert
Copy file name to clipboardExpand all lines: articles/ai-foundry/concepts/ai-red-teaming-agent.md
+2Lines changed: 2 additions & 0 deletions
Display the source diff
Display the rich diff
Original file line number
Diff line number
Diff line change
@@ -27,6 +27,8 @@ The AI Red Teaming Agent leverages Microsoft's open-source framework for Python
27
27
28
28
Together these components (scanning, evaluating, and reporting) help teams understand how AI systems respond to common attacks, ultimately guiding a comprehensive risk management strategy.
When thinking about AI-related safety risks developing trustworthy AI systems, Microsoft uses NIST's framework to mitigate risk effectively: Govern, Map, Measure, Manage. We'll focus on the last three parts in relation to the generative AI development lifecycle:
Copy file name to clipboardExpand all lines: articles/ai-foundry/concepts/ai-resources.md
+17-5Lines changed: 17 additions & 5 deletions
Display the source diff
Display the rich diff
Original file line number
Diff line number
Diff line change
@@ -120,14 +120,26 @@ You can use [cost management](/azure/cost-management-billing/costs/quick-acm-cos
120
120
121
121
## Find Azure AI Foundry resources in the Azure portal
122
122
123
-
In the Azure portal, you can find resources that correspond to your project in Azure AI Foundry portal.
123
+
In the [Azure portal](https://portal.azure.com), search for and then select **Azure AI Foundry** entry. From the AI Foundry section of the portal, you can find your AI Foundry resources.
124
+
125
+
- The **All resources** section lists all resources.
126
+
- The **AI Foundry** section lists [!INCLUDE [fdp](../includes/fdp-project-name.md)] resources.
127
+
- The **AI Hubs** section lists [!INCLUDE [hub](../includes/hub-project-name.md)] resources.
128
+
- The **Azure OpenAI** section lists Azure OpenAI resources.
129
+
- The **AI Search** section lists Azure AI Search resources.
130
+
- Use the **More services** and **Classic AI services** sections to find other Azure AI services.
131
+
132
+
:::image type="content" source="../media/portal/overview.png" lightbox="../media/portal/overview.png" alt-text="Screenshot of the Azure AI Foundry overview page in the Azure portal.":::
133
+
134
+
You can also go directly to your hub and project resources in the Azure portal from the Azure AI Foundry portal by using the following steps:
124
135
125
136
> [!NOTE]
126
-
> This section assumes that the hub and project are in the same resource group.
127
-
1. In [Azure AI Foundry](https://ai.azure.com), go to a project and select **Management center** to view your project resources.
128
-
1. From the management center, select the overview for either your hub or project and then select the link to **Manage in Azure portal**.
137
+
> These steps assume that the hub and project are in the same resource group.
138
+
139
+
1. In [Azure AI Foundry](https://ai.azure.com), go to the hub or project and select **Management center** to view your project resources.
140
+
1. From the management center, select the overview for the hub or project and then select the link to **Manage in Azure portal**.
129
141
130
-
:::image type="content" source="../media/concepts/azureai-project-view-ai-studio.png" alt-text="Screenshot of the Azure AI Foundry project overview page with links to the Azure portal." lightbox="../media/concepts/azureai-project-view-ai-studio.png":::
142
+
:::image type="content" source="../media/concepts/azureai-project-view-ai-studio.png" alt-text="Screenshot of the Azure AI Foundry project overview page with links to the Azure portal." lightbox="../media/concepts/azureai-project-view-ai-studio.png":::
Copy file name to clipboardExpand all lines: articles/ai-foundry/concepts/content-filtering.md
+1-1Lines changed: 1 addition & 1 deletion
Display the source diff
Display the rich diff
Original file line number
Diff line number
Diff line change
@@ -20,7 +20,7 @@ author: PatrickFarley
20
20
[Azure AI Foundry](https://ai.azure.com) includes a content filtering system that works alongside core models and image generation models.
21
21
22
22
> [!IMPORTANT]
23
-
> The content filtering system isn't applied to prompts and completions processed by the Whisper model in Azure OpenAI Service. Learn more about the [Whisper model in Azure OpenAI](../../ai-services/openai/concepts/models.md).
23
+
> The content filtering system isn't applied to prompts and completions processed by the Whisper model in Azure OpenAI in Azure AI Foundry Models. Learn more about the [Whisper model in Azure OpenAI](../../ai-services/openai/concepts/models.md).
Copy file name to clipboardExpand all lines: articles/ai-foundry/concepts/deployments-overview.md
+8-8Lines changed: 8 additions & 8 deletions
Display the source diff
Display the rich diff
Original file line number
Diff line number
Diff line change
@@ -19,15 +19,15 @@ The model catalog in Azure AI Foundry portal is the hub to discover and use a wi
19
19
20
20
Deployment options vary depending on the model offering:
21
21
22
-
***Azure OpenAI models:** The latest OpenAI models that have enterprise features from Azure with flexible billing options.
23
-
***Models-as-a-Service models:** These models don't require compute quota from your subscription and are billed per token in a pay-as-you-go fashion.
22
+
***Azure OpenAI in Azure AI Foundry Models:** The latest OpenAI models that have enterprise features from Azure with flexible billing options.
23
+
***Standard deployment:** These models don't require compute quota from your subscription and are billed per token in a pay-as-you-go fashion.
24
24
***Open and custom models:** The model catalog offers access to a large variety of models across modalities, including models of open access. You can host open models in your own subscription with a managed infrastructure, virtual machines, and the number of instances for capacity management.
25
25
26
26
Azure AI Foundry offers four different deployment options:
27
27
28
-
|Name | Azure OpenAI service | Azure AI model inference |Serverless API| Managed compute |
28
+
|Name | Azure OpenAI | Azure AI model inference |Standard deployment| Managed compute |
| Which models can be deployed? |[Azure OpenAI models](../../ai-services/openai/concepts/models.md)|[Azure OpenAI models and Models-as-a-Service](../../ai-foundry/model-inference/concepts/models.md)|[Models-as-a-Service](../how-to/model-catalog-overview.md#content-safety-for-models-deployed-via-serverless-apis)|[Open and custom models](../how-to/model-catalog-overview.md#availability-of-models-for-deployment-as-managed-compute)|
30
+
| Which models can be deployed? |[Azure OpenAI models](../../ai-services/openai/concepts/models.md)|[Azure OpenAI models and Standard deployment](../../ai-foundry/model-inference/concepts/models.md)|[Standard deployment](../how-to/model-catalog-overview.md#content-safety-for-models-deployed-via-serverless-apis)|[Open and custom models](../how-to/model-catalog-overview.md#availability-of-models-for-deployment-as-managed-compute)|
31
31
| Deployment resource | Azure OpenAI resource | Azure AI services resource | AI project resource | AI project resource |
32
32
| Requires Hubs/Projects | No | No | Yes | Yes |
33
33
| Data processing options | Regional <br /> Data-zone <br /> Global | Global | Regional | Regional |
@@ -37,7 +37,7 @@ Azure AI Foundry offers four different deployment options:
37
37
| Key-less authentication | Yes | Yes | No | No |
38
38
| Best suited when | You're planning to use only OpenAI models | You're planning to take advantage of the flagship models in Azure AI catalog, including OpenAI. | You're planning to use a single model from a specific provider (excluding OpenAI). | If you plan to use open models and you have enough compute quota available in your subscription. |
| Deployment instructions |[Deploy to Azure OpenAI Service](../how-to/deploy-models-openai.md)|[Deploy to Azure AI model inference](../model-inference/how-to/create-model-deployments.md)|[Deploy to Serverless API](../how-to/deploy-models-serverless.md)|[Deploy to Managed compute](../how-to/deploy-models-managed.md)|
40
+
| Deployment instructions |[Deploy to Azure OpenAI](../how-to/deploy-models-openai.md)|[Deploy to Azure AI model inference](../model-inference/how-to/create-model-deployments.md)|[Deploy to Standard deployment](../how-to/deploy-models-serverless.md)|[Deploy to Managed compute](../how-to/deploy-models-managed.md)|
41
41
42
42
<sup>1</sup> A minimal endpoint infrastructure is billed per minute. You aren't billed for the infrastructure that hosts the model in pay-as-you-go. After you delete the endpoint, no further charges accrue.
43
43
@@ -54,11 +54,11 @@ Azure AI Foundry encourages you to explore various deployment options and choose
54
54
55
55
* When you're looking to use a specific model:
56
56
57
-
* If you're interested in Azure OpenAI models, use the Azure OpenAI Service. This option is designed for Azure OpenAI models and offers a wide range of capabilities for them.
57
+
* If you're interested in Azure OpenAI models, use Azure OpenAI in Foundry Models. This option is designed for Azure OpenAI models and offers a wide range of capabilities for them.
58
58
59
-
* If you're interested in a particular model from Models-as-a-Service, and you don't expect to use any other type of model, use [Serverless API endpoints](../how-to/deploy-models-serverless.md). Serverless endpoints allow deployment of a single model under a unique set of endpoint URL and keys.
59
+
* If you're interested in a particular model from serverless pay per token offer, and you don't expect to use any other type of model, use [Standard deployment](../how-to/deploy-models-serverless.md). Standard deployments allow deployment of a single model under a unique set of endpoint URL and keys.
60
60
61
-
* When your model isn't available in Models-as-a-Service and you have compute quota available in your subscription, use [Managed Compute](../how-to/deploy-models-managed.md), which supports deployment of open and custom models. It also allows a high level of customization of the deployment inference server, protocols, and detailed configuration.
61
+
* When your model isn't available in standard deployment and you have compute quota available in your subscription, use [Managed Compute](../how-to/deploy-models-managed.md), which supports deployment of open and custom models. It also allows a high level of customization of the deployment inference server, protocols, and detailed configuration.
Copy file name to clipboardExpand all lines: articles/ai-foundry/concepts/encryption-keys-portal.md
+1-1Lines changed: 1 addition & 1 deletion
Display the source diff
Display the rich diff
Original file line number
Diff line number
Diff line change
@@ -127,7 +127,7 @@ Customer-managed key encryption is configured via Azure portal in a similar way
127
127
128
128
* The customer-managed key for encryption can only be updated to keys in the same Azure Key Vault instance.
129
129
* After deployment, your [!INCLUDE [fdp](../includes/fdp-project-name.md)] can't switch from Microsoft-managed keys to customer-managed keys or vice versa.
130
-
* Azure charges will continue to accrue during the soft delete retention period.
130
+
* Azure charges for the AI Foundry resource will continue to accrue during the soft delete retention period. Charges for projects don't continue to accrue during the soft delete retention period.
The Azure OpenAI Graders are a new set of evaluation graders available in the Azure AI Foundry SDK, aimed at evaluating the performance of AI models and their outputs. These graders including [Label grader](#label-grader), [String checker](#string-checker), [Text similarity](#text-similarity), and [General grader](#general-grader) can be run locally or remotely. Each grader serves a specific purpose in assessing different aspects of AI model/model outputs.
17
19
@@ -209,17 +211,17 @@ The grader also returns a metric indicating the overall dataset pass rate.
209
211
210
212
## General grader
211
213
212
-
Advanced users have the capability to import or define a custom grader and integrate it into the Azure OpenAI general grader. This allows for evaluations to be performed based on specific areas of interest aside from the existing Azure OpenAI graders. Following is an example to import the OpenAI `EvalStringCheckGrader` and construct it to be ran as an Azure OpenAI general grader on Foundry SDK.
214
+
Advanced users have the capability to import or define a custom grader and integrate it into the AOAI general grader. This allows for evaluations to be performed based on specific areas of interest aside from the existing AOAI graders. Following is an example to import the OpenAI `StringCheckGrader` and construct it to be ran as a AOAI general grader on Foundry SDK.
213
215
214
216
### Example
215
217
216
218
```python
217
-
from openai.types.eval_string_check_graderimportEvalStringCheckGrader
219
+
from openai.types.gradersimportStringCheckGrader
218
220
from azure.ai.evaluation import AzureOpenAIGrader
219
-
221
+
220
222
# Define an string check grader config directly using the OAI SDK
221
223
# Evaluation criteria: Pass if query column contains "Northwind"
0 commit comments