articles/ai-foundry/concepts/encryption-keys-portal.md (+3 -2)

@@ -1,7 +1,7 @@
 ---
 title: Customer-Managed Keys for Azure AI Foundry
 titleSuffix: Azure AI Foundry
-description: Learn about using customer-managed keys for encryption to improve data security with Azure AI Foundry.
+description: Learn how to use customer-managed keys (CMK) for enhanced encryption and data security in Azure AI Foundry. Configure Azure Key Vault integration and meet compliance requirements.
 ms.author: jburchel
 author: jonburchel
 ms.reviewer: deeikele
@@ -13,12 +13,13 @@ ms.custom:
 - build-aifnd
 - build-2025
 zone_pivot_groups: project-type
+ai-usage: ai-assisted
 # Customer intent: As an admin, I want to understand how I can use my own encryption keys with Azure AI Foundry.
 ---

 # Customer-managed keys for encryption with Azure AI Foundry

-Customer-managed key (CMK) encryption in [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs) provides enhanced control over the encryption of your data. By using a CMK, you can manage your own encryption keys to add an extra layer of protection and meet compliance requirements more effectively.
+Customer-managed key (CMK) encryption in [Azure AI Foundry](https://ai.azure.com/?cid=learnDocs) provides enhanced control over encryption of your data. Learn how to use customer-managed keys to add an extra layer of protection and meet compliance requirements more effectively with Azure Key Vault integration.
articles/ai-foundry/how-to/develop/trace-agents-sdk.md (+11 -11)

@@ -1,33 +1,33 @@
 ---
-title: How to trace your AI application
+title: View Trace Results for AI Agents in Azure AI Foundry
 titleSuffix: Azure AI Foundry
-description: This article provides instructions on how to trace your application with Azure AI Inference SDK.
+description: View trace results for AI agents using Azure AI Foundry SDK and OpenTelemetry. Learn to see execution traces, debug performance, and monitor AI agent behavior step-by-step.
 author: lgayhardt
 ms.author: lagayhar
 ms.reviewer: amibp
 ms.date: 08/21/2025
 ms.service: azure-ai-foundry
 ms.topic: how-to
-
+ai-usage: ai-assisted
 ---

-# Trace your AI agents using Azure AI Foundry portal and SDK (preview)
+# View trace results for AI agents in Azure AI Foundry (preview)

-This article walks you through how to instrument tracing in agents using Azure AI Foundry SDK with OpenTelemetry and Azure Monitor for enhanced observability and debugging.
+Learn how to view trace results for AI agents in Azure AI Foundry. This article shows you how to see execution traces, analyze agent behavior, and debug performance issues using Azure AI Foundry SDK with OpenTelemetry and Azure Monitor for enhanced observability.

 Determining the reasoning behind your agent's executions is important for troubleshooting and debugging. However, it can be difficult for complex agents for a number of reasons:
 * There could be a high number of steps involved in generating a response, making it hard to keep track of all of them.
 * The sequence of steps might vary based on user input.
 * The inputs/outputs at each stage might be long and deserve more detailed inspection.
 * Each step of an agent's runtime might also involve nesting. For example, an agent might invoke a tool, which uses another process, which then invokes another tool. If you notice strange or incorrect output from a top-level agent run, it might be difficult to determine exactly where in the execution the issue was introduced.

-Tracing solves this by allowing you to clearly see the inputs and outputs of each primitive involved in a particular agent run, in the order in which they were invoked.
+Trace results solve this by allowing you to view the inputs and outputs of each primitive involved in a particular agent run, displayed in the order they were invoked, making it easy to understand and debug your AI agent's behavior.

-## Tracing in the Azure AI Foundry Agents playground
+## View trace results in the Azure AI Foundry Agents playground

-The Agents playground in the Azure AI Foundry portal lets you trace threads and runs that your agents produce. To open a trace, select **Thread logs** in an active thread. You can also optionally select **Metrics** to enable automatic evaluations of the model's performance across several dimensions of **AI quality** and **Risk and safety**.
+The Agents playground in the Azure AI Foundry portal lets you view trace results for threads and runs that your agents produce. To see trace results, select **Thread logs** in an active thread. You can also optionally select **Metrics** to enable automatic evaluations of the model's performance across several dimensions of **AI quality** and **Risk and safety**.

 > [!NOTE]
 > Evaluation results are available for 24 hours before expiring. To get evaluation results, select your desired metrics and chat with your agent.
@@ -39,14 +39,14 @@ The Agents playground in the Azure AI Foundry portal lets you trace threads and

 :::image type="content" source="../../media/trace/trace-agent-playground.png" alt-text="A screenshot of the agent playground in the Azure AI Foundry portal." lightbox="../../media/trace/trace-agent-playground.png":::

-After selecting **Thread logs**, the screen that appears will let you view the: thread, run, run steps and any tool calls that were made. You can view the inputs and outputs between the agent and user, as well the associated metadata and any evaluations you selected.
+After selecting **Thread logs**, you can view trace results including: thread details, run information, run steps and any tool calls that were made. The trace results show you the inputs and outputs between the agent and user, as well the associated metadata and any evaluations you selected.

 :::image type="content" source="../../agents/media/thread-trace.png" alt-text="A screenshot of a trace." lightbox="../../agents/media/thread-trace.png":::

 > [!TIP]
-> If you want to view the trace of a previous thread, select **My threads** in the **Agents** screen. Choose a thread, and then select **Try in playground**.
+> If you want to view trace results from a previous thread, select **My threads** in the **Agents** screen. Choose a thread, and then select **Try in playground**.
 > :::image type="content" source="../../agents/media/thread-highlight.png" alt-text="A screenshot of the threads screen." lightbox="../../agents/media/thread-highlight.png":::
-> You will be able to see the **Thread logs** button at the top of the screen to view the trace.
+> You will be able to see the **Thread logs** button at the top of the screen to view the trace results.
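The instrumentation that the rewritten trace-agents-sdk.md intro refers to (Azure AI Foundry SDK with OpenTelemetry and Azure Monitor) could look roughly like the sketch below. This is not the article's own sample: it assumes the `azure-monitor-opentelemetry` and `opentelemetry-api` packages, a placeholder Application Insights connection string, and illustrative span and attribute names.

```python
# Minimal sketch of exporting agent-run spans to Azure Monitor so they appear
# in the Foundry Tracing view. The connection-string placeholder and the
# span/attribute names are assumptions for illustration only.
from azure.monitor.opentelemetry import configure_azure_monitor
from opentelemetry import trace

# Send OpenTelemetry traces to the Application Insights resource connected to
# your Foundry project.
configure_azure_monitor(
    connection_string="<application-insights-connection-string>"
)

tracer = trace.get_tracer(__name__)


def run_agent(user_message: str) -> str:
    """Wrap an agent run in spans so each step is visible as a nested trace."""
    with tracer.start_as_current_span("agent_run") as run_span:
        run_span.set_attribute("input.message", user_message)
        with tracer.start_as_current_span("tool_call") as tool_span:
            tool_span.set_attribute("tool.name", "get_weather")  # hypothetical tool
            tool_result = "72 F and sunny"                        # placeholder result
        reply = f"The forecast is {tool_result}."                 # placeholder reply
        run_span.set_attribute("output.message", reply)
        return reply


if __name__ == "__main__":
    print(run_agent("What's the weather in Seattle?"))
```

Spans emitted this way show up with the same kind of nesting the article calls out for tool calls, alongside the thread and run data visible in **Thread logs**.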
articles/ai-foundry/how-to/develop/trace-application.md (+30 -7)

@@ -1,20 +1,21 @@
 ---
-title: How to trace AI applications using OpenAI SDK
+title: View Trace Results for AI Applications using OpenAI SDK
 titleSuffix: Azure AI Foundry
-description: Learn how to trace applications that use OpenAI SDK in Azure AI Foundry
+description: View trace results for AI applications using OpenAI SDK with OpenTelemetry in Azure AI Foundry. See execution traces, diagnose issues, and monitor application performance.
 author: lgayhardt
 ms.author: lagayhar
 ms.reviewer: ychen
 ms.date: 08/29/2025
 ms.service: azure-ai-foundry
 ms.topic: how-to
+ai-usage: ai-assisted
 ---

-# Trace AI applications using OpenAI SDK
+# View trace results for AI applications using OpenAI SDK

-Tracing provides deep visibility into execution of your application by capturing detailed telemetry at each execution step. This helps diagnose issues and enhance performance by identifying problems such as inaccurate tool calls, misleading prompts, high latency, low-quality evaluation scores, and more.
+Learn how to view trace results that provide deep visibility into AI application execution. See detailed telemetry captured at each step to diagnose issues and enhance performance by identifying problems such as inaccurate tool calls, misleading prompts, high latency, and low-quality evaluation scores.

-This article explains how to implement tracing for AI applications using **OpenAI SDK** with OpenTelemetry in Azure AI Foundry.
+This article explains how to view trace results for AI applications using **OpenAI SDK** with OpenTelemetry in Azure AI Foundry.

 ## Prerequisites

@@ -66,6 +67,28 @@ The following steps show how to configure your resource:
 > [!IMPORTANT]
 > Using a project's endpoint requires configuring Microsoft Entra ID in your application. If you don't have Entra ID configured, use the Azure Application Insights connection string as indicated in step 3 of the tutorial.

+## View trace results in Azure AI Foundry portal
+
+Once you have tracing configured and your application is instrumented, you can view trace results in the Azure AI Foundry portal:
+
+1. Go to [Azure AI Foundry portal](https://ai.azure.com/?cid=learnDocs) and navigate to your project.
+
+1. On the side navigation bar, select **Tracing**.
+
+1. You'll see a list of trace results from your instrumented applications. Each trace shows:
+    - **Trace ID**: Unique identifier for the trace
+    - **Start time**: When the trace began
+    - **Duration**: How long the operation took
+    - **Status**: Success or failure status
+    - **Operations**: Number of spans in the trace
+
+1. Select any trace to view detailed trace results including:
+    - Complete execution timeline
+    - Input and output data for each operation
+    - Performance metrics and timing
+    - Error details if any occurred
+    - Custom attributes and metadata
+
 ## Instrument the OpenAI SDK

 When developing with the OpenAI SDK, you can instrument your code so traces are sent to Azure AI Foundry. Follow these steps to instrument your code:
@@ -126,7 +149,7 @@ When developing with the OpenAI SDK, you can instrument your code so traces are
     )
     ```

-1. If you go back to Azure AI Foundry portal, you should see the trace displayed:
+1. If you go back to Azure AI Foundry portal, you can view the trace results:

     :::image type="content" source="../../media/how-to/develop/trace-application/tracing-display-simple.png" alt-text="A screenshot showing how a simple chat completion request is displayed in the trace." lightbox="../../media/how-to/develop/trace-application/tracing-display-simple.png":::

@@ -167,7 +190,7 @@ When developing with the OpenAI SDK, you can instrument your code so traces are
     return responses
     ```

-1. Traces look as follows:
+1. Trace results look as follows:

     :::image type="content" source="../../media/how-to/develop/trace-application/tracing-display-decorator.png" alt-text="A screenshot showing how a method using a decorator is displayed in the trace." lightbox="../../media/how-to/develop/trace-application/tracing-display-decorator.png":::
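As a rough companion to the trace-application.md changes above, instrumenting the OpenAI SDK and sending the resulting trace results to a project's Application Insights resource might look like the following. This is a sketch, not the article's listing: it assumes the `opentelemetry-instrumentation-openai-v2` and `azure-monitor-opentelemetry` packages, and the endpoint, deployment name, API version, and environment variables are placeholders.

```python
# Sketch: instrument OpenAI SDK calls and export the spans to Azure Monitor.
# The instrumentation package, endpoint, deployment name, API version, and
# environment variables below are assumptions, not values from the article.
import os

from azure.monitor.opentelemetry import configure_azure_monitor
from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor
from openai import AzureOpenAI

# Route traces to the Application Insights resource behind the project.
configure_azure_monitor(
    connection_string=os.environ["APPLICATIONINSIGHTS_CONNECTION_STRING"]
)

# Emit a span for every chat completion call made through the OpenAI SDK.
OpenAIInstrumentor().instrument()

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-10-21",
)

response = client.chat.completions.create(
    model="<your-deployment-name>",  # the deployment name, not the base model
    messages=[{"role": "user", "content": "Summarize OpenTelemetry in one sentence."}],
)
print(response.choices[0].message.content)
```

After a request like this runs, its trace should appear on the project's **Tracing** page described in the new portal section of the diff.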
articles/ai-foundry/how-to/evaluate-results.md (+10 -8)

@@ -1,7 +1,7 @@
 ---
-title: View Evaluation Results in the Azure AI Foundry Portal
+title: See Evaluation Results in Azure AI Foundry Portal
 titleSuffix: Azure AI Foundry
-description: This article provides instructions on how to view evaluation results in the Azure AI Foundry portal.
+description: See and analyze AI model evaluation results in Azure AI Foundry portal. Learn to view performance metrics, compare results, and interpret evaluation data for model optimization.
 ms.service: azure-ai-foundry
 ms.custom:
 - ignite-2023
@@ -12,21 +12,23 @@ ms.date: 05/19/2025
 ms.reviewer: mithigpe
 ms.author: lagayhar
 author: lgayhardt
+ai-usage: ai-assisted
 ---

-# View evaluation results in the Azure AI Foundry portal
+# See evaluation results in the Azure AI Foundry portal

-You can use the Azure AI Foundry portal evaluation page to visualize and assess your results. You can use it as a control center to optimize, troubleshoot, and select the ideal AI model for your deployment needs. The portal can help you with data-driven decision-making and performance enhancement in your Azure AI Foundry projects. You can access and interpret the results from various sources, including your flow, the playground quick test session, evaluation submission UI, and SDK. You have the flexibility to interact with your results in a way that best suits your workflow and preferences.
+Learn how to see evaluation results in the Azure AI Foundry portal. This guide shows you how to view and interpret AI model evaluation data, performance metrics, and quality assessments. Access evaluation results from flows, playground sessions, and SDK to make data-driven decisions about your AI models.

 After you visualize your evaluation results, you can dive into a thorough examination. You can view individual results and compare these results across multiple evaluation runs. You can identify trends, patterns, and discrepancies, which helps you gain invaluable insights into the performance of your AI system under various conditions.

 In this article, you learn how to:

-- View evaluation result and metrics.
-- Compare evaluation results.
-- Improve performance.
+- See evaluation results and metrics.
+- View detailed evaluation data.
+- Compare evaluation results across runs.
+- Interpret performance indicators.

-## Find your evaluation results
+## See your evaluation results

 After you submit your evaluation, you can locate the submitted evaluation run within the run list. Go to the **Evaluation** page.
articles/ai-foundry/how-to/fine-tune-managed-compute.md (+4 -4)

@@ -1,7 +1,7 @@
 ---
-title: Fine-tune models using a managed compute with Azure AI Foundry portal (preview)
+title: Deploy Fine-Tuned Models with Managed Compute in Azure AI Foundry
 titleSuffix: Azure AI Foundry
-description: Learn how to fine-tune models using a managed compute with Azure AI Foundry.
+description: Deploy fine-tuned models using managed compute in Azure AI Foundry portal. Step-by-step guide to fine-tune, train, and deploy custom models with GPU compute resources.
 ms.service: azure-ai-foundry
 ms.topic: how-to
 ms.date: 08/15/2025
@@ -13,15 +13,15 @@ author: ssalgadodev
 ms.custom:
 - references_regions
 - hub-only
-
+ai-usage: ai-assisted
 #customer intent: As a data scientist using a managed compute, I want to learn how to fine-tune models to improve model performance for specific tasks.
 ---

 # Fine-tune models using managed compute (preview)

-This article explains how to use a managed compute to fine-tune a model in the Azure AI Foundry portal. Fine-tuning involves adapting a pretrained model to a new, related task or domain. When you use a managed compute for fine-tuning, you use your computational resources to adjust training parameters such as learning rate, batch size, and number of training epochs to optimize the model's performance for a specific task.
+Learn how to deploy fine-tuned models using managed compute in Azure AI Foundry portal. This guide walks you through the complete process of fine-tuning and deploying custom models by adjusting training parameters such as learning rate, batch size, and training epochs to optimize performance for your specific use cases.

 Fine-tuning a pretrained model to use for a related task is more efficient than building a new model, as fine-tuning builds upon the pretrained model's existing knowledge and reduces the time and data needed for training.
articles/ai-foundry/how-to/fine-tune-serverless.md (+4 -3)

@@ -1,7 +1,7 @@
 ---
-title: Fine-tune models using serverless API deployments in Azure AI Foundry portal
+title: Deploy Fine-Tuned Models with Serverless API in Azure AI Foundry
 titleSuffix: Azure AI Foundry
-description: Learn how to fine-tune models deployed via serverless API deployments in Azure AI Foundry.
+description: Deploy fine-tuned models using serverless API in Azure AI Foundry. Complete guide to fine-tune, train, and deploy custom large language models with cost-effective serverless options.
 ms.service: azure-ai-foundry
 ms.topic: how-to
 ms.date: 04/14/2025
@@ -12,13 +12,14 @@ manager: nitinme
 author: ssalgadodev
 ms.custom: references_regions
 zone_pivot_groups: azure-ai-model-fine-tune
+ai-usage: ai-assisted
 ---

 # Fine-tune models using serverless API deployments in Azure AI Foundry

-Azure AI Foundry enables you to customize large language models to your specific datasets through a process called fine-tuning. This process offers significant benefits by allowing for customization and optimization tailored to specific tasks and applications. The advantages include improved performance, cost efficiency, reduced latency, and tailored outputs.
+Learn how to deploy fine-tuned models using serverless API deployments in Azure AI Foundry. This comprehensive guide shows you how to fine-tune large language models to your specific datasets and deploy them with serverless infrastructure, offering improved performance, cost efficiency, reduced latency, and tailored outputs.

 **Cost Efficiency**: Azure AI Foundry's fine-tuning can be more cost-effective, especially for large-scale deployments, thanks to pay-as-you-go pricing.
articles/ai-foundry/openai/chatgpt-quickstart.md (+5 -4)

@@ -1,7 +1,7 @@
 ---
-title: 'Quickstart - Get started using chat completions with Azure OpenAI in Azure AI Foundry Models'
+title: 'Get Answers with Azure OpenAI Chat Completions'
 titleSuffix: Azure OpenAI
-description: Walkthrough on how to get started using chat completions with Azure OpenAI.
+description: Get answers using Azure OpenAI chat completions in Azure AI Foundry Models. Learn how to ask questions and get AI responses with GPT models and API integration.

-# Quickstart: Get started using chat completions with Azure OpenAI in Azure AI Foundry Models
+# Quickstart: Get answers using Azure OpenAI chat completions

-Use this article to get started using Azure OpenAI.
+Learn how to get answers using Azure OpenAI chat completions. This guide shows you how to ask questions and receive AI-powered responses from GPT models.
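For orientation, a minimal chat-completions call of the kind this quickstart diff describes might look like the following. It assumes the `openai` and `azure-identity` Python packages, Microsoft Entra ID (keyless) authentication as one option, and placeholder endpoint, deployment, and API-version values.

```python
# Sketch: ask a question and get an answer from an Azure OpenAI chat deployment.
# Endpoint, deployment name, and API version are placeholders; keyless auth via
# Microsoft Entra ID is one option, and an API key is another.
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI

token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    azure_ad_token_provider=token_provider,
    api_version="2024-10-21",
)

response = client.chat.completions.create(
    model="<your-deployment-name>",  # the name of your GPT model deployment
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Does Azure OpenAI support customer managed keys?"},
    ],
)
print(response.choices[0].message.content)
```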
articles/ai-foundry/openai/how-to/fine-tuning.md (+3 -2)

@@ -1,7 +1,7 @@
 ---
 title: 'Customize a model with Azure OpenAI in Azure AI Foundry Models'
 titleSuffix: Azure OpenAI
-description: Learn how to create your own customized model with Azure OpenAI by using Python, the REST APIs, or Azure AI Foundry portal.
+description: Learn how to fine-tune and customize Azure OpenAI models using Python, REST APIs, or Azure AI Foundry portal. Improve model performance with LoRA adaptation and custom datasets.

-Azure OpenAI in Azure AI Foundry Models lets you tailor our models to your personal datasets by using a process known as *fine-tuning*. This customization step lets you get more out of the service by providing:
+Learn how to fine-tune Azure OpenAI models in Azure AI Foundry Models to customize them for your specific datasets and use cases. Fine-tuning enables you to get more out of the service by providing:

 - Higher quality results than what you can get just from [prompt engineering](../concepts/prompt-engineering.md)
 - The ability to train on more examples than can fit into a model's max request context limit.
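To make the fine-tuning workflow named in the description above concrete, a hedged sketch of creating a fine-tuning job with the OpenAI Python SDK follows. The training file name, base model, and API version are illustrative assumptions rather than values from the article; check the article's supported-models list for what your resource can actually fine-tune.

```python
# Sketch: upload training data and start a fine-tuning job with the OpenAI
# Python SDK against Azure OpenAI. File name, model, and API version are
# illustrative assumptions, not values taken from the article.
import os
import time

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-10-21",
)

# Training data is a JSONL file of chat-formatted examples.
training_file = client.files.create(
    file=open("training_data.jsonl", "rb"), purpose="fine-tune"
)

job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-4o-mini-2024-07-18",  # example of a fine-tunable base model
)

# Poll until the job finishes; production code would add error handling.
while True:
    job = client.fine_tuning.jobs.retrieve(job.id)
    print(f"Status: {job.status}")
    if job.status in ("succeeded", "failed", "cancelled"):
        break
    time.sleep(60)
```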