
Commit c907e77

Merge pull request #5030 from MicrosoftDocs/main

5/19/2025 AM Publish

2 parents: 448fa79 + ed36f6d

File tree: 85 files changed, +3696 −1757 lines


articles/ai-foundry/how-to/create-projects.md

Lines changed: 1 addition & 1 deletion

@@ -176,7 +176,7 @@ In addition, many resources are only accessible by users in your project workspa
 
 ## Related content
 
-- [Deploy an enterprise chat web app](../tutorials/deploy-chat-web-app.md)
+- [Quickstart: Get started with Azure AI Foundry](../quickstarts/get-started-code.md?pivots=hub-project)
 
 - [Learn more about Azure AI Foundry](../what-is-azure-ai-foundry.md)

articles/ai-foundry/how-to/develop/trace-application.md

Lines changed: 67 additions & 52 deletions

@@ -57,38 +57,37 @@ To view traces in Azure AI Foundry, you need to connect an Application Insights
 To trace the content of chat messages, set the `AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED` environment variable to true (case insensitive). Keep in mind this might contain personal data. To learn more, see [Azure Core Tracing OpenTelemetry client library for Python](/python/api/overview/azure/core-tracing-opentelemetry-readme).
 
 ```python
-from opentelemetry import trace
-from azure.monitor.opentelemetry import configure_azure_monitor
-from azure.ai.projects import AIProjectClient
-from azure.identity import DefaultAzureCredential
 import os
-
 os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true" # False by default
+```
+Let's begin instrumenting our agent with OpenTelemetry tracing, starting by authenticating and connecting to your Azure AI project using the `AIProjectClient`.
 
+```python
+from azure.ai.projects import AIProjectClient
+from azure.identity import DefaultAzureCredential
 project_client = AIProjectClient.from_connection_string(
     credential=DefaultAzureCredential(),
-    conn_str=os.environ["PROJECT_CONNECTION_STRING"],
+    endpoint=os.environ["PROJECT_ENDPOINT"],
 )
 ```
 
-### Log to Azure Monitor Application Insights
-
-Retrieve the connection string from the Application Insights resource connected to your project and set up the OTLP exporters to send telemetry into Azure Monitor.
+Next, retrieve the connection string from the Application Insights resource connected to your project and set up the OTLP exporters to send telemetry into Azure Monitor.
 
 ```python
+from azure.monitor.opentelemetry import configure_azure_monitor
 connection_string = project_client.telemetry.get_connection_string()
 
 if not connection_string:
-    print("Application Insights is not enabled. Enable by going to Observability > Traces in your AI Foundry project.")
+    print("Application Insights is not enabled. Enable by going to Tracing in your Azure AI Foundry project.")
     exit()
 
-configure_azure_monitor(connection_string=connection_string)
+configure_azure_monitor(connection_string=connection_string)  # enable telemetry collection
 ```
 
-Start collecting telemetry and send to your project's connected Application Insights resource.
+Now, trace the code where you create and run your agent and user message in your Azure AI project, so you can see detailed steps for troubleshooting or monitoring.
 
 ```python
-# Start tracing
+from opentelemetry import trace
 tracer = trace.get_tracer(__name__)
 
 with tracer.start_as_current_span("example-tracing"):
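The content-recording switch shown at the top of this hunk is just an environment variable, so you can sanity-check its behavior locally before wiring up any Azure resources. This is a hypothetical, stdlib-only sketch — the helper name `content_recording_enabled` is ours, not part of the SDK — of the case-insensitive check the tracing library performs:

```python
import os

# Hypothetical helper (not an SDK API) mimicking the case-insensitive
# check on the content-recording environment variable.
def content_recording_enabled() -> bool:
    value = os.environ.get("AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED", "false")
    return value.strip().lower() == "true"

os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "True"
print(content_recording_enabled())  # True: the value is matched case-insensitively
```

Because the match is case-insensitive, `"true"`, `"True"`, and `"TRUE"` all enable recording; anything else leaves it off.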
@@ -104,23 +103,59 @@ with tracer.start_as_current_span("example-tracing"):
     run = project_client.agents.create_run(thread_id=thread.id, agent_id=agent.id)
 ```
 
-### Log to a local OTLP endpoint
+After running your agent, you can begin to [view traces in Azure AI Foundry portal](#view-traces-in-azure-ai-foundry-portal).
+
+### Log traces locally
 
-To connect to Aspire Dashboard or another OpenTelemetry compatible backend, install the OpenTelemetry Protocol (OTLP) exporter. This enables you to print traces to the console or use a local viewer such as Aspire Dashboard.
+To connect to [Aspire Dashboard](https://aspiredashboard.com/#start) or another OpenTelemetry compatible backend, install the OpenTelemetry Protocol (OTLP) exporter. This enables you to print traces to the console or use a local viewer such as Aspire Dashboard.
 
 ```bash
-pip install opentelemetry-exporter-otlp
+pip install azure-core-tracing-opentelemetry opentelemetry-exporter-otlp opentelemetry-sdk
 ```
+Next, configure tracing for your application.
 
 ```python
-# Enable console tracing
-project_client.telemetry.enable(destination=sys.stdout)
+from azure.core.settings import settings
+settings.tracing_implementation = "opentelemetry"
 
-# for local OTLP endpoint, change the destination to
-# project_client.telemetry.enable(destination="http://localhost:4317")
+from opentelemetry import trace
+from opentelemetry.sdk.trace import TracerProvider
+from opentelemetry.sdk.trace.export import SimpleSpanProcessor, ConsoleSpanExporter
+
+# Set up tracing to the console
+span_exporter = ConsoleSpanExporter()
+tracer_provider = TracerProvider()
+tracer_provider.add_span_processor(SimpleSpanProcessor(span_exporter))
+trace.set_tracer_provider(tracer_provider)
 ```
+Use `enable_telemetry` to begin collecting telemetry.
 
-### Trace custom functions
+```python
+import sys
+from azure.ai.projects import enable_telemetry
+enable_telemetry(destination=sys.stdout)
+
+# To log to a local OTLP endpoint, change the destination to
+# enable_telemetry(destination="http://localhost:4317")
+```
+```python
+# Start tracing
+from opentelemetry import trace
+tracer = trace.get_tracer(__name__)
+
+with tracer.start_as_current_span("example-tracing"):
+    agent = project_client.agents.create_agent(
+        model=os.environ["MODEL_DEPLOYMENT_NAME"],
+        name="my-assistant",
+        instructions="You are a helpful assistant",
+    )
+    thread = project_client.agents.create_thread()
+    message = project_client.agents.create_message(
+        thread_id=thread.id, role="user", content="Tell me a joke"
+    )
+    run = project_client.agents.create_run(thread_id=thread.id, agent_id=agent.id)
+```
+
+## Trace custom functions
 
 To trace your custom functions, use the OpenTelemetry SDK to instrument your code.

@@ -149,34 +184,26 @@ custom_function()
 
 For detailed instructions and advanced usage, refer to the [OpenTelemetry documentation](https://opentelemetry.io/docs/).
 
-### Attach user feedback to traces
+## Attach user feedback to traces
 
-To attach user feedback to traces and visualize it in the Azure AI Foundry portal, you can instrument your application to enable tracing and log user feedback using OpenTelemetry's semantic conventions. By correlating feedback traces with their respective chat request traces using the response ID, you can view and manage these traces in Azure AI Foundry portal. OpenTelemetry's specification allows for standardized and enriched trace data, which can be analyzed in Azure AI Foundry portal for performance optimization and user experience insights. This approach helps you use the full power of OpenTelemetry for enhanced observability in your applications.
+To attach user feedback to traces and visualize it in the Azure AI Foundry portal, you can instrument your application to enable tracing and log user feedback using OpenTelemetry's semantic conventions.
 
-To log user feedback, follow this format:
 
-The user feedback evaluation event can be captured if and only if the user provided a reaction to the GenAI model response. It SHOULD, when possible, be parented to the GenAI span describing such response.
 
-<!-- prettier-ignore-start -->
-<!-- markdownlint-capture -->
-<!-- markdownlint-disable -->
-The event name MUST be `gen_ai.evaluation.user_feedback`.
+By correlating feedback traces with their respective chat request traces using the response ID or thread ID, you can view and manage these traces in Azure AI Foundry portal. OpenTelemetry's specification allows for standardized and enriched trace data, which can be analyzed in Azure AI Foundry portal for performance optimization and user experience insights. This approach helps you use the full power of OpenTelemetry for enhanced observability in your applications.
+
+To log user feedback, follow this format:
 
-| Attribute | Type | Description | Examples | [Requirement Level](https://opentelemetry.io/docs/specs/semconv/general/attribute-requirement-level/) | Stability |
-|---|---|---|---|---|---|
-| `gen_ai.response.id` | string | The unique identifier for the completion. | `chatcmpl-123` | `Required` | ![Experimental](https://img.shields.io/badge/-experimental-blue) |
-| `gen_ai.evaluation.score` | double | Quantified score calculated based on the user reaction in [-1.0, 1.0] range with 0 representing a neutral reaction. | `0.42` | `Recommended` | ![Experimental](https://img.shields.io/badge/-experimental-blue) |
+The user feedback evaluation event can be captured if and only if the user provided a reaction to the GenAI model response. It SHOULD, when possible, be parented to the GenAI span describing such response.
 
-<!-- markdownlint-restore -->
-<!-- prettier-ignore-end -->
 
 The user feedback event body has the following structure:
 
 | Body Field | Type | Description | Examples | Requirement Level |
 |---|---|---|---|---|
 | `comment` | string | Additional details about the user feedback | `"I did not like it"` | `Opt-in` |
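As an illustration of the conventions above, a feedback event could be assembled like this. This is a hypothetical, stdlib-only sketch — `build_user_feedback_event` is our name, not an SDK API, and real instrumentation would emit the event through an OpenTelemetry logger or as a span event rather than a plain dict:

```python
import time

# Hypothetical sketch (not an SDK API): assemble a gen_ai.evaluation.user_feedback
# event following the attribute and body tables above.
def build_user_feedback_event(response_id, score, comment=None):
    event = {
        "name": "gen_ai.evaluation.user_feedback",
        "timestamp_ns": time.time_ns(),
        "attributes": {
            "gen_ai.response.id": response_id,       # Required: correlates feedback with the completion
            "gen_ai.evaluation.score": float(score), # Recommended: [-1.0, 1.0], 0 = neutral
        },
        "body": {},
    }
    if comment is not None:
        event["body"]["comment"] = comment           # Opt-in body field
    return event

event = build_user_feedback_event("chatcmpl-123", 0.42, "I did not like it")
print(event["attributes"]["gen_ai.response.id"])  # chatcmpl-123
```

Keeping `gen_ai.response.id` identical to the ID on the original chat-response span is what lets the portal join the feedback to the right request.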

-### Using service name in trace data
+## Using service name in trace data
 
 To identify your service via a unique ID in Application Insights, you can use the service name OpenTelemetry property in your trace data. This is useful if you're logging data from multiple applications to the same Application Insights resource, and you want to differentiate between them.

@@ -190,7 +217,7 @@ To query trace data for a given service name, query for the `cloud_roleName` pro
 | where cloud_RoleName == "service_name"
 ```
 
-## Enable Tracing for Langchain
+## Enable tracing for Langchain
 
 You can enable tracing for Langchain that follows OpenTelemetry standards as per [opentelemetry-instrumentation-langchain](https://pypi.org/project/opentelemetry-instrumentation-langchain/). To enable tracing for Langchain, install the package `opentelemetry-instrumentation-langchain` using your package manager, like pip:

@@ -200,25 +227,13 @@ pip install opentelemetry-instrumentation-langchain
 
 Once the necessary packages are installed, you can easily begin to [instrument tracing in your code](#instrument-tracing-in-your-code).
 
-## Visualize your traces
-
-### View your traces for local debugging
-
-#### Prompty
-
-Using Prompty, you can trace your application with **Open Telemetry**, which offers enhanced visibility and simplified troubleshooting for LLM-based applications. This method adheres to the OpenTelemetry specification, enabling the capture and visualization of an AI application's internal execution details, which improves debugging and enhances the development process. To learn more, see [Debugging Prompty](https://prompty.ai/docs/getting-started/debugging-prompty).
-
-#### Aspire Dashboard
-
-Aspire Dashboard is a free & open-source OpenTelemetry dashboard for deep insights into your apps on your local development machine. To learn more, see [Aspire Dashboard](https://aspiredashboard.com/#start).
-
-### Debugging with traces in Azure AI Foundry portal
+## View traces in Azure AI Foundry portal
 
 In your project, go to `Tracing` to filter your traces as you see fit.
 
-By selecting a trace, I can step through each span and identify issues while observing how my application is responding.
+By selecting a trace, you can step through each span and identify issues while observing how your application is responding. This can help you debug and pinpoint issues in your application.
 
-### View traces in Azure Monitor
+## View traces in Azure Monitor
 
 If you logged traces using the previous code snippet, then you're all set to view your traces in Azure Monitor Application Insights. You can open Application Insights from **Manage data source** and use the **End-to-end transaction details view** to investigate further.

articles/ai-foundry/includes/create-project-fdp.md

Lines changed: 4 additions & 4 deletions

@@ -18,11 +18,11 @@ ms.custom: include
 * This project type gives you the best support for:
 
     * Agents
-    * Azure OpenAI models
-    * Model inferencing
+    * AI Model Inference, including Azure OpenAI
     * AI Foundry API that works with agents and across models
-    * Upload files without needing your own Azure Storage account
+    * Project files (directly upload files and start experimenting)
     * Evaluations
+    * Fine-tuning
     * Playgrounds
 
 ## Prerequisites

@@ -209,5 +209,5 @@ az cognitiveservices account connection show --name {my_project_name} --resource
 
 ## Related content
 
-- [Deploy an enterprise chat web app](../tutorials/deploy-chat-web-app.md)
+- [Quickstart: Get started with Azure AI Foundry](../quickstarts/get-started-code.md?pivots=fdp-project)
 - [Learn more about Azure AI Foundry](../what-is-azure-ai-foundry.md)

articles/ai-foundry/includes/create-second-fdp-project.md

Lines changed: 4 additions & 3 deletions

@@ -66,13 +66,14 @@ Your first project (default project) plays a special role and has access to more
 
 # [Azure CLI](#tab/azurecli)
 
-Use your existing values for {my_resource_group} and {foundry_resource_name} to add another project to the resource:
+<!-- Use your existing values for {my_resource_group} and {foundry_resource_name} to add another project to the resource:
 
 ```azurecli
 az cognitiveservices account project create --resource-group {my_resource_group} --name {my_project_name} --account-name {foundry_resource_name}
 ```
-
-
+-->
+CLI commands aren't currently available for creating a [!INCLUDE [fdp-project-name](fdp-project-name.md)].
+
 
 ---
 
 * If you delete your Foundry resource's default project, the next project created will become the default project.

articles/ai-foundry/quickstarts/get-started-code.md

Lines changed: 1 addition & 1 deletion

@@ -4,7 +4,7 @@ titleSuffix: Azure AI Foundry
 description: This article provides instructions on how to start using the Azure AI Foundry portal and the Azure AI Foundry SDK.
 manager: scottpolly
 ms.service: azure-ai-foundry
-ms.custom: build-2024, devx-track-azurecli, devx-track-python, ignite-2024, update-code3
+ms.custom: build-2024, devx-track-azurecli, devx-track-python, ignite-2024, update-code4
 ms.topic: how-to
 ms.date: 05/12/2025
 ms.reviewer: dantaylo

articles/ai-foundry/toc.yml

Lines changed: 2 additions & 0 deletions

@@ -466,6 +466,8 @@ items:
 - name: Vision fine-tuning
   href: ../ai-services/openai/how-to/fine-tuning-vision.md?context=/azure/ai-foundry/context/context
   displayName: finetuning, fine-tuning
+- name: Reinforcement fine-tuning
+  href: ../ai-services/openai/how-to/reinforcement-fine-tuning.md?context=/azure/ai-foundry/context/context
 - name: Preference fine-tuning
   href: ../ai-services/openai/how-to/fine-tuning-direct-preference-optimization.md?context=/azure/ai-foundry/context/context
   displayName: finetuning, fine-tuning

articles/ai-foundry/tutorials/screen-reader.md

Lines changed: 1 addition & 1 deletion

@@ -89,7 +89,7 @@ The chat session pane is where you can chat to the model and test out your assis
 
 Azure AI Foundry has two different project types - see [What is Azure AI Foundry?](../what-is-azure-ai-foundry.md#project-types). The type appears in the **Type** column in the **All resources** view. In the recent resources picker, the type is in a second line under the project name.
 
-- Listen for **(AI Services)** for a [!INCLUDE [fdp-project-name](../includes/fdp-project-name.md)].
+- Listen for either **(AI Foundry)** or **Foundry project** for a [!INCLUDE [fdp-project-name](../includes/fdp-project-name.md)].
 - Listen for **(Hub)** for a [!INCLUDE [hub-project-name](../includes/hub-project-name.md)].

articles/ai-foundry/what-is-azure-ai-foundry.md

Lines changed: 7 additions & 11 deletions

@@ -51,15 +51,14 @@ This table summarizes features available in the two project types:
 | Capability | [!INCLUDE [fdp](includes/fdp-project-name.md)] | [!INCLUDE[hub](includes/hub-project-name.md)] |
 | --- | --- | --- |
 | Agents | ✅ (GA) | ✅ (Preview only) |
-| Azure OpenAI models | ✅ | |
-| Model inferencing | ✅ | |
-| AI Foundry API that works with agents and across models | ✅ | |
-| Common filestore | ✅ | |
-| Project-level isolation of files and outputs | ✅ | ✅ |
+| Azure OpenAI models | ✅ (Native support) | Available via connections |
+| Project files (directly upload files and start experimenting) | ✅ | |
+| Project-level isolation of files and outputs | ✅ | ✅ |
 | Evaluations | ✅ | ✅ |
 | Playground | ✅ | ✅ |
 | Prompt flow | | ✅ |
 | Managed compute | | ✅ |
+| Required Azure dependencies | - | Azure Storage account, Azure Key Vault |
 
 ## Navigate in the Azure AI Foundry portal

@@ -70,7 +69,7 @@ The left pane is organized around your goals. Generally, as you develop with Azu
 * **Define and explore**. In this stage you define your project goals, and then explore and test models and services against your use case to find the ones that enable you to achieve your goals.
 * **Build and customize**. In this stage, you're actively building solutions and applications with the models, tools, and capabilities you selected. You can also customize models to perform better for your use case by fine-tuning, grounding in your data, and more. Building and customizing might be something you choose to do in the Azure AI Foundry portal, or through code and the Azure AI Foundry SDKs. Either way, a project provides you with everything you need.
 * Once you're actively developing in your project, the **Overview** page shows the things you want easy access to, like your endpoints and keys.
-* **Assess and improve**. In this stage, you're looking for where you can improve your application's performance. You might choose to use tools like tracing to debug your application or compare evaluations to hone in on how you want your application to behave. You can also integrate with safety & security systems so you can be confident when you take your application to production.
+* **Observe and improve**. In this stage, you're looking for where you can improve your application's performance. You might choose to use tools like tracing to debug your application or compare evaluations to hone in on how you want your application to behave. You can also integrate with safety & security systems so you can be confident when you take your application to production.
 
 If you're an admin, or leading a development team, and need to manage the team's resources, project access, quota, and more, you can do that in the Management Center.

@@ -110,13 +109,10 @@ Azure AI Foundry is available in most regions where Azure AI services are availa
 
 You can [explore Azure AI Foundry portal (including the model catalog)](./how-to/model-catalog-overview.md) without signing in.
 
-But for full functionality there are some requirements:
-
-You need an [Azure account](https://azure.microsoft.com/pricing/purchase-options/azure-account).
+But for full functionality, you need an [Azure account](https://azure.microsoft.com/pricing/purchase-options/azure-account).
 
 ## Related content
 
-- [Quickstart: Use the chat playground in Azure AI Foundry portal](quickstarts/get-started-playground.md)
-- [Build a custom chat app in Python using the Azure AI SDK](quickstarts/get-started-code.md)
+- [Quickstart: Get started with Azure AI Foundry](quickstarts/get-started-code.md)
 - [Create a project](./how-to/create-projects.md)
 - [Get started with an AI template](how-to/develop/ai-template-get-started.md)

articles/ai-services/.openpublishing.redirection.ai-services.json

Lines changed: 16 additions & 0 deletions

@@ -1149,6 +1149,22 @@
       "source_path_from_root": "/articles/ai-services/speech-service/how-to-async-meeting-transcription.md",
       "redirect_url": "/azure/ai-services/speech-service/multi-device-conversation",
       "redirect_document_id": false
+    },
+    {
+      "source_path_from_root": "/articles/ai-services/content-understanding/concepts/capabilities.md",
+      "redirect_url": "/azure/ai-services/content-understanding/concepts/analyzer-templates",
+      "redirect_document_id": true
+    },
+    {
+      "source_path_from_root": "/articles/ai-services/content-understanding/concepts/retrieval-augmented-generation.md",
+      "redirect_url": "/azure/ai-services/content-understanding/tutorial/build-rag-solution",
+      "redirect_document_id": true
+    },
+    {
+      "source_path_from_root": "/articles/ai-services/content-understanding/concepts/accuracy-confidence.md",
+      "redirect_url": "/azure/ai-services/content-understanding/concepts/best-practices",
+      "redirect_document_id": true
     }
+
   ]
 }
