Commit 11d129c

Merge pull request #3833 from MicrosoftDocs/main
3/31/2025 PM Publish
2 parents 85f4b4f + d782cc5 commit 11d129c

45 files changed, +1779 −1272 lines changed

articles/ai-foundry/concepts/models-featured.md

Lines changed: 33 additions & 33 deletions
Large diffs are not rendered by default.

articles/ai-foundry/concepts/trace.md

Lines changed: 5 additions & 5 deletions
@@ -1,19 +1,19 @@
 ---
-title: Tracing in Azure AI Inference SDK
+title: Trace your application with Azure AI Foundry project library
 titleSuffix: Azure AI Foundry
-description: This article provides an overview of tracing with the Azure AI Inference SDK.
+description: This article provides an overview of tracing with the Azure AI Foundry project library.
 manager: scottpolly
 ms.service: azure-ai-foundry
 ms.custom:
 - ignite-2024
 ms.topic: conceptual
-ms.date: 11/19/2024
+ms.date: 03/12/2025
 ms.reviewer: truptiparkar
 ms.author: lagayhar
 author: lgayhardt
 ---

-# Tracing in Azure AI Inference SDK overview
+# Trace your application with Azure AI Foundry project library overview

 [!INCLUDE [feature-preview](../includes/feature-preview.md)]

@@ -75,5 +75,5 @@ Azure AI's tracing capabilities are designed to empower developers with the tool

 ## Related content

-- [Trace your application with Azure AI Inference SDK](../how-to/develop/trace-local-sdk.md)
+- [Trace your application with Azure AI Foundry project library](../how-to/develop/trace-local-sdk.md)
 - [Visualize your traces](../how-to/develop/visualize-traces.md)

articles/ai-foundry/how-to/develop/trace-local-sdk.md

Lines changed: 102 additions & 19 deletions
@@ -1,35 +1,67 @@
 ---
 title: How to trace your application with Azure AI Inference SDK
 titleSuffix: Azure AI Foundry
-description: This article provides instructions on how to trace your application with Azure AI Inference SDK.
+description: This article provides instructions on how to trace your application with Azure AI Inference SDK.
 manager: scottpolly
 ms.service: azure-ai-foundry
 ms.custom:
 - build-2024
 - ignite-2024
 ms.topic: how-to
-ms.date: 11/19/2024
+ms.date: 03/12/2025
 ms.reviewer: truptiparkar
 ms.author: lagayhar
 author: lgayhardt
 ---

-# How to trace your application with Azure AI Inference SDK
+# How to trace your application with Azure AI Foundry project library

 [!INCLUDE [feature-preview](../../includes/feature-preview.md)]

-In this article you'll learn how to trace your application with Azure AI Inference SDK with your choice between using Python, JavaScript, or C#. The Azure AI Inference client library provides support for tracing with OpenTelemetry.
+In this article, you'll learn how to trace your application with the Azure AI Foundry SDK using your choice of Python, JavaScript, or C#. The library provides support for tracing with OpenTelemetry.

-## Enable trace in your application
-
-### Prerequisites
+## Prerequisites

 - An [Azure Subscription](https://azure.microsoft.com/).
 - An Azure AI project, see [Create a project in Azure AI Foundry portal](../create-projects.md).
 - An AI model supporting the [Azure AI model inference API](https://aka.ms/azureai/modelinference) deployed through Azure AI Foundry.
 - If using Python, you need Python 3.8 or later installed, including pip.
 - If using JavaScript, the supported environments are LTS versions of Node.js.

+## Tracing using Azure AI Foundry project library
+
+# [Python](#tab/python)
+
+The best way to get started using the Azure AI Foundry SDK is by using a project. AI projects connect together the different data, assets, and services you need to build AI applications. The AI project client allows you to easily access these project components from your code by using a single connection string. First, follow the steps to [create an AI Project](../create-projects.md) if you don't have one already.
+To enable tracing, first ensure your project has an attached Application Insights resource. Go to the **Tracing** page of your project in the Azure AI Foundry portal and follow the instructions to create or attach Application Insights. Once enabled, you can get the Application Insights connection string and observe the full execution path through Azure Monitor.
+
+Make sure to install the following packages:
+
+```bash
+pip install opentelemetry-sdk
+pip install azure-core-tracing-opentelemetry
+pip install azure-monitor-opentelemetry
+```
+
+Refer to the following samples to get started with tracing using the Azure AI Project SDK:
+
+- [Python Sample with console tracing for Azure AI Inference Client](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_console_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
+- [Python Sample with Azure Monitor for Azure AI Inference Client](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-inference/samples/sample_chat_completions_with_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
+- [Python Sample with console tracing for Azure OpenAI](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client_and_console_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
+- [Python Sample with Azure Monitor for Azure OpenAI](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client_and_azure_monitor_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
+
+# [JavaScript](#tab/javascript)
+
+Tracing isn't yet integrated into the Azure AI Projects SDK for JS. For instructions on how to instrument and log traces from the Azure AI Inference package, see [JavaScript samples](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/ai/ai-inference-rest/samples/v1-beta/typescript/src).
+
+# [C#](#tab/csharp)
+
+Tracing isn't yet integrated into the Azure AI Projects SDK for C#. For instructions on how to instrument and log traces from the Azure AI Inference package, see [azure-sdk-for-dotnet](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/ai/Azure.AI.Inference/samples/Sample8_ChatCompletionsWithOpenTelemetry.md).
+
+---
+
+## Enable Tracing using Azure AI Inference Library
+
 ### Installation

 # [Python](#tab/python)
@@ -43,7 +75,7 @@ Install the package `azure-ai-inference` using your package manager, like pip:
 Install the Azure Core OpenTelemetry Tracing plugin, OpenTelemetry, and the OTLP exporter for sending telemetry to your observability backend. To install the necessary packages for Python, use the following pip commands:

 ```bash
-pip install opentelemetry
+pip install opentelemetry-sdk

 pip install opentelemetry-exporter-otlp
 ```
@@ -72,7 +104,7 @@ To learn more Azure AI Inference SDK for C# and observability, see the [Tracing

 ---

-To learn more , see the [Inference SDK reference](../../../ai-foundry/model-inference/reference/reference-model-inference-api.md).
+To learn more, see the [Inference SDK reference](../../../ai-foundry/model-inference/reference/reference-model-inference-api.md).

 ### Configuration

@@ -81,14 +113,6 @@ To learn more , see the [Inference SDK reference](../../../ai-foundry/model-infe
 You need to add the following configuration settings as per your use case:

 - To capture prompt and completion contents, set the `AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED` environment variable to true (case insensitive). By default, prompts, completions, function names, parameters, or outputs aren't recorded.
-- To enable Azure SDK tracing, set the `AZURE_SDK_TRACING_IMPLEMENTATION` environment variable to opentelemetry. Alternatively, you can configure it in the code with the following snippet:
-
-```python
-from azure.core.settings import settings
-
-settings.tracing_implementation = "opentelemetry"
-```
-
 To learn more, see [Azure Core Tracing OpenTelemetry client library for Python](/python/api/overview/azure/core-tracing-opentelemetry-readme).

 # [JavaScript](#tab/javascript)
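The content-recording switch described above can be illustrated with a small stdlib sketch. The helper below mirrors the documented contract (recording is on only when the variable is a case-insensitive `true`); it's an illustration, not the SDK's actual gating code.

```python
import os

# Illustration of the documented contract for
# AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED: recording is enabled
# only when the variable equals "true", compared case-insensitively.
# (The real gating happens inside the Azure AI Inference tracing code.)
def content_recording_enabled() -> bool:
    value = os.environ.get("AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED", "")
    return value.strip().lower() == "true"

os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "TRUE"
print(content_recording_enabled())  # → True (case insensitive)
```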
@@ -131,7 +155,7 @@ AIInferenceInstrumentor().instrument()

 ```

-It's also possible to uninstrument the Azure AI Inferencing API by using the uninstrument call. After this call, the traces will no longer be emitted by the Azure AI Inferencing API until instrument is called again:
+It's also possible to uninstrument the Azure AI Inference API by using the uninstrument call. After this call, traces will no longer be emitted by the Azure AI Inference API until instrument is called again:

 ```python
 AIInferenceInstrumentor().uninstrument()
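The instrument/uninstrument lifecycle follows a common pattern: `instrument()` swaps a trace-emitting wrapper in place of the original callable, and `uninstrument()` restores it, after which nothing more is recorded. The toy instrumentor below is hypothetical (stdlib only, not `AIInferenceInstrumentor`) and shows only the shape of that contract:

```python
# Hypothetical instrumentor sketching the instrument/uninstrument
# contract; the real AIInferenceInstrumentor patches the Azure AI
# Inference client internals instead of a module-level function.
import functools

emitted_traces = []

def complete(prompt: str) -> str:
    return f"echo: {prompt}"

class ToyInstrumentor:
    def __init__(self):
        self._original = None

    def instrument(self):
        original = globals()["complete"]
        self._original = original

        @functools.wraps(original)
        def traced(prompt):
            emitted_traces.append({"name": "complete", "prompt": prompt})
            return original(prompt)

        globals()["complete"] = traced

    def uninstrument(self):
        globals()["complete"] = self._original

instrumentor = ToyInstrumentor()
instrumentor.instrument()
complete("first")       # recorded
instrumentor.uninstrument()
complete("second")      # not recorded: only "first" was traced
```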
@@ -175,7 +199,7 @@ client.path("/chat/completions").post({

 # [C#](#tab/csharp)

-To configure OpenTelemetry and enable Azure AI Inference tracing follow these steps:
+To configure OpenTelemetry and enable Azure AI Inference tracing, follow these steps:

 1. **Install OpenTelemetry Packages**: Install the following dependencies for HTTP tracing and metrics instrumentation as well as console and [OTLP](https://opentelemetry.io/docs/specs/otel/protocol/) exporters:

@@ -207,12 +231,71 @@ To configure OpenTelemetry and enable Azure AI Inference tracing follow these st

 To trace your own custom functions, you can leverage OpenTelemetry; you'll need to instrument your code with the OpenTelemetry SDK. This involves setting up a tracer provider and creating spans around the code you want to trace. Each span represents a unit of work and can be nested to form a trace tree. You can add attributes to spans to enrich the trace data with additional context. Once instrumented, configure an exporter to send the trace data to a backend for analysis and visualization. For detailed instructions and advanced usage, refer to the [OpenTelemetry documentation](https://opentelemetry.io/docs/). This will help you monitor the performance of your custom functions and gain insights into their execution.

+### Using service name in trace data
+
+To identify your service via a unique ID in Application Insights, you can use the service name OpenTelemetry property in your trace data. This is particularly useful if you're logging data from multiple applications to the same Application Insights resource, and you want to differentiate between them. For example, let's say you have two applications: **App-1** and **App-2**, with tracing configured to log data to the same Application Insights resource. Perhaps you'd like to set up **App-1** to be evaluated continuously by **Relevance** and **App-2** to be evaluated continuously by **Groundedness**. You can use the service name to differentiate between the applications in your Online Evaluation configurations.
+
+To set up the service name property, you can do so directly in your application code; for the steps, see [Using multiple tracer providers with different Resource](https://opentelemetry.io/docs/languages/python/cookbook/#using-multiple-tracer-providers-with-different-resource). Alternatively, you can set the environment variable `OTEL_SERVICE_NAME` prior to deploying your app. To learn more about working with the service name, see [OTEL Environment Variables](https://opentelemetry.io/docs/specs/otel/configuration/sdk-environment-variables/#general-sdk-configuration) and [Service Resource Semantic Conventions](https://opentelemetry.io/docs/specs/semconv/resource/#service).
+
+To query trace data for a given service name, query for the `cloud_RoleName` property. If you're leveraging Online Evaluation, add the following line to the KQL query you use within your Online Evaluation set-up:
+
+```sql
+| where cloud_RoleName == "service_name"
+```
+
+## Enable Tracing for Langchain
+
+# [Python](#tab/python)
+
+You can enable tracing for Langchain that follows OpenTelemetry standards as per [opentelemetry-instrumentation-langchain](https://pypi.org/project/opentelemetry-instrumentation-langchain/). To enable tracing for Langchain, follow these steps:
+
+Install the package `opentelemetry-instrumentation-langchain` using your package manager, like pip:
+
+```bash
+pip install opentelemetry-instrumentation-langchain
+```
+
+Once the necessary packages are installed, you can easily enable tracing via [Tracing using Azure AI Foundry project library](#tracing-using-azure-ai-foundry-project-library).
+
+# [JavaScript](#tab/javascript)
+Currently, this is supported in Python only.
+
+# [C#](#tab/csharp)
+Currently, this is supported in Python only.
+
+---
+
 ## Attach User feedback to traces

 To attach user feedback to traces and visualize them in Azure AI Foundry portal using OpenTelemetry's semantic conventions, you can instrument your application enabling tracing and logging user feedback. By correlating feedback traces with their respective chat request traces using the response ID, you can view and manage these traces in Azure AI Foundry portal. OpenTelemetry's specification allows for standardized and enriched trace data, which can be analyzed in Azure AI Foundry portal for performance optimization and user experience insights. This approach helps you use the full power of OpenTelemetry for enhanced observability in your applications.

+To log user feedback, follow this format:
+The user feedback evaluation event can be captured if and only if the user provided a reaction to the GenAI model response.
+It SHOULD, when possible, be parented to the GenAI span describing such a response.
+
+<!-- prettier-ignore-start -->
+<!-- markdownlint-capture -->
+<!-- markdownlint-disable -->
+The event name MUST be `gen_ai.evaluation.user_feedback`.
+
+| Attribute | Type | Description | Examples | [Requirement Level](https://opentelemetry.io/docs/specs/semconv/general/attribute-requirement-level/) | Stability |
+|---|---|---|---|---|---|
+| `gen_ai.response.id` | string | The unique identifier for the completion. | `chatcmpl-123` | `Required` | ![Experimental](https://img.shields.io/badge/-experimental-blue) |
+| `gen_ai.evaluation.score` | double | Quantified score calculated based on the user reaction in the [-1.0, 1.0] range, with 0 representing a neutral reaction. | `0.42` | `Recommended` | ![Experimental](https://img.shields.io/badge/-experimental-blue) |
+
+<!-- markdownlint-restore -->
+<!-- prettier-ignore-end -->
+<!-- END AUTOGENERATED TEXT -->
+The user feedback event body has the following structure:
+
+| Body Field | Type | Description | Examples | Requirement Level |
+|---|---|---|---|---|
+| `comment` | string | Additional details about the user feedback | `"I did not like it"` | `Opt-in` |
+
 ## Related content

 - [Python samples](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-inference/samples/sample_chat_completions_with_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
+- [Sample Agents with Console tracing](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/agents/sample_agents_functions_with_console_tracing.py)
+- [Sample Agents with Azure Monitor](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/agents/sample_agents_basics_with_azure_monitor_tracing.py)
 - [JavaScript samples](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/ai/ai-inference-rest/samples/v1-beta/typescript/src) containing fully runnable JavaScript code for tracing using synchronous and asynchronous clients.
 - [C# Samples](https://github.com/Azure/azure-sdk-for-net/blob/Azure.AI.Inference_1.0.0-beta.2/sdk/ai/Azure.AI.Inference/samples/Sample8_ChatCompletionsWithOpenTelemetry.md) containing fully runnable C# code for doing inference using synchronous and asynchronous methods.

articles/ai-services/cognitive-services-container-support.md

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@ author: aahill
 manager: nitinme
 ms.service: azure-ai-services
 ms.topic: overview
-ms.date: 09/25/2024
+ms.date: 03/31/2025
 ms.author: aahi
 keywords: on-premises, Docker, container, Kubernetes
 #Customer intent: As a potential customer, I want to know more about how Azure AI services provides and supports Docker containers for each service.

articles/ai-services/custom-vision-service/limits-and-quotas.md

Lines changed: 2 additions & 1 deletion
@@ -34,7 +34,8 @@ There are two tiers of subscription to the Custom Vision service. You can sign u
 |Max image height/width in pixels|10,240|10,240|
 |Max image size (training image upload) |6 MB|6 MB|
 |Max image size (prediction)|4 MB|4 MB|
-|Max number of regions per image (object detection)|300|300|
+|Max number of regions per image (training) (object detection)|300|300|
+|Max number of regions per image (prediction) (object detection)|200|200|
 |Max number of tags per image (classification)|100|100|

 > [!NOTE]

articles/ai-services/openai/assistants-reference-messages.md

Lines changed: 1 addition & 3 deletions
@@ -5,7 +5,7 @@ description: Learn how to use Azure OpenAI's Python & REST API messages with Ass
 manager: nitinme
 ms.service: azure-ai-openai
 ms.topic: reference
-ms.date: 02/27/2025
+ms.date: 03/31/2025
 author: aahill
 ms.author: aahi
 recommendations: false
@@ -14,8 +14,6 @@ ms.custom: devx-track-python

 # Assistants API (Preview) messages reference

-[!INCLUDE [Assistants v2 note](includes/assistants-v2-note.md)]
-
 This article provides reference documentation for Python and REST for the new Assistants API (Preview). More in-depth step-by-step guidance is provided in the [getting started guide](./how-to/assistant.md).

 ## Create message

articles/ai-services/openai/assistants-reference-runs.md

Lines changed: 1 addition & 3 deletions
@@ -5,7 +5,7 @@ description: Learn how to use Azure OpenAI's Python & REST API runs with Assista
 manager: nitinme
 ms.service: azure-ai-openai
 ms.topic: reference
-ms.date: 09/17/2024
+ms.date: 03/31/2025
 author: aahill
 ms.author: aahi
 recommendations: false
@@ -14,8 +14,6 @@ ms.custom: devx-track-python

 # Assistants API (Preview) runs reference

-[!INCLUDE [Assistants v2 note](includes/assistants-v2-note.md)]
-
 This article provides reference documentation for Python and REST for the new Assistants API (Preview). More in-depth step-by-step guidance is provided in the [getting started guide](./how-to/assistant.md).

 ## Create run

articles/ai-services/openai/assistants-reference-threads.md

Lines changed: 1 addition & 3 deletions
@@ -5,7 +5,7 @@ description: Learn how to use Azure OpenAI's Python & REST API threads with Assi
 manager: nitinme
 ms.service: azure-ai-openai
 ms.topic: reference
-ms.date: 02/27/2025
+ms.date: 03/31/2025
 author: aahill
 ms.author: aahi
 recommendations: false
@@ -14,8 +14,6 @@ ms.custom: devx-track-python

 # Assistants API (Preview) threads reference

-[!INCLUDE [Assistants v2 note](includes/assistants-v2-note.md)]
-
 This article provides reference documentation for Python and REST for the new Assistants API (Preview). More in-depth step-by-step guidance is provided in the [getting started guide](./how-to/assistant.md).

 ## Create a thread

articles/ai-services/openai/assistants-reference.md

Lines changed: 0 additions & 3 deletions
@@ -14,9 +14,6 @@ ms.custom: devx-track-python

 # Assistants API (Preview) reference

-
-[!INCLUDE [Assistants v2 note](includes/assistants-v2-note.md)]
-
 This article provides reference documentation for Python and REST for the new Assistants API (Preview). More in-depth step-by-step guidance is provided in the [getting started guide](./how-to/assistant.md).

 ## Create an assistant
