Commit b819c8e
Update to tracing concept and how to
1 parent db88148 commit b819c8e

2 files changed (+27, -35 lines)

articles/ai-foundry/concepts/trace.md

Lines changed: 1 addition & 1 deletion
@@ -7,7 +7,7 @@ ms.service: azure-ai-foundry
 ms.custom:
 - ignite-2024
 ms.topic: conceptual
-ms.date: 11/19/2024
+ms.date: 02/28/2025
 ms.reviewer: truptiparkar
 ms.author: lagayhar
 author: lgayhardt

articles/ai-foundry/how-to/develop/trace-local-sdk.md

Lines changed: 26 additions & 34 deletions
@@ -8,72 +8,63 @@ ms.custom:
 - build-2024
 - ignite-2024
 ms.topic: how-to
-ms.date: 11/19/2024
+ms.date: 02/28/2025
 ms.reviewer: truptiparkar
 ms.author: lagayhar
 author: lgayhardt
 ---
 
-# How to trace your application with Azure AI Foundry Project Library
+# How to trace your application with Azure AI Foundry Project Library
 
 [!INCLUDE [feature-preview](../../includes/feature-preview.md)]
 
-In this article you'll learn how to trace your application with Azure AI Foundry SDK with your choice between using Python, JavaScript, or C#. This provides support for tracing with OpenTelemetry.
+In this article, you'll learn how to trace your application with Azure AI Foundry SDK with your choice between using Python, JavaScript, or C#. This provides support for tracing with OpenTelemetry.
 
-### Prerequisites
+## Prerequisites
 
 - An [Azure Subscription](https://azure.microsoft.com/).
-- An Azure AI project, see [Create a project in Azure AI Foundry portal](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/create-projects?tabs=ai-studio).
+- An Azure AI project, see [Create a project in Azure AI Foundry portal](../create-projects.md).
 - An AI model supporting the [Azure AI model inference API](https://aka.ms/azureai/modelinference) deployed through Azure AI Foundry.
 - If using Python, you need Python 3.8 or later installed, including pip.
 - If using JavaScript, the supported environments are LTS versions of Node.js.
 
-## Tracing using Azure AI Foundry Project Library
+## Tracing using Azure AI Foundry Project Library
+
 # [Python](#tab/python)
-The best way to get started using the Azure AI Foundry SDK is by using a project. AI projects connect together different data, assets, and services you need to build AI applications. The AI project client allows you to easily access these project components from your code by using a single connection string. First follow steps to [create an AI Project](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/create-projects?tabs=ai-studio) if you don't have one already.
-To enable tracing, first ensure your project has an attached Application Insights resource. Go to the **Tracing** page of your project in Azure AI Foundry portal and follow instructions to create or attach Application Insights. If one was enabled, you can get the Application Insights connection string, and observe the full execution path through Azure Monitor.
+
+The best way to get started using the Azure AI Foundry SDK is by using a project. AI projects connect together different data, assets, and services you need to build AI applications. The AI project client allows you to easily access these project components from your code by using a single connection string. First follow steps to [create an AI Project](../create-projects.md) if you don't have one already.
+To enable tracing, first ensure your project has an attached Application Insights resource. Go to the **Tracing** page of your project in Azure AI Foundry portal and follow instructions to create or attach Application Insights. If one was enabled, you can get the Application Insights connection string, and observe the full execution path through Azure Monitor.
 
 Make sure to install following packages via
 
-```
+```bash
 pip install opentelemetry-sdk
 pip install azure-core-tracing-opentelemetry
 pip install azure-monitor-opentelemetry
 ```
 
 Refer the following samples to get started with tracing using Azure AI Project SDK:
+
 - [Python Sample with console tracing for Azure AI Inference Client](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_console_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
 - [Python Sample with Azure Monitor for Azure AI Inference Client](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-inference/samples/sample_chat_completions_with_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
-- [Python Sample with console tracing for Azure Open AI](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client_and_console_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
-- [Python Sample with Azure Monitor for Azure Open AI](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client_and_azure_monitor_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
+- [Python Sample with console tracing for Azure OpenAI](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client_and_console_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
+- [Python Sample with Azure Monitor for Azure OpenAI](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client_and_azure_monitor_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
 
 # [JavaScript](#tab/javascript)
 
-Tracing is not yet integrated into the Azure AI Projects SDK for JS. For instructions on how to instrument and log traces from the Azure AI Inference package, see [JavaScript samples](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/ai/ai-inference-rest/samples/v1-beta/typescript/src).
+Tracing isn't yet integrated into the Azure AI Projects SDK for JS. For instructions on how to instrument and log traces from the Azure AI Inference package, see [JavaScript samples](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/ai/ai-inference-rest/samples/v1-beta/typescript/src).
 
 # [C#](#tab/csharp)
 
-Tracing is not yet integrated into the Azure AI Projects SDK for C#. For instructions on how to instrument and log traces from the Azure AI Inference package, see [azure-sdk-for-dotnet](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/ai/Azure.AI.Inference/samples/Sample8_ChatCompletionsWithOpenTelemetry.md).
-
-----
-
-\
-Refer the following samples to get started with tracing using Azure AI Project Library for Azure OpenAI:
-
-
-### [JavaScript](#tab/javascript)
-Currently this is supported in Python only.
-
-### [C#](#tab/csharp)
-Currently this is supported in Python only.
+Tracing isn't yet integrated into the Azure AI Projects SDK for C#. For instructions on how to instrument and log traces from the Azure AI Inference package, see [azure-sdk-for-dotnet](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/ai/Azure.AI.Inference/samples/Sample8_ChatCompletionsWithOpenTelemetry.md).
 
-----
+---
 
 ## Enable Tracing using Azure AI Inference Library
 
 ### Installation
 
-### [Python](#tab/python)
+# [Python](#tab/python)
 
 Install the package `azure-ai-inference` using your package manager, like pip:
 

@@ -113,7 +104,7 @@ To learn more Azure AI Inference SDK for C# and observability, see the [Tracing
 
 ---
 
-To learn more , see the [Inference SDK reference](../../../ai-foundry/model-inference/reference/reference-model-inference-api.md).
+To learn more, see the [Inference SDK reference](../../../ai-foundry/model-inference/reference/reference-model-inference-api.md).
 
 ### Configuration
 

@@ -208,7 +199,7 @@ client.path("/chat/completions").post({
 
 # [C#](#tab/csharp)
 
-To configure OpenTelemetry and enable Azure AI Inference tracing follow these steps:
+To configure OpenTelemetry and enable Azure AI Inference tracing, follow these steps:
 
 1. **Install OpenTelemetry Packages**: Install the following dependencies for HTTP tracing and metrics instrumentation as well as console and [OTLP](https://opentelemetry.io/docs/specs/otel/protocol/) exporters:
 

@@ -246,7 +237,7 @@ To identify your service via a unique ID in Application Insights, you can use th
 
 To set up the service name property, you can do so directly in your application code by following the steps, see [Using multiple tracer providers with different Resource](https://opentelemetry.io/docs/languages/python/cookbook/#using-multiple-tracer-providers-with-different-resource). Alternatively, you can set the environment variable `OTEL_SERVICE_NAME` prior to deploying your app. To learn more about working with the service name, see [OTEL Environment Variables](https://opentelemetry.io/docs/specs/otel/configuration/sdk-environment-variables/#general-sdk-configuration) and [Service Resource Semantic Conventions](https://opentelemetry.io/docs/specs/semconv/resource/#service).
 
-To query trace data for a given service name, query for the `cloud_roleName` property. In case you are leveraging Online Evaluation, add the following line to the KQL query you use within your Online Evaluation set-up:
+To query trace data for a given service name, query for the `cloud_roleName` property. In case you're leveraging Online Evaluation, add the following line to the KQL query you use within your Online Evaluation set-up:
 
 ```sql
 | where cloud_RoleName == "service_name"
@@ -255,7 +246,8 @@ To query trace data for a given service name, query for the `cloud_roleName` pro
 ## Enable Tracing for Langchain
 
 # [Python](#tab/python)
-You can enable tracing for Langchain that follows Opentelemetry standards as per [opentelemetry-instrumentation-langchain](https://pypi.org/project/opentelemetry-instrumentation-langchain/) To enable tracing for Langchain, follow following steps:
+
+You can enable tracing for Langchain that follows OpenTelemetry standards as per [opentelemetry-instrumentation-langchain](https://pypi.org/project/opentelemetry-instrumentation-langchain/) To enable tracing for Langchain, follow following steps:
 
 Install the package `opentelemetry-instrumentation-langchain` using your package manager, like pip:
 

@@ -271,13 +263,13 @@ Currently this is supported in Python only.
 # [C#](#tab/csharp)
 Currently this is supported in Python only.
 
-----
+---
 
 ## Attach User feedback to traces
 
-
 To attach user feedback to traces and visualize them in Azure AI Foundry portal using OpenTelemetry's semantic conventions, you can instrument your application enabling tracing and logging user feedback. By correlating feedback traces with their respective chat request traces using the response ID, you can use view and manage these traces in Azure AI Foundry portal. OpenTelemetry's specification allows for standardized and enriched trace data, which can be analyzed in Azure AI Foundry portal for performance optimization and user experience insights. This approach helps you use the full power of OpenTelemetry for enhanced observability in your applications.
-Please follow the following format to log user feedback:
+
+To log user feedback, follow this format:
 The user feedback evaluation event can be captured if and only if user provided a reaction to GenAI model response.
 It SHOULD, when possible, be parented to the GenAI span describing such response.
 
