Commit 416e73a

Update trace-local-sdk.md
1 parent 1d9ae9a commit 416e73a

File tree

1 file changed (+99, −6 lines)


articles/ai-studio/how-to/develop/trace-local-sdk.md

Lines changed: 99 additions & 6 deletions
@@ -14,13 +14,11 @@ ms.author: lagayhar
author: lgayhardt
---

# How to trace your application with Azure AI Foundry SDK

[!INCLUDE [feature-preview](../../includes/feature-preview.md)]

In this article, you learn how to trace your application with the Azure AI Foundry SDK using your choice of Python, JavaScript, or C#. The SDK provides support for tracing with OpenTelemetry.

### Prerequisites

@@ -30,9 +28,73 @@ In this article you'll learn how to trace your application with Azure AI Inferen
- If using Python, you need Python 3.8 or later installed, including pip.
- If using JavaScript, the supported environments are LTS versions of Node.js.

## Tracing using Azure AI Foundry Project

# [Python](#tab/python)

The best way to get started with the Azure AI Foundry SDK is by using a project. AI projects connect the data, assets, and services you need to build AI applications. The AI project client lets you access these project components from your code by using a single connection string. First, follow the steps to [create an AI Project](../create-projects.md) if you don't have one already.

To enable tracing, first ensure your project has an attached Application Insights resource. Go to the **Tracing** page of your project and follow the instructions to create or attach Application Insights. Once attached, you can get the Application Insights connection string and observe the full execution path through Azure Monitor.

Make sure to install the following packages:

```
pip install opentelemetry-sdk
pip install azure-core-tracing-opentelemetry
pip install azure-monitor-opentelemetry
```

Use the following code to enable instrumentation of the Azure AI Foundry SDK and log traces to your AI project:
```python
import sys
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential
from azure.monitor.opentelemetry import configure_azure_monitor

# Create the AI project client from your project connection string
project = AIProjectClient.from_connection_string(
    conn_str="<your-project-connection-string>",
    credential=DefaultAzureCredential(),
)

# Enable instrumentation of AI packages (inference, agents, openai, langchain)
project.telemetry.enable()

# Log traces to the project's Application Insights resource
application_insights_connection_string = project.telemetry.get_connection_string()
if application_insights_connection_string:
    configure_azure_monitor(connection_string=application_insights_connection_string)
```
In addition, you might find it helpful to see the tracing logs in the console. You can achieve this with the following code:

```python
project.telemetry.enable(destination=sys.stdout)
```

# [JavaScript](#tab/javascript)

Tracing isn't yet integrated into the projects package. For instructions on how to instrument and log traces from the Azure AI Inference package, see [JavaScript samples](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/ai/ai-inference-rest/samples/v1-beta/typescript/src).

# [C#](#tab/csharp)

Tracing isn't yet integrated into the projects package. For instructions on how to instrument and log traces from the Azure AI Inference package, see [azure-sdk-for-dotnet](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/ai/Azure.AI.Inference/samples/Sample8_ChatCompletionsWithOpenTelemetry.md).

---
## Enable tracing for Azure OpenAI

The Azure OpenAI Service provides access to OpenAI's models, including the GPT-4o, GPT-4o mini, GPT-4, GPT-4 Turbo with Vision, DALL-E 3, Whisper, and Embeddings model series, with the data residency, scalability, safety, security, and enterprise capabilities of Azure. To learn more about getting started, see [Azure OpenAI Service](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/develop/sdk-overview?tabs=sync&pivots=programming-language-python#azure-openai-service).
### [Python](#tab/python)

Tracing in Azure OpenAI follows OpenTelemetry standards as per [opentelemetry-instrumentation-openai-v2](https://pypi.org/project/opentelemetry-instrumentation-openai-v2/). To enable tracing for Azure OpenAI, follow these steps:

Install the package `opentelemetry-instrumentation-openai-v2` using your package manager, like pip:

```bash
pip install opentelemetry-instrumentation-openai-v2
```

Once the necessary packages are installed, you can enable tracing via the Azure AI Foundry SDK, as described in [Tracing using Azure AI Foundry Project](#tracing-using-azure-ai-foundry-project).

### [JavaScript](#tab/javascript)

Currently this is supported in Python only.

### [C#](#tab/csharp)

Currently this is supported in Python only.

---
## Tracing using Azure AI Inference SDK

### Installation

### [Python](#tab/python)

Install the package `azure-ai-inference` using your package manager, like pip:
@@ -43,7 +105,7 @@ Install the package `azure-ai-inference` using your package manager, like pip:
Install the Azure Core OpenTelemetry Tracing plugin, OpenTelemetry, and the OTLP exporter for sending telemetry to your observability backend. To install the necessary packages for Python, use the following pip commands:

```bash
pip install opentelemetry-sdk

pip install opentelemetry-exporter-otlp
```
@@ -207,6 +269,37 @@ To configure OpenTelemetry and enable Azure AI Inference tracing follow these st
To trace your own custom functions, you can use OpenTelemetry; you'll need to instrument your code with the OpenTelemetry SDK. This involves setting up a tracer provider and creating spans around the code you want to trace. Each span represents a unit of work and can be nested to form a trace tree. You can add attributes to spans to enrich the trace data with additional context. Once instrumented, configure an exporter to send the trace data to a backend for analysis and visualization. For detailed instructions and advanced usage, see the [OpenTelemetry documentation](https://opentelemetry.io/docs/). This helps you monitor the performance of your custom functions and gain insight into their execution.

### Using service name in trace data

To identify your service via a unique ID in Application Insights, you can use the service name OpenTelemetry property in your trace data. This is particularly useful if you're logging data from multiple applications to the same Application Insights resource and you want to differentiate between them. For example, let's say you have two applications, **App-1** and **App-2**, with tracing configured to log data to the same Application Insights resource, and you'd like to set up **App-1** to be evaluated continuously by **Relevance** and **App-2** to be evaluated continuously by **Groundedness**. You can use the service name to differentiate between the applications in your Online Evaluation configurations.

To set up the service name property, you can do so directly in your application code by following the steps in [Using multiple tracer providers with different Resource](https://opentelemetry.io/docs/languages/python/cookbook/#using-multiple-tracer-providers-with-different-resource). Alternatively, you can set the environment variable `OTEL_SERVICE_NAME` prior to deploying your app. To learn more about working with the service name, see [OTEL Environment Variables](https://opentelemetry.io/docs/specs/otel/configuration/sdk-environment-variables/#general-sdk-configuration) and [Service Resource Semantic Conventions](https://opentelemetry.io/docs/specs/semconv/resource/#service).

To query trace data for a given service name, query the `cloud_RoleName` property. If you're using Online Evaluation, add the following line to the KQL query in your Online Evaluation setup:

```sql
| where cloud_RoleName == "service_name"
```

## Enable tracing for LangChain

### [Python](#tab/python)

You can enable tracing for LangChain following OpenTelemetry standards as per [opentelemetry-instrumentation-langchain](https://pypi.org/project/opentelemetry-instrumentation-langchain/). To enable tracing for LangChain, follow these steps:

Install the package `opentelemetry-instrumentation-langchain` using your package manager, like pip:

```bash
pip install opentelemetry-instrumentation-langchain
```

Once the necessary packages are installed, you can enable tracing via the Azure AI Foundry SDK, as described in [Tracing using Azure AI Foundry Project](#tracing-using-azure-ai-foundry-project).

### [JavaScript](#tab/javascript)

Currently this is supported in Python only.

### [C#](#tab/csharp)

Currently this is supported in Python only.

---
## Attach user feedback to traces

To attach user feedback to traces and visualize it in the Azure AI Foundry portal using OpenTelemetry's semantic conventions, you can instrument your application to enable tracing and log user feedback. By correlating feedback traces with their respective chat request traces using the response ID, you can view and manage these traces in the Azure AI Foundry portal. OpenTelemetry's specification allows for standardized and enriched trace data, which can be analyzed in the Azure AI Foundry portal for performance optimization and user experience insights. This approach helps you use the full power of OpenTelemetry for enhanced observability in your applications.
