In this article, you'll learn how to trace your application with the Azure AI Foundry SDK using your choice of Python, JavaScript, or C#. The SDK provides support for tracing with OpenTelemetry.

## Prerequisites
- An [Azure Subscription](https://azure.microsoft.com/).
- An Azure AI project. To create one, see [Create a project in Azure AI Foundry portal](../create-projects.md).
- An AI model supporting the [Azure AI model inference API](https://aka.ms/azureai/modelinference) deployed through Azure AI Foundry.
- If using Python, you need Python 3.8 or later installed, including pip.
- If using JavaScript, the supported environments are LTS versions of Node.js.
## Tracing using Azure AI Foundry Project Library
# [Python](#tab/python)
The best way to get started using the Azure AI Foundry SDK is by using a project. AI projects connect together the different data, assets, and services you need to build AI applications. The AI project client allows you to easily access these project components from your code by using a single connection string. First, follow the steps to [create an AI Project](../create-projects.md) if you don't have one already.

To enable tracing, first ensure your project has an attached Application Insights resource. Go to the **Tracing** page of your project in the Azure AI Foundry portal and follow the instructions to create or attach Application Insights. Once enabled, you can get the Application Insights connection string and observe the full execution path through Azure Monitor.
Make sure to install the following packages:

```bash
pip install opentelemetry-sdk
pip install azure-core-tracing-opentelemetry
pip install azure-monitor-opentelemetry
```
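With the packages above installed and the connection string from your project's Application Insights resource in hand, tracing to Azure Monitor can be wired up as follows (a minimal sketch; the connection string value is a placeholder you replace with your own):

```python
# Minimal sketch (assumes azure-monitor-opentelemetry is installed):
# route OpenTelemetry traces from this app to the Application Insights
# resource attached to your Foundry project.
from azure.monitor.opentelemetry import configure_azure_monitor

# Placeholder: paste the connection string from your project's Tracing page.
configure_azure_monitor(
    connection_string="InstrumentationKey=<key>;IngestionEndpoint=<endpoint>"
)
```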
Refer to the following samples to get started with tracing using the Azure AI Projects SDK:

- [Python sample with console tracing for the Azure AI Inference client](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_console_tracing.py), containing fully runnable Python code for tracing using synchronous and asynchronous clients.
- [Python sample with Azure Monitor for the Azure AI Inference client](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-inference/samples/sample_chat_completions_with_tracing.py), containing fully runnable Python code for tracing using synchronous and asynchronous clients.
- [Python sample with console tracing for Azure OpenAI](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client_and_console_tracing.py), containing fully runnable Python code for tracing using synchronous and asynchronous clients.
- [Python sample with Azure Monitor for Azure OpenAI](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client_and_azure_monitor_tracing.py), containing fully runnable Python code for tracing using synchronous and asynchronous clients.
# [JavaScript](#tab/javascript)
Tracing isn't yet integrated into the Azure AI Projects SDK for JavaScript. For instructions on how to instrument and log traces from the Azure AI Inference package, see [JavaScript samples](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/ai/ai-inference-rest/samples/v1-beta/typescript/src).
# [C#](#tab/csharp)
Tracing isn't yet integrated into the Azure AI Projects SDK for C#. For instructions on how to instrument and log traces from the Azure AI Inference package, see [azure-sdk-for-dotnet](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/ai/Azure.AI.Inference/samples/Sample8_ChatCompletionsWithOpenTelemetry.md).

---
## Enable Tracing using Azure AI Inference Library
### Installation
# [Python](#tab/python)
Install the package `azure-ai-inference` using your package manager, like pip:
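For example:

```shell
pip install azure-ai-inference
```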
---

To learn more, see the [Inference SDK reference](../../../ai-foundry/model-inference/reference/reference-model-inference-api.md).
To configure OpenTelemetry and enable Azure AI Inference tracing, follow these steps:

1. **Install OpenTelemetry Packages**: Install the following dependencies for HTTP tracing and metrics instrumentation as well as console and [OTLP](https://opentelemetry.io/docs/specs/otel/protocol/) exporters:
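This excerpt elides the exact package list; a plausible set, assuming the standard OpenTelemetry Python packages for HTTP instrumentation plus OTLP export (the console exporter ships with the SDK), would be:

```shell
# Assumed package set -- verify against the full article.
pip install opentelemetry-sdk
pip install opentelemetry-instrumentation-requests
pip install opentelemetry-exporter-otlp
```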
To identify your service via a unique ID in Application Insights, you can use the service name OpenTelemetry property.

To set up the service name property, you can do so directly in your application code; see [Using multiple tracer providers with different Resource](https://opentelemetry.io/docs/languages/python/cookbook/#using-multiple-tracer-providers-with-different-resource). Alternatively, you can set the environment variable `OTEL_SERVICE_NAME` prior to deploying your app. To learn more about working with the service name, see [OTEL Environment Variables](https://opentelemetry.io/docs/specs/otel/configuration/sdk-environment-variables/#general-sdk-configuration) and [Service Resource Semantic Conventions](https://opentelemetry.io/docs/specs/semconv/resource/#service).

To query trace data for a given service name, query for the `cloud_RoleName` property. If you're leveraging Online Evaluation, add the following line to the KQL query you use within your Online Evaluation setup:
```sql
| where cloud_RoleName == "service_name"
```
## Enable Tracing for Langchain
# [Python](#tab/python)

You can enable tracing for Langchain that follows OpenTelemetry standards per [opentelemetry-instrumentation-langchain](https://pypi.org/project/opentelemetry-instrumentation-langchain/). To enable tracing for Langchain, follow these steps:
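The detailed steps are elided in this excerpt; the core call, based on that package's documented instrumentor pattern, looks like this (a sketch, assuming the package is installed):

```python
# Sketch: instrument Langchain with the OpenTelemetry instrumentation package.
# Assumes `pip install opentelemetry-instrumentation-langchain` has been run.
from opentelemetry.instrumentation.langchain import LangchainInstrumentor

LangchainInstrumentor().instrument()
# Langchain chain/LLM calls are now emitted as OpenTelemetry spans.
```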
# [JavaScript](#tab/javascript)

Currently this is supported in Python only.

# [C#](#tab/csharp)

Currently this is supported in Python only.

---
## Attach User feedback to traces
To attach user feedback to traces and visualize them in the Azure AI Foundry portal using OpenTelemetry's semantic conventions, you can instrument your application to enable tracing and log user feedback. By correlating feedback traces with their respective chat request traces using the response ID, you can view and manage these traces in the Azure AI Foundry portal. OpenTelemetry's specification allows for standardized and enriched trace data, which can be analyzed in the Azure AI Foundry portal for performance optimization and user experience insights. This approach helps you use the full power of OpenTelemetry for enhanced observability in your applications.