articles/ai-studio/how-to/develop/trace-local-sdk.md
## Tracing using Azure AI Foundry Project
# [Python](#tab/python)
The best way to get started with the Azure AI Foundry SDK is by using a project. AI projects connect the different data, assets, and services you need to build AI applications, and the AI project client lets you access these components from your code with a single connection string. If you don't have a project yet, first follow the steps to [create an AI Project](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/create-projects?tabs=ai-studio).
To enable tracing, first ensure your project has an attached Application Insights resource. Go to the **Tracing** page of your project in the Azure AI Foundry portal and follow the instructions to create or attach Application Insights. Once enabled, you can get the Application Insights connection string and observe the full execution path through Azure Monitor.
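As a sketch, the setup might look like the following in Python. It assumes your project already has Application Insights attached and that the `azure-ai-projects`, `azure-identity`, and `azure-monitor-opentelemetry` packages are installed; the connection string placeholder is yours to fill in:

```python
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential
from azure.monitor.opentelemetry import configure_azure_monitor

# Connect to the project with its single connection string.
project_client = AIProjectClient.from_connection_string(
    conn_str="<your-project-connection-string>",
    credential=DefaultAzureCredential(),
)

# Retrieve the Application Insights connection string attached to the
# project and route traces to Azure Monitor.
app_insights_connection_string = project_client.telemetry.get_connection_string()
configure_azure_monitor(connection_string=app_insights_connection_string)
```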
Tracing is not yet integrated into the Azure AI Projects SDK for JS. For instructions on how to instrument and log traces from the Azure AI Inference package, see [JavaScript samples](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/ai/ai-inference-rest/samples/v1-beta/typescript/src).
# [C#](#tab/csharp)
Tracing is not yet integrated into the Azure AI Projects SDK for C#. For instructions on how to instrument and log traces from the Azure AI Inference package, see [azure-sdk-for-dotnet](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/ai/Azure.AI.Inference/samples/Sample8_ChatCompletionsWithOpenTelemetry.md).
---
It's also possible to uninstrument the Azure AI Inference API by using the uninstrument call. After this call, the traces will no longer be emitted by the Azure AI Inference API until instrument is called again:
```python
from azure.ai.inference.tracing import AIInferenceInstrumentor

AIInferenceInstrumentor().uninstrument()
```
## Attach user feedback to traces
To attach user feedback to traces and visualize it in the Azure AI Foundry portal using OpenTelemetry's semantic conventions, you can instrument your application to enable tracing and log user feedback. By correlating feedback traces with their respective chat request traces using the response ID, you can view and manage these traces in the Azure AI Foundry portal. OpenTelemetry's specification allows for standardized and enriched trace data, which can be analyzed in the Azure AI Foundry portal for performance optimization and user experience insights. This approach helps you use the full power of OpenTelemetry for enhanced observability in your applications.
Use the following format to log user feedback:
The user feedback evaluation event can be captured if and only if the user provided a reaction to the GenAI model response.
It SHOULD, when possible, be parented to the GenAI span describing such response.
The event name MUST be `gen_ai.evaluation.user_feedback`.

| Attribute | Type | Description | Examples | Requirement Level |
|---|---|---|---|---|
| [`gen_ai.response.id`](/docs/attributes-registry/gen-ai.md) | string | The unique identifier for the completion. | `chatcmpl-123` | `Required` |
| [`gen_ai.evaluation.score`](/docs/attributes-registry/gen-ai.md) | double | Quantified score calculated based on the user reaction, in the [-1.0, 1.0] range, with 0 representing a neutral reaction. | `0.42` | `Recommended` |
The user feedback event body has the following structure:
| Body Field | Type | Description | Examples | Requirement Level |
|---|---|---|---|---|
| `comment` | string | Additional details about the user feedback | `"I did not like it"` | `Opt-in` |