articles/ai-studio/how-to/develop/trace-local-sdk.md
+9 −5 (9 additions & 5 deletions)
@@ -68,6 +68,8 @@ Tracing is not yet integrated into the projects package. For instructions on how
 
 Tracing is not yet integrated into the projects package. For instructions on how to instrument and log traces from the Azure AI Inferencing package, see [azure-sdk-for-dotnet](https://github.com/Azure/azure-sdk-for-net/blob/main/sdk/ai/Azure.AI.Inference/samples/Sample8_ChatCompletionsWithOpenTelemetry.md).
 
+----
+
 ## Enable Tracing for Azure Open AI
 The Azure OpenAI Service provides access to OpenAI's models, including the GPT-4o, GPT-4o mini, GPT-4, GPT-4 Turbo with Vision, DALL-E 3, Whisper, and Embeddings model series, with the data residency, scalability, safety, security, and enterprise capabilities of Azure. To learn more about how to get started, see [Azure OpenAI Service](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/develop/sdk-overview?tabs=sync&pivots=programming-language-python#azure-openai-service).
 
@@ -80,7 +82,7 @@ Install the package `opentelemetry-instrumentation-openai-v2 2.0b0` using your p
-Once necessary packages are installed, you can easily enable tracing via Azure AI Foundry SDK (refer this: ## Tracing using Azure AI Foundry Project)
+Once the necessary packages are installed, you can enable tracing via [Tracing using Azure AI Foundry Project](#tracing-using-azure-ai-foundry-project).
 
 ### [JavaScript](#tab/javascript)
 
 Currently this is supported in Python only.
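
The hunks above only link to the enablement steps; as a minimal Python sketch, once `opentelemetry-instrumentation-openai-v2` is installed, tracing is typically turned on by calling the instrumentor and pointing the exporter at the project's Application Insights resource. The environment variable names, the deployment name, and the `AIProjectClient.telemetry.get_connection_string()` wiring below are assumptions for illustration, not text from this diff.

```python
# Hypothetical sketch: instrument the OpenAI client and route spans to the
# Application Insights resource attached to an Azure AI Foundry project.
import os

from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential
from azure.monitor.opentelemetry import configure_azure_monitor
from openai import AzureOpenAI
from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor

# Optionally capture prompt/completion content in spans (off by default).
os.environ["OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT"] = "true"

# Patch the OpenAI client so chat completions emit OpenTelemetry spans.
OpenAIInstrumentor().instrument()

# Assumed wiring: pull the Application Insights connection string from the project.
project = AIProjectClient.from_connection_string(
    conn_str=os.environ["PROJECT_CONNECTION_STRING"],  # placeholder env var
    credential=DefaultAzureCredential(),
)
configure_azure_monitor(connection_string=project.telemetry.get_connection_string())

# Calls made after instrumentation are traced automatically.
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
)
response = client.chat.completions.create(
    model="gpt-4o-mini",  # your deployment name
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```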
@@ -283,7 +285,7 @@ To query trace data for a given service name, query for the `cloud_roleName` pro
 
 ## Enable Tracing for Langchain
 
-### [Python](#tab/python)
+# [Python](#tab/python)
 
 You can enable tracing for Langchain that follows OpenTelemetry standards per [opentelemetry-instrumentation-langchain](https://pypi.org/project/opentelemetry-instrumentation-langchain/). To enable tracing for Langchain, follow these steps:
 
 Install the package `opentelemetry-instrumentation-langchain` using your package manager, like pip:
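
The pip command itself falls between the hunks shown here; as a rough sketch, once `opentelemetry-instrumentation-langchain` is installed, enabling tracing usually amounts to calling the instrumentor it exposes. The console exporter below is only a stand-in for whatever exporter your project actually configures.

```python
# Illustrative only: enable the Langchain instrumentation after installing
# opentelemetry-instrumentation-langchain. The exporter choice is an assumption.
from opentelemetry import trace
from opentelemetry.instrumentation.langchain import LangchainInstrumentor
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Give emitted spans somewhere to go; swap in your real exporter here.
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

# Patch Langchain so chains, LLM calls, and tools emit OpenTelemetry spans.
LangchainInstrumentor().instrument()

# Any chain invoked after this point is traced automatically.
```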
@@ -292,14 +294,16 @@ Install the package `opentelemetry-instrumentation-langchain` using your package
-Once necessary packages are installed, you can easily enable tracing via Azure AI Foundry SDK (refer this: ## ) [link text](#Tracing-using-Azure-AI-Foundry-Project)
+Once the necessary packages are installed, you can enable tracing via [Tracing using Azure AI Foundry Project](#tracing-using-azure-ai-foundry-project).
 
-### [JavaScript](#tab/javascript)
+# [JavaScript](#tab/javascript)
 
 Currently this is supported in Python only.
 
-### [C#](#tab/csharp)
+# [C#](#tab/csharp)
 
 Currently this is supported in Python only.
 
+----
+
 ## Attach User feedback to traces
 
 To attach user feedback to traces and visualize it in the Azure AI Foundry portal using OpenTelemetry's semantic conventions, you can instrument your application to enable tracing and log user feedback. By correlating feedback traces with their respective chat request traces using the response ID, you can view and manage these traces in the Azure AI Foundry portal. OpenTelemetry's specification allows for standardized and enriched trace data, which can be analyzed in the Azure AI Foundry portal for performance optimization and user experience insights. This approach helps you use the full power of OpenTelemetry for enhanced observability in your applications.
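
As a hedged illustration of the correlation pattern described above (not the exact convention the article documents), feedback can be recorded as its own span carrying the chat completion's response ID; apart from `gen_ai.response.id`, the span and attribute names below are placeholders.

```python
# Sketch of attaching user feedback to a trace by response ID. The names
# "user_feedback", "user_feedback.rating", and "user_feedback.comment" are
# illustrative placeholders, not confirmed semantic-convention attributes.
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

def record_user_feedback(response_id: str, rating: int, comment: str = "") -> None:
    """Emit a feedback span correlated to a chat request via its response ID."""
    with tracer.start_as_current_span("user_feedback") as span:
        span.set_attribute("gen_ai.response.id", response_id)  # correlation key
        span.set_attribute("user_feedback.rating", rating)     # e.g. 1 = thumbs up, 0 = thumbs down
        if comment:
            span.set_attribute("user_feedback.comment", comment)

# Usage: after a chat completion, pass its ID along with the user's reaction.
# record_user_feedback(response.id, rating=1, comment="Helpful answer")
```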