articles/ai-studio/how-to/develop/trace-local-sdk.md
7 additions & 11 deletions
@@ -37,7 +37,7 @@ Tracing with the Azure AI SDK offers enhanced visibility and simplified troubles
- Supported Environments: LTS versions of Node.js
-# [C#](#tab/python)
+# [C#](#tab/csharp)
- To construct the client library, you need to pass in the endpoint URL. The endpoint URL has the form `https://your-host-name.your-azure-region.inference.ai.azure.com`, where `your-host-name` is your unique model deployment host name and `your-azure-region` is the Azure region where the model is deployed (for example, eastus2); see the sketch after this list.
- Depending on your model deployment and authentication preference, you either need a key to authenticate against the service, or Microsoft Entra ID credentials. The key is a 32-character string.
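The construction step itself isn't shown in this hunk. A minimal C# sketch, assuming the beta `Azure.AI.Inference` package's `ChatCompletionsClient` is the client being constructed and that the endpoint and key live in hypothetical `AZURE_INFERENCE_ENDPOINT` / `AZURE_INFERENCE_KEY` environment variables:

```csharp
using System;
using Azure;
using Azure.AI.Inference;

// Endpoint format: https://your-host-name.your-azure-region.inference.ai.azure.com
var endpoint = new Uri(Environment.GetEnvironmentVariable("AZURE_INFERENCE_ENDPOINT")!);

// Key-based authentication; swap in a TokenCredential (for example,
// Azure.Identity's DefaultAzureCredential) to use Microsoft Entra ID instead.
var key = Environment.GetEnvironmentVariable("AZURE_INFERENCE_KEY")!;
var client = new ChatCompletionsClient(endpoint, new AzureKeyCredential(key));
```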
@@ -74,7 +74,7 @@ Install the package `@azure-rest/ai-inference` and Azure ModelClient REST client
```
npm install @azure-rest/ai-inference
```
-# [C#](#tab/python)
+# [C#](#tab/csharp)
Install the Azure AI inference client library for .NET with [NuGet](https://aka.ms/azsdk/azure-ai-inference/csharp/package):
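The install command itself isn't part of this hunk; given the package name referenced in the C# sample link later in this diff (`Azure.AI.Inference`, currently in beta), the usual NuGet CLI invocation would be along these lines:

```
dotnet add package Azure.AI.Inference --prerelease
```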
Distributed tracing and metrics with OpenTelemetry are supported in Azure AI Inference in experimental mode and can be enabled through either of these steps:
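The two steps themselves fall outside this hunk. For the .NET client, the Azure SDK's experimental OpenTelemetry support is generally gated behind an `AppContext` switch or a matching environment variable; a sketch of the in-code option (treat the exact switch and variable names as assumptions to verify against the article):

```csharp
using System;

// Opt in to the Azure SDK's experimental ActivitySource (OpenTelemetry) support.
// Do this once, before any Azure.AI.Inference client is created.
AppContext.SetSwitch("Azure.Experimental.EnableActivitySource", true);

// Alternatively, set the environment variable
// AZURE_EXPERIMENTAL_ENABLE_ACTIVITY_SOURCE=true before starting the process.
```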
To trace your own functions, use the OpenTelemetry API to start and end spans around the code you want to trace. Here's an example:
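The example referenced here doesn't appear in the hunk. A self-contained C# sketch of the pattern, using `System.Diagnostics.ActivitySource` (the API OpenTelemetry .NET builds spans on); the source name and tag are illustrative:

```csharp
using System.Diagnostics;

internal static class MyTracing
{
    // Register this source name with your TracerProvider so its spans are exported.
    private static readonly ActivitySource Source = new("MyApplication.Tracing");

    public static void DoWork()
    {
        // StartActivity returns null when nothing is listening to this source.
        using Activity? activity = Source.StartActivity("DoWork");
        activity?.SetTag("app.work.items", 3);

        // ... the code you want to trace runs inside the span ...
    }   // the span ends when the activity is disposed
}
```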
@@ -343,8 +342,6 @@ To learn more, see [OpenTelemetry .NET](https://opentelemetry.io/docs/languages/
To attach user feedback to traces and visualize them in AI Studio using OpenTelemetry's semantic conventions, you can instrument your application to enable tracing and log user feedback. By correlating feedback traces with their respective chat request traces using the response ID, you can view and manage these traces in AI Studio. OpenTelemetry's specification allows for standardized and enriched trace data, which can be analyzed in AI Studio for performance optimization and user experience insights. This approach helps you use the full power of OpenTelemetry for enhanced observability in your applications.
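The concrete instrumentation isn't in this hunk. Conceptually, you record a span (or event) for the feedback and stamp it with the response ID returned by the chat completion, so the two traces can be joined in AI Studio. A hedged C# sketch; the attribute keys are illustrative stand-ins for whatever the article and the OpenTelemetry gen_ai semantic conventions actually prescribe:

```csharp
using System.Diagnostics;

internal static class FeedbackTracing
{
    private static readonly ActivitySource Source = new("MyApplication.Feedback");

    // responseId: the ID returned by the chat completions call.
    // rating: whatever your UI collected (for example, +1 / -1 for thumbs up/down).
    public static void RecordUserFeedback(string responseId, int rating)
    {
        using Activity? activity = Source.StartActivity("user_feedback");
        // Illustrative attribute names; substitute the exact keys from the article.
        activity?.SetTag("gen_ai.response.id", responseId);
        activity?.SetTag("gen_ai.user_feedback.rating", rating);
    }
}
```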
## Related content
# [Python](#tab/python)
@@ -357,12 +354,11 @@ To attach user feedback to traces and visualize them in AI Studio using OpenTele
- [JavaScript samples](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/ai/ai-inference-rest/samples/v1-beta/typescript/src/telemetry.ts) containing fully runnable JavaScript code for tracing using synchronous and asynchronous clients.
- [JavaScript samples to use Azure AI Project with tracing](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/ai/ai-inference-rest/samples/v1-beta/typescript/src/telemetryWithToolCall.ts)
-# [C#](#tab/python)
+# [C#](#tab/csharp)
[C# Samples](https://github.com/Azure/azure-sdk-for-net/blob/Azure.AI.Inference_1.0.0-beta.2/sdk/ai/Azure.AI.Inference/samples/Sample8_ChatCompletionsWithOpenTelemetry.md) containing fully runnable C# code for doing inference using synchronous and asynchronous methods.
---
- [Get started building a chat app using the prompt flow SDK](../../quickstarts/get-started-code.md)