
Commit 0dcae70

fix tabs
1 parent 9826c0f


1 file changed: +7 -11 lines changed


articles/ai-studio/how-to/develop/trace-local-sdk.md

Lines changed: 7 additions & 11 deletions
@@ -37,7 +37,7 @@ Tracing with the Azure AI SDK offers enhanced visibility and simplified troubles
 
 - Supported Environments: LTS versions of Node.js
 
-# [C#](#tab/python)
+# [C#](#tab/csharp)
 
 - To construct the client library, you need to pass in the endpoint URL. The endpoint URL has the form `https://your-host-name.your-azure-region.inference.ai.azure.com`, where your-host-name is your unique model deployment host name and your-azure-region is the Azure region where the model is deployed (for example, eastus2).
 - Depending on your model deployment and authentication preference, you either need a key to authenticate against the service, or Microsoft Entra ID credentials. The key is a 32-character string.
@@ -74,7 +74,7 @@ Install the package `@azure-rest/ai-inference` and Azure ModelClient REST client
 npm install @azure-rest/ai-inference
 ```
 
-# [C#](#tab/python)
+# [C#](#tab/csharp)
 
 Install the Azure AI inference client library for .NET with [NuGet](https://aka.ms/azsdk/azure-ai-inference/csharp/package):
 
@@ -125,8 +125,7 @@ provider.addSpanProcessor(new SimpleSpanProcessor(new ConsoleSpanExporter()));
 provider.register();
 ```
 
-
-# [C#](#tab/python)
+# [C#](#tab/csharp)
 
 Distributed tracing and metrics with OpenTelemetry are supported in Azure AI Inference in experimental mode and could be enabled through either of these steps:
 
@@ -192,7 +191,7 @@ client.path("/chat/completions").post({
 
 ```
 
-# [C#](#tab/python)
+# [C#](#tab/csharp)
 
 To configure OpenTelemetry and enable Azure AI Inference tracing follow these steps:
 
@@ -222,7 +221,7 @@ To configure OpenTelemetry and enable Azure AI Inference tracing follow these st
 
 ---
 
-### Tracing Your Own Functions:
+### Tracing your own functions
 
 # [Python](#tab/python)
 
@@ -298,7 +297,7 @@ const getWeatherFunc = (location: string, unit: string): string => {
 }
 ```
 
-# [C#](#tab/python)
+# [C#](#tab/csharp)
 
 To trace your own functions, use the OpenTelemetry API to start and end spans around the code you want to trace. Here's an example:
 
@@ -343,8 +342,6 @@ To learn more, see [OpenTelemetry .NET](https://opentelemetry.io/docs/languages/
 
 To attach user feedback to traces and visualize them in AI Studio using OpenTelemetry's semantic conventions, you can instrument your application by enabling tracing and logging user feedback. By correlating feedback traces with their respective chat request traces using the response ID, you can view and manage these traces in AI Studio. OpenTelemetry's specification allows for standardized and enriched trace data, which can be analyzed in AI Studio for performance optimization and user experience insights. This approach helps you use the full power of OpenTelemetry for enhanced observability in your applications.
 
-
-
 ## Related content
 
 # [Python](#tab/python)
@@ -357,12 +354,11 @@ To attach user feedback to traces and visualize them in AI Studio using OpenTele
 - [JavaScript samples](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/ai/ai-inference-rest/samples/v1-beta/typescript/src/telemetry.ts) containing fully runnable JavaScript code for tracing using synchronous and asynchronous clients.
 - [JavaScript samples to use Azure AI Project with tracing](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/ai/ai-inference-rest/samples/v1-beta/typescript/src/telemetryWithToolCall.ts)
 
-# [C#](#tab/python)
+# [C#](#tab/csharp)
 
 [C# Samples](https://github.com/Azure/azure-sdk-for-net/blob/Azure.AI.Inference_1.0.0-beta.2/sdk/ai/Azure.AI.Inference/samples/Sample8_ChatCompletionsWithOpenTelemetry.md) containing fully runnable C# code for doing inference using synchronous and asynchronous methods.
 
 ---
 
 - [Get started building a chat app using the prompt flow SDK](../../quickstarts/get-started-code.md)
 - [Work with projects in VS Code](vscode.md)
-
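The "Tracing your own functions" section that this commit retitles boils down to one pattern: start a span before the code you want to trace, record attributes, and end the span when the code finishes. As a language-neutral illustration of that pattern, here is a stdlib-only Python sketch; it is not the real OpenTelemetry API (the linked samples use actual `opentelemetry` tracers), and `start_span`, `RECORDED_SPANS`, and `get_weather` are illustrative names, not part of the Azure AI SDK.

```python
import time
from contextlib import contextmanager

# Hypothetical stand-in for an OpenTelemetry tracer: records one entry
# per span with its name, duration, and attributes.
RECORDED_SPANS = []

@contextmanager
def start_span(name, **attributes):
    """Start a span around a block of code and 'end' it on exit."""
    start = time.perf_counter()
    try:
        yield attributes
    finally:
        RECORDED_SPANS.append({
            "name": name,
            "duration_s": time.perf_counter() - start,
            "attributes": attributes,
        })

def get_weather(location, unit):
    # Wrap the function body in a span, mirroring the diff's
    # getWeatherFunc example.
    with start_span("get_weather", location=location, unit=unit):
        return f"72 {unit} and sunny in {location}"

print(get_weather("Seattle", "F"))  # 72 F and sunny in Seattle
print(RECORDED_SPANS[0]["name"])    # get_weather
```

With a real tracer, the context manager would be `tracer.start_as_current_span(...)` and the recorded spans would go to a configured exporter instead of an in-memory list.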
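The user-feedback paragraph in the diff's context describes correlating feedback traces with chat request traces via the response ID. A minimal sketch of that join, using plain Python dictionaries rather than the actual OpenTelemetry semantic conventions or AI Studio wire format; the record shapes, field names, and `attach_feedback` helper are illustrative assumptions:

```python
# Illustrative span records: chat request traces and a later feedback
# event, linked only by the shared response ID.
chat_spans = [
    {"name": "chat_completion", "response_id": "resp-123", "model": "gpt-4o"},
    {"name": "chat_completion", "response_id": "resp-456", "model": "gpt-4o"},
]
feedback_events = [
    {"response_id": "resp-123", "rating": "thumbs_up"},
]

def attach_feedback(spans, events):
    """Join feedback events onto chat spans via the response ID."""
    by_id = {e["response_id"]: e for e in events}
    return [
        {**span, "feedback": by_id.get(span["response_id"])}
        for span in spans
    ]

enriched = attach_feedback(chat_spans, feedback_events)
print(enriched[0]["feedback"]["rating"])  # thumbs_up
print(enriched[1]["feedback"])            # None
```

In practice the correlation key would be carried as a span attribute following OpenTelemetry's semantic conventions, and the join would happen in AI Studio's trace view rather than in your own code.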
