In this article, you learn how to trace your application with the Azure AI Inference SDK, with your choice of Python, JavaScript, or C#. The Azure AI Inference client library provides support for tracing with OpenTelemetry.
## Enable tracing in your application
- An [Azure Subscription](https://azure.microsoft.com/).
- An Azure AI project, see [Create a project in Azure AI Studio](../create-projects.md).
- An AI model supporting the [Azure AI model inference API](https://aka.ms/azureai/modelinference) deployed through AI Studio.
- If using Python, you need Python 3.8 or later installed, including pip.
- If using JavaScript, the supported environments are LTS versions of Node.js.
### Installation
# [Python](#tab/python)
Install the package `azure-ai-inference` using your package manager, like pip:
```bash
pip install azure-ai-inference[opentelemetry]
```
Install the Azure Core OpenTelemetry Tracing plugin, OpenTelemetry, and the OTLP exporter for sending telemetry to your observability backend. To install the necessary packages for Python, use the following pip commands:
```bash
pip install azure-core-tracing-opentelemetry
pip install opentelemetry
pip install opentelemetry-exporter-otlp
```
# [JavaScript](#tab/javascript)
Install the package `@azure-rest/ai-inference` for JavaScript using npm:
```bash
npm install @azure-rest/ai-inference
```
# [C#](#tab/csharp)
Install the Azure AI Inference client library for .NET with [NuGet](https://aka.ms/azsdk/azure-ai-inference/csharp/package):
You need to add the following configuration settings, depending on your use case:
- To capture prompt and completion contents, set the `AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED` environment variable to true (case insensitive). By default, prompts, completions, function names, parameters, or outputs aren't recorded.
- To enable Azure SDK tracing, set the `AZURE_SDK_TRACING_IMPLEMENTATION` environment variable to `opentelemetry`. Alternatively, you can configure it in code with the following snippet:
```python
from azure.core.settings import settings
settings.tracing_implementation = "opentelemetry"
```
To learn more, see [Azure Core Tracing OpenTelemetry client library for Python](/python/api/overview/azure/core-tracing-opentelemetry-readme).
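Putting these settings together, a minimal sketch might look like the following. This assumes the `azure-ai-inference[opentelemetry]` extra is installed; the instrumentor import path is the one documented for that package, but verify it against the version you install:

```python
# Sketch: enable content recording and route Azure SDK tracing through
# OpenTelemetry. Assumes `pip install azure-ai-inference[opentelemetry]`.
import os

# Record prompts and completions in traces (off by default; value is
# case insensitive).
os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true"

# Select OpenTelemetry as the Azure SDK tracing implementation in code
# (equivalent to setting AZURE_SDK_TRACING_IMPLEMENTATION=opentelemetry).
from azure.core.settings import settings
settings.tracing_implementation = "opentelemetry"

# Instrument the Azure AI Inference client library so its calls emit spans.
from azure.ai.inference.tracing import AIInferenceInstrumentor
AIInferenceInstrumentor().instrument()
```

Calling `AIInferenceInstrumentor().uninstrument()` later turns the instrumentation back off.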
# [JavaScript](#tab/javascript)
Instrumentation is only supported for Chat Completions without streaming. To enable instrumentation, you need to register one or more exporters. The following is an example of how to add a console exporter.
### Tracing your own functions
To trace your own custom functions, you can use OpenTelemetry. Instrument your code with the OpenTelemetry SDK by setting up a tracer provider and creating spans around the code you want to trace. Each span represents a unit of work, and spans can be nested to form a trace tree. You can add attributes to spans to enrich the trace data with additional context. Once instrumented, configure an exporter to send the trace data to a backend for analysis and visualization. For detailed instructions and advanced usage, see the [OpenTelemetry documentation](https://opentelemetry.io/docs/).
## Attach User feedback to traces
# [Python](#tab/python)
- [Python samples](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-inference/samples/sample_chat_completions_with_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
- [Python samples to use Azure AI Project with tracing](https://github.com/Azure/azure-sdk-for-python/tree/feature/azure-ai-projects/sdk/ai/azure-ai-projects/samples/inference)
# [JavaScript](#tab/javascript)
# [C#](#tab/csharp)
[C# Samples](https://github.com/Azure/azure-sdk-for-net/blob/Azure.AI.Inference_1.0.0-beta.2/sdk/ai/Azure.AI.Inference/samples/Sample8_ChatCompletionsWithOpenTelemetry.md) containing fully runnable C# code for doing inference using synchronous and asynchronous methods.