Commit 2b41fa6

committed
edits
1 parent d8ede2a commit 2b41fa6

File tree

1 file changed: +9 −150 lines changed

articles/ai-studio/how-to/develop/trace-local-sdk.md

Lines changed: 9 additions & 150 deletions
@@ -17,7 +17,7 @@ author: lgayhardt
 
 [!INCLUDE [feature-preview](../../includes/feature-preview.md)]
 
-In this article you will learn how to trace your application with Azure AI Inference SDK with your choice between using Python, JavaScript, or C#. The Azure AI Inference client library provides experimental support for tracing with OpenTelemetry.
+In this article, you learn how to trace your application with the Azure AI Inference SDK using your choice of Python, JavaScript, or C#. The Azure AI Inference client library provides support for tracing with OpenTelemetry.
 
 ## Enable trace in your application
 

@@ -26,21 +26,8 @@ In this article you will learn how to trace your application with Azure AI Infer
 - An [Azure Subscription](https://azure.microsoft.com/).
 - An Azure AI project, see [Create a project in Azure AI Studio](../create-projects.md).
 - An AI model supporting the [Azure AI model inference API](https://aka.ms/azureai/modelinference) deployed through AI Studio.
-
-# [Python](#tab/python)
-
-- Python 3.8 or later installed, including pip.
-
-# [JavaScript](#tab/javascript)
-
-- Supported Environments: LTS versions of Node.js
-
-# [C#](#tab/csharp)
-
-- To construct the client library, you need to pass in the endpoint URL. The endpoint URL has the form `https://your-host-name.your-azure-region.inference.ai.azure.com`, where your-host-name is your unique model deployment host name and your-azure-region is the Azure region where the model is deployed (for example, eastus2).
-- Depending on your model deployment and authentication preference, you either need a key to authenticate against the service, or Microsoft Entra ID credentials. The key is a 32-character string.
-
----
+- If using Python, you need Python 3.8 or later installed, including pip.
+- If using JavaScript, the supported environments are LTS versions of Node.js.
 
 ### Installation
 
@@ -49,32 +36,28 @@ In this article you will learn how to trace your application with Azure AI Infer
 Install the package `azure-ai-inference` using your package manager, like pip:
 
 ```bash
-pip install azure-ai-inference
+pip install azure-ai-inference[opentelemetry]
 ```
 
 Install the Azure Core OpenTelemetry Tracing plugin, OpenTelemetry, and the OTLP exporter for sending telemetry to your observability backend. To install the necessary packages for Python, use the following pip commands:
 
 ```bash
-pip install azure-core-tracing-opentelemetry
-
 pip install opentelemetry
 
-pip install azure-core-tracing-opentelemetry
-
 pip install opentelemetry-exporter-otlp
 ```
 
 # [JavaScript](#tab/javascript)
 
-Install the package `@azure-rest/ai-inference` and Azure ModelClient REST client library for JavaScript using npm:
+Install the package `@azure-rest/ai-inference` for JavaScript using npm:
 
 ```bash
 npm install @azure-rest/ai-inference
 ```
 
 # [C#](#tab/csharp)
 
-Install the Azure AI inference client library for .NET with [NuGet](https://aka.ms/azsdk/azure-ai-inference/csharp/package):
+Install the Azure AI Inference client library for .NET with [NuGet](https://aka.ms/azsdk/azure-ai-inference/csharp/package):
 
 ```dotnetcli
 dotnet add package Azure.AI.Inference --prerelease
@@ -91,7 +74,7 @@ To learn more, see the [Inference SDK reference](../../reference/reference-model
 You need to add following configuration settings as per your use case:
 
 - To capture prompt and completion contents, set the `AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED` environment variable to true (case insensitive). By default, prompts, completions, function names, parameters, or outputs aren't recorded.
-- To enable Azure SDK tracing, set the AZURE_SDK_TRACING_IMPLEMENTATION environment variable to opentelemetry. Alternatively, you can configure it in the code with the following snippet:
+- To enable Azure SDK tracing, set the `AZURE_SDK_TRACING_IMPLEMENTATION` environment variable to opentelemetry. Alternatively, you can configure it in the code with the following snippet:
 
 ```python
 from azure.core.settings import settings
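The two environment variables described in this hunk can also be set from within the process before the Azure SDK is imported; a minimal stdlib-only sketch (the variable names come from the article, the in-process assignment is just an illustrative alternative to exporting them in your shell):

```python
import os

# Select OpenTelemetry as the Azure SDK tracing implementation
# (equivalent to exporting AZURE_SDK_TRACING_IMPLEMENTATION in the shell):
os.environ["AZURE_SDK_TRACING_IMPLEMENTATION"] = "opentelemetry"

# Opt in to recording prompt and completion contents (off by default):
os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true"
```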
@@ -101,12 +84,6 @@ You need to add following configuration settings as per your use case:
 
 To learn more, see [Azure Core Tracing OpenTelemetry client library for Python](/python/api/overview/azure/core-tracing-opentelemetry-readme).
 
-If you want to install Azure AI Inferencing package with support for OpenTelemetry based tracing, use the following command:
-
-```bash
-pip install azure-ai-inference[opentelemetry]
-```
-
 # [JavaScript](#tab/javascript)
 
 Instrumentation is only supported for Chat Completion without streaming. To enable instrumentation, you need to register exporter(s). Following is an example of how to add a console exporter.
@@ -221,120 +198,7 @@ To configure OpenTelemetry and enable Azure AI Inference tracing follow these st
 
 ### Tracing your own functions
 
-# [Python](#tab/python)
-
-The `@tracer.start_as_current_span` decorator can be used to trace your own functions. This traces the function parameters and their values. You can also add further attributes to the span in the function implementation as demonstrated in the following example.
-
-> [!NOTE]
-> You will have to set up the tracer in your code before using the decorator. To learn more, see the [OpenTelemetry Python documentation](https://opentelemetry.io/docs/languages/python/).
-
-```python
-from opentelemetry import trace
-from opentelemetry.trace import get_tracer
-
-tracer = get_tracer(__name__)
-
-@tracer.start_as_current_span("get_temperature")  # type: ignore
-def get_temperature(city: str) -> str:
-
-    # Adding attributes to the current span
-    span = trace.get_current_span()
-    span.set_attribute("requested_city", city)
-
-    if city == "Seattle":
-        return "75"
-    elif city == "New York City":
-        return "80"
-    else:
-        return "Unavailable"
-```
-
-# [JavaScript](#tab/javascript)
-
-OpenTelemetry provides `startActiveSpan` to instrument your own code. Here's an example of how to use it:
-
-```javascript
-import { trace } from "@opentelemetry/api";
-
-const tracer = trace.getTracer("sample", "0.1.0");
-
-const getWeatherFunc = (location: string, unit: string): string => {
-  return tracer.startActiveSpan("getWeatherFunc", span => {
-    if (unit !== "celsius") {
-      unit = "fahrenheit";
-    }
-    const result = `The temperature in ${location} is 72 degrees ${unit}`;
-    span.setAttribute("result", result);
-    span.end();
-    return result;
-  });
-}
-```
-
-# [C#](#tab/csharp)
-
-To trace your own functions, use the OpenTelemetry API to start and end spans around the code you want to trace. Here's an example:
-
-```csharp
-using OpenTelemetry.Trace;
-
-var tracer = Sdk.CreateTracerProviderBuilder()
-    .AddSource("sample")
-    .Build()
-    .GetTracer("sample");
-
-using (var span = tracer.StartActiveSpan("getWeatherFunc"))
-{
-    var location = "Seattle";
-    var unit = "celsius";
-    if (unit != "celsius")
-    {
-        unit = "fahrenheit";
-    }
-    var result = $"The temperature in {location} is 72 degrees {unit}";
-    span.SetAttribute("result", result);
-    Console.WriteLine(result);
-}
-```
-
-To learn more, see [OpenTelemetry .NET](https://opentelemetry.io/docs/languages/net/).
-
----
+To trace your own custom functions, use OpenTelemetry: instrument your code with the OpenTelemetry SDK by setting up a tracer provider and creating spans around the code you want to trace. Each span represents a unit of work, and spans can be nested to form a trace tree. You can add attributes to spans to enrich the trace data with additional context. Once instrumented, configure an exporter to send the trace data to a backend for analysis and visualization. For detailed instructions and advanced usage, see the [OpenTelemetry documentation](https://opentelemetry.io/docs/).
 
 ## Attach User feedback to traces
 
@@ -344,7 +208,7 @@ To attach user feedback to traces and visualize them in AI Studio using OpenTele
 
 # [Python](#tab/python)
 
-- [Python samples]() containing fully runnable Python code for tracing using synchronous and asynchronous clients.
+- [Python samples](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-inference/samples/sample_chat_completions_with_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
 - [Python samples to use Azure AI Project with tracing](https://github.com/Azure/azure-sdk-for-python/tree/feature/azure-ai-projects/sdk/ai/azure-ai-projects/samples/inference)
 
 # [JavaScript](#tab/javascript)
@@ -355,8 +219,3 @@ To attach user feedback to traces and visualize them in AI Studio using OpenTele
 # [C#](#tab/csharp)
 
 [C# Samples](https://github.com/Azure/azure-sdk-for-net/blob/Azure.AI.Inference_1.0.0-beta.2/sdk/ai/Azure.AI.Inference/samples/Sample8_ChatCompletionsWithOpenTelemetry.md) containing fully runnable C# code for doing inference using synchronous and asynchronous methods.
-
----
-
-- [Get started building a chat app using the prompt flow SDK](../../quickstarts/get-started-code.md)
-- [Work with projects in VS Code](vscode.md)
