## sdk/ai/azure-ai-inference/CHANGELOG.md (+2 lines changed: 2 additions & 0 deletions)
@@ -4,6 +4,8 @@

### Features Added

* Support for tracing. Please find more information in the package [README.md](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-inference/README.md).
## sdk/ai/azure-ai-inference/README.md (+89 lines changed: 89 additions & 0 deletions)

@@ -57,6 +57,14 @@

To update an existing installation of the package, use:

```bash
pip install --upgrade azure-ai-inference
```

If you want to install the Azure AI Inference package with support for OpenTelemetry-based tracing, use the following command:

```bash
pip install azure-ai-inference[trace]
```

## Key concepts
### Create and authenticate a client directly, using API key or GitHub token
@@ -530,6 +538,87 @@

For more information, see [Configure logging in the Azure libraries for Python](
To report issues with the client library, or to request additional features, please open a GitHub issue [here](https://github.com/Azure/azure-sdk-for-python/issues).
## Tracing

The Azure AI Inferencing API Tracing library provides tracing for the Azure AI Inference client library for Python. Refer to the Installation chapter above for installation instructions.
### Setup
The environment variable `AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED` controls whether the actual message contents will be recorded in the traces or not. By default, message contents are not recorded as part of the trace. When message content recording is disabled, any function-call-tool related function names, function parameter names, and function parameter values are also not recorded in the trace. Set the value of the environment variable to `"true"` (case insensitive) for the message contents to be recorded as part of the trace. Any other value will cause the message contents not to be recorded.
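For example, in a bash shell, content recording can be enabled for the current session like this:

```shell
# Enable recording of message contents in traces (disabled by default)
export AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED=true
```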
You also need to configure the tracing implementation in your code, either by setting the environment variable `AZURE_SDK_TRACING_IMPLEMENTATION` to `opentelemetry`, or by configuring it in code.
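A minimal sketch of the in-code configuration, using the `settings` object from `azure-core`:

```python
from azure.core.settings import settings

# Route Azure SDK tracing through the OpenTelemetry plugin
# (requires the azure-core-tracing-opentelemetry package)
settings.tracing_implementation = "opentelemetry"
```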
Please refer to the [azure-core-tracing documentation](https://learn.microsoft.com/python/api/overview/azure/core-tracing-opentelemetry-readme) for more information.
### Exporting Traces with OpenTelemetry
Azure AI Inference is instrumented with OpenTelemetry. To enable tracing, you need to configure OpenTelemetry to export traces to your observability backend.
Refer to [Azure SDK tracing in Python](https://learn.microsoft.com/python/api/overview/azure/core-tracing-opentelemetry-readme?view=azure-python-preview) for more details.
Refer to the [Azure Monitor OpenTelemetry documentation](https://learn.microsoft.com/azure/azure-monitor/app/opentelemetry-enable?tabs=python) for details on how to send Azure AI Inference traces to Azure Monitor and how to create an Azure Monitor resource.
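A minimal sketch of the Azure Monitor route, assuming the `azure-monitor-opentelemetry` package is installed; the connection string below is a hypothetical placeholder that must be replaced with the one from your Application Insights resource:

```python
from azure.monitor.opentelemetry import configure_azure_monitor

# Route OpenTelemetry traces to an Azure Monitor / Application Insights resource
configure_azure_monitor(
    connection_string="InstrumentationKey=00000000-0000-0000-0000-000000000000"
)
```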
### Instrumentation
Use the `AIInferenceInstrumentor` to instrument the Azure AI Inference API for LLM tracing; this will cause the LLM traces to be emitted from the Azure AI Inference API.

```python
from azure.ai.inference.tracing import AIInferenceInstrumentor

# Instrument AI Inference API
AIInferenceInstrumentor().instrument()
```
<!--ENDSNIPPET-->
It is also possible to uninstrument the Azure AI Inference API by using the `uninstrument` call. After this call, the traces will no longer be emitted by the Azure AI Inference API until `instrument` is called again.
The `@tracer.start_as_current_span` decorator can be used to trace your own functions. This will trace the function parameters and their values. You can also add further attributes to the span in the function implementation. Note that you will have to set up the tracer in your code before using the decorator. More information is available [here](https://opentelemetry.io/docs/languages/python/).
* Have a look at the [Samples](https://github.com/Azure/azure-sdk-for-python/tree/main/sdk/ai/azure-ai-inference/samples) folder, containing fully runnable Python code for doing inference using synchronous and asynchronous clients.