@@ -28,7 +28,7 @@ In this article you'll learn how to trace your application with Azure AI Foundry
- If using Python, you need Python 3.8 or later installed, including pip.
- If using JavaScript, the supported environments are LTS versions of Node.js.

-## Tracing using Azure AI Foundry Project
+## Tracing using Azure AI Foundry Project Library
# [Python](#tab/python)
The best way to get started with the Azure AI Foundry SDK is by using a project. AI projects bring together the data, assets, and services you need to build AI applications. The AI project client lets you access these project components from your code by using a single connection string. First, follow the steps to [create an AI Project](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/create-projects?tabs=ai-studio) if you don't already have one.
To enable tracing, first ensure your project has an attached Application Insights resource. Go to the **Tracing** page of your project in the Azure AI Foundry portal and follow the instructions to create or attach Application Insights. Once one is attached, you can get the Application Insights connection string and observe the full execution path through Azure Monitor.
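For orientation, the following is a minimal sketch of that flow, not the article's own sample. It assumes the preview `azure-ai-projects` package (which exposes `AIProjectClient.from_connection_string` and a `telemetry.get_connection_string()` helper), plus the `azure-identity` and `azure-monitor-opentelemetry` packages; the `PROJECT_CONNECTION_STRING` variable name is illustrative.

```python
import os

from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential
from azure.monitor.opentelemetry import configure_azure_monitor

# Connect to the Azure AI Foundry project with its single connection string.
project_client = AIProjectClient.from_connection_string(
    conn_str=os.environ["PROJECT_CONNECTION_STRING"],  # illustrative variable name
    credential=DefaultAzureCredential(),
)

# Read the connection string of the Application Insights resource attached on the Tracing page.
app_insights_connection_string = project_client.telemetry.get_connection_string()

# Route OpenTelemetry spans to Azure Monitor so the execution path shows up in Application Insights.
configure_azure_monitor(connection_string=app_insights_connection_string)
```

From this point on, instrumented client calls made in the same process are exported to the attached Application Insights resource.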
Refer to the following samples to get started with tracing using the Azure AI Project SDK:
-- [Python Sample with console tracing](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-inference/samples/sample_chat_completions_with_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
-- [Python Sample with Azure Monitor](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-inference/samples/sample_chat_completions_with_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
+- [Python Sample with console tracing for Azure AI Inference Client](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_ai_inference_client_and_console_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
+- [Python Sample with Azure Monitor for Azure AI Inference Client](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-inference/samples/sample_chat_completions_with_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
+- [Python Sample with console tracing for Azure Open AI](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client_and_console_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
+- [Python Sample with Azure Monitor for Azure Open AI](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client_and_azure_monitor_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.

# [JavaScript](#tab/javascript)
@@ -57,21 +57,8 @@ Tracing is not yet integrated into the Azure AI Projects SDK for C#. For instruc

---

-## Enable Tracing for Azure Open AI
-The Azure OpenAI Service provides access to OpenAI's models, including the GPT-4o, GPT-4o mini, GPT-4, GPT-4 Turbo with Vision, DALL-E 3, Whisper, and Embeddings model series, with the data residency, scalability, safety, security, and enterprise capabilities of Azure. To learn more about getting started, see [Azure OpenAI Service](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/develop/sdk-overview?tabs=sync&pivots=programming-language-python#azure-openai-service).
-
-### [Python](#tab/python)
-Tracing in Azure OpenAI follows OpenTelemetry standards as per [opentelemetry-instrumentation-openai-v2 2.0b0](https://pypi.org/project/opentelemetry-instrumentation-openai-v2/). To enable tracing for Azure OpenAI, follow these steps:
-
-Install the package `opentelemetry-instrumentation-openai-v2 2.0b0` using your package manager, like pip:
-Refer to the following samples to get started with tracing using the Azure AI Project SDK:
-- [Python Sample with console tracing](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client_and_console_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
-- [Python Sample with Azure Monitor](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-projects/samples/inference/sample_chat_completions_with_azure_openai_client_and_azure_monitor_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
+\
+Refer to the following samples to get started with tracing using the Azure AI Project Library for Azure OpenAI:
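For orientation, here is a minimal sketch of the kind of setup those Azure OpenAI samples demonstrate. It assumes the `opentelemetry-instrumentation-openai-v2`, `opentelemetry-sdk`, and `openai` packages are installed; the environment variable names and API version below are illustrative rather than taken from the samples.

```python
import os

from openai import AzureOpenAI
from opentelemetry import trace
from opentelemetry.instrumentation.openai_v2 import OpenAIInstrumentor
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

# Print spans to the console; the Azure Monitor samples configure an Azure Monitor exporter instead.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

# Instrument OpenAI/Azure OpenAI clients created from this point on.
OpenAIInstrumentor().instrument()

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",  # illustrative API version
)

# Each chat completion call now emits a span with request and response metadata.
response = client.chat.completions.create(
    model=os.environ["AZURE_OPENAI_DEPLOYMENT"],  # your chat model deployment name
    messages=[{"role": "user", "content": "Tell me a one-line joke."}],
)
print(response.choices[0].message.content)
```

Swapping the console exporter for `configure_azure_monitor(...)` from the `azure-monitor-opentelemetry` package sends the same spans to the Application Insights resource attached to your project.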
### [JavaScript](#tab/javascript)
@@ -82,7 +69,7 @@ Currently this is supported in Python only.

---

-## Enable Tracing using Azure AI Inference SDK
+## Enable Tracing using Azure AI Inference Library
### Installation
@@ -135,14 +122,6 @@ To learn more , see the [Inference SDK reference](../../reference/reference-mode
You need to add the following configuration settings, depending on your use case:

- To capture prompt and completion contents, set the `AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED` environment variable to true (case insensitive). By default, prompts, completions, function names, parameters, or outputs aren't recorded. A short sketch of applying this setting in code follows this list.
-- To enable Azure SDK tracing, set the `AZURE_SDK_TRACING_IMPLEMENTATION` environment variable to opentelemetry. Alternatively, you can configure it in the code with the following snippet:
-
-```python
-from azure.core.settings import settings
-
-settings.tracing_implementation = "opentelemetry"
-```
-
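As a minimal sketch of applying the content-recording setting in code before any traced calls; the `AIInferenceInstrumentor` import is an assumption about the `azure-ai-inference` package's tracing module and is not shown in this diff:

```python
import os

# Opt in to recording prompts, completions, function names, parameters, and outputs (off by default).
os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true"

from azure.ai.inference.tracing import AIInferenceInstrumentor

# Turn on span creation for Azure AI Inference client calls.
AIInferenceInstrumentor().instrument()
```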
To learn more, see [Azure Core Tracing OpenTelemetry client library for Python](/python/api/overview/azure/core-tracing-opentelemetry-readme).
# [JavaScript](#tab/javascript)
@@ -324,5 +303,7 @@ The user feedback event body has the following structure:
## Related content
- [Python samples](https://github.com/Azure/azure-sdk-for-python/blob/main/sdk/ai/azure-ai-inference/samples/sample_chat_completions_with_tracing.py) containing fully runnable Python code for tracing using synchronous and asynchronous clients.
- [JavaScript samples](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/ai/ai-inference-rest/samples/v1-beta/typescript/src) containing fully runnable JavaScript code for tracing using synchronous and asynchronous clients.
- [C# samples](https://github.com/Azure/azure-sdk-for-net/blob/Azure.AI.Inference_1.0.0-beta.2/sdk/ai/Azure.AI.Inference/samples/Sample8_ChatCompletionsWithOpenTelemetry.md) containing fully runnable C# code for inference using synchronous and asynchronous methods.