
Commit 3cc2e8f

Merge pull request #5028 from lgayhardt/tracefixes0525
Build: Tracing updates
2 parents 87bb855 + 9121b65

1 file changed: articles/ai-foundry/how-to/develop/trace-application.md (67 additions, 52 deletions)
To view traces in Azure AI Foundry, you need to connect an Application Insights resource to your project.

To trace the content of chat messages, set the `AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED` environment variable to `true` (case insensitive). Keep in mind this might contain personal data. To learn more, see [Azure Core Tracing OpenTelemetry client library for Python](/python/api/overview/azure/core-tracing-opentelemetry-readme).

```python
import os

os.environ["AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED"] = "true"  # False by default
```
Let's begin instrumenting our agent with OpenTelemetry tracing, starting by authenticating and connecting to your Azure AI project using the `AIProjectClient`.

```python
from azure.ai.projects import AIProjectClient
from azure.identity import DefaultAzureCredential

project_client = AIProjectClient(
    credential=DefaultAzureCredential(),
    endpoint=os.environ["PROJECT_ENDPOINT"],
)
```
Next, retrieve the connection string from the Application Insights resource connected to your project, and set up the exporter to send telemetry into Azure Monitor.

```python
from azure.monitor.opentelemetry import configure_azure_monitor

connection_string = project_client.telemetry.get_connection_string()

if not connection_string:
    print("Application Insights is not enabled. Enable by going to Tracing in your Azure AI Foundry project.")
    exit()

configure_azure_monitor(connection_string=connection_string)  # Enable telemetry collection
```

Now, trace your code where you create and execute your agent and user message in your Azure AI project, so you can see detailed steps for troubleshooting or monitoring.

```python
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

with tracer.start_as_current_span("example-tracing"):
    agent = project_client.agents.create_agent(
        model=os.environ["MODEL_DEPLOYMENT_NAME"],
        name="my-assistant",
        instructions="You are a helpful assistant"
    )
    thread = project_client.agents.create_thread()
    message = project_client.agents.create_message(
        thread_id=thread.id, role="user", content="Tell me a joke"
    )
    run = project_client.agents.create_run(thread_id=thread.id, agent_id=agent.id)
```

After running your agent, you can begin to [view traces in Azure AI Foundry portal](#view-traces-in-azure-ai-foundry-portal).

### Log traces locally

To connect to [Aspire Dashboard](https://aspiredashboard.com/#start) or another OpenTelemetry-compatible backend, install the OpenTelemetry Protocol (OTLP) exporter. This enables you to print traces to the console or use a local viewer such as Aspire Dashboard.

```bash
pip install azure-core-tracing-opentelemetry opentelemetry-exporter-otlp opentelemetry-sdk
```

Next, configure tracing for your application.

```python
from azure.core.settings import settings

settings.tracing_implementation = "opentelemetry"

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor, ConsoleSpanExporter

# Set up tracing to the console
span_exporter = ConsoleSpanExporter()
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(span_exporter))
trace.set_tracer_provider(tracer_provider)
```

Use `enable_telemetry` to begin collecting telemetry.

```python
import sys

from azure.ai.projects import enable_telemetry

enable_telemetry(destination=sys.stdout)

# To log to a local OTLP endpoint instead, change the destination to
# enable_telemetry(destination="http://localhost:4317")
```

Then trace your agent run, just as you did when logging to Azure Monitor:

```python
# Start tracing
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

with tracer.start_as_current_span("example-tracing"):
    agent = project_client.agents.create_agent(
        model=os.environ["MODEL_DEPLOYMENT_NAME"],
        name="my-assistant",
        instructions="You are a helpful assistant"
    )
    thread = project_client.agents.create_thread()
    message = project_client.agents.create_message(
        thread_id=thread.id, role="user", content="Tell me a joke"
    )
    run = project_client.agents.create_run(thread_id=thread.id, agent_id=agent.id)
```

## Trace custom functions

To trace your custom functions, use the OpenTelemetry SDK to instrument your code, as in the sketch below.
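
As a minimal illustrative sketch (the span name and attribute here are hypothetical, not the article's exact example), instrumenting a custom function can look like this:

```python
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

def custom_function():
    # Wrap the function body in a span so it appears in your trace backend
    with tracer.start_as_current_span("custom_function") as span:
        span.set_attribute("example.attribute", "example-value")
        print("Running custom function")

custom_function()
```
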
For detailed instructions and advanced usage, refer to the [OpenTelemetry documentation](https://opentelemetry.io/docs/).

## Attach user feedback to traces

To attach user feedback to traces and visualize it in the Azure AI Foundry portal, you can instrument your application to enable tracing and log user feedback using OpenTelemetry's semantic conventions.

By correlating feedback traces with their respective chat request traces using the response ID or thread ID, you can view and manage these traces in the Azure AI Foundry portal. OpenTelemetry's specification allows for standardized and enriched trace data, which can be analyzed in the Azure AI Foundry portal for performance optimization and user experience insights. This approach helps you use the full power of OpenTelemetry for enhanced observability in your applications.

To log user feedback, follow this format:

The user feedback evaluation event can be captured if and only if the user provided a reaction to the GenAI model response. It SHOULD, when possible, be parented to the GenAI span describing that response. The event name MUST be `gen_ai.evaluation.user_feedback`.

The user feedback event body has the following structure:

| Body Field | Type | Description | Examples | Requirement Level |
|---|---|---|---|---|
| `comment` | string | Additional details about the user feedback | `"I did not like it"` | `Opt-in` |

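As a minimal sketch (the span name, response ID, and score value here are illustrative; the attribute names follow OpenTelemetry's GenAI semantic conventions), logging such an event can look like this:

```python
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

# Illustrative sketch: attach a user feedback event, correlated to the rated
# chat response through its response ID.
with tracer.start_as_current_span("user_feedback") as span:
    span.add_event(
        "gen_ai.evaluation.user_feedback",
        attributes={
            "gen_ai.response.id": "chatcmpl-123",  # ID of the rated response
            "gen_ai.evaluation.score": 0.42,       # user reaction in [-1.0, 1.0]
            "comment": "I did not like it",        # opt-in body field
        },
    )
```
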
## Using service name in trace data

To identify your service via a unique ID in Application Insights, you can use the service name OpenTelemetry property in your trace data. This is useful if you're logging data from multiple applications to the same Application Insights resource, and you want to differentiate between them.

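As a minimal sketch (the provider setup and service name value are illustrative assumptions, not the article's exact code), you can set the service name on the OpenTelemetry resource used by your tracer provider:

```python
from opentelemetry import trace
from opentelemetry.sdk.resources import SERVICE_NAME, Resource
from opentelemetry.sdk.trace import TracerProvider

# Tag all spans from this app with a service name; Application Insights
# surfaces it as the cloud_RoleName property.
resource = Resource.create({SERVICE_NAME: "my-agent-service"})
trace.set_tracer_provider(TracerProvider(resource=resource))
```
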
To query trace data for a given service name, query for the `cloud_RoleName` property:

```
| where cloud_RoleName == "service_name"
```

## Enable tracing for Langchain

You can enable tracing for Langchain that follows OpenTelemetry standards, as per [opentelemetry-instrumentation-langchain](https://pypi.org/project/opentelemetry-instrumentation-langchain/). To enable tracing for Langchain, install the package `opentelemetry-instrumentation-langchain` using your package manager, like pip:

```bash
pip install opentelemetry-instrumentation-langchain
```

Once the necessary packages are installed, you can easily begin to [instrument tracing in your code](#instrument-tracing-in-your-code), as sketched below.

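As a minimal sketch (assuming the package's `LangchainInstrumentor` entry point, and that your tracer provider and exporters are already configured), instrumentation can look like this:

```python
from opentelemetry.instrumentation.langchain import LangchainInstrumentor

# Instrument Langchain once at startup so chain and LLM calls emit spans
# through the tracer provider and exporters configured above.
LangchainInstrumentor().instrument()
```
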
## View traces in Azure AI Foundry portal

In your project, go to `Tracing` to filter your traces as you see fit.

By selecting a trace, you can step through each span and identify issues while observing how your application is responding, which helps you debug and pinpoint problems in your application.

## View traces in Azure Monitor

If you logged traces using the previous code snippet, then you're all set to view your traces in Azure Monitor Application Insights. You can open Application Insights from **Manage data source** and use the **End-to-end transaction details view** to investigate further.
