
Commit 5417e54

Merge pull request #1356 from lgayhardt/release-ignite-ai-studio
[Azure AI Svcs] AI Studio: Visualize your traces
2 parents 825a85b + ae24c35 commit 5417e54

File tree

6 files changed: +187 additions, -2 deletions
articles/ai-studio/concepts/trace.md

Lines changed: 71 additions & 0 deletions
@@ -0,0 +1,71 @@
---
title: Tracing in Azure AI Inference SDK
titleSuffix: Azure AI Studio
description: This article provides an overview of tracing with the Azure AI Inference SDK.
manager: scottpolly
ms.service: azure-ai-studio
ms.topic: conceptual
ms.date: 11/19/2024
ms.reviewer: truptiparkar
ms.author: lagayhar
author: lgayhardt
---

# Tracing in Azure AI Inference SDK overview

[!INCLUDE [feature-preview](../includes/feature-preview.md)]

Tracing is a powerful tool that gives developers an in-depth understanding of how their generative AI applications execute. It provides a detailed view of the application's execution flow, which is critical when you debug complex applications or optimize performance.

Tracing with the Azure AI Inference SDK offers enhanced visibility and simplified troubleshooting for LLM-based applications, supporting development, iteration, and production monitoring. Tracing follows the OpenTelemetry semantic conventions, capturing and visualizing the internal execution details of any AI application and enhancing the overall development experience.

## Key features

- **Enhanced observability**: Offers clear insight into the Gen AI application lifecycle.
- **User-centric design**: Simplifies telemetry enablement, focusing on developer workflow and productivity.
- **Seamless instrumentation**: Instruments the Azure AI Inference API to enable telemetry.
- **OTEL-based tracing for user-defined functions**: Lets you attach local variables and intermediate results to a trace decorator for detailed tracing of user-defined functions (see the sketch after this list).
- **Secure data handling**: Provides options to exclude sensitive or large data from logging, in line with OpenTelemetry standards.
- **Feedback logging**: Lets you collect and attach user feedback and evaluation data to enrich trace data with qualitative insights.

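A minimal, generic sketch of the user-defined function scenario is shown below. It uses plain OpenTelemetry rather than the Azure AI Inference SDK's own decorator, and the `traced` helper and `app.*` attribute names are illustrative assumptions only; see the how-to article linked under "Related content" for the SDK's supported approach.

```python
# Sketch: tracing a user-defined function with plain OpenTelemetry.
# The `traced` decorator and `app.*` attribute names are illustrative only.
import functools

from opentelemetry import trace

tracer = trace.get_tracer(__name__)

def traced(func):
    """Wrap a function call in a span and record its inputs and result."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        with tracer.start_as_current_span(func.__name__) as span:
            for name, value in kwargs.items():
                span.set_attribute(f"app.input.{name}", str(value))
            result = func(*args, **kwargs)
            span.set_attribute("app.output", str(result))
            return result
    return wrapper

@traced
def build_prompt(question: str = "") -> str:
    return f"Answer concisely: {question}"
```
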
## Concepts

### Traces

Traces record specific events or the state of an application during execution. They can include data about function calls, variable values, system events, and more. Whether your application is a monolith with a single database or a sophisticated mesh of services, traces are essential to understanding the full "path" a request takes in your application. To learn more, see [OpenTelemetry Traces](https://opentelemetry.io/docs/concepts/signals/traces/).

### Semantic conventions

OpenTelemetry defines Semantic Conventions, sometimes called semantic attributes, that specify common names for different kinds of operations and data. The benefit of using semantic conventions is a common naming scheme that can be standardized across a codebase, libraries, and platforms. By adhering to these conventions, Azure AI ensures that trace data is consistent and can be easily interpreted by observability tools. This consistency is crucial for effective monitoring, debugging, and optimization of Gen AI applications. To learn more, see [OpenTelemetry's Semantic Conventions for Generative AI systems](https://opentelemetry.io/docs/specs/semconv/gen-ai/).

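As a rough illustration (the attribute names are drawn from the still-evolving gen-ai conventions, so treat them as examples rather than a guaranteed list), a manually created span that follows these conventions might look like this:

```python
# Sketch: applying gen_ai.* semantic-convention names to a manually created span.
# The exact attributes emitted by the Azure AI Inference SDK are defined by the
# evolving OpenTelemetry gen-ai conventions.
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

# Convention: span name is "{operation} {model}", for example "chat gpt-4o-mini".
with tracer.start_as_current_span("chat gpt-4o-mini") as span:
    span.set_attribute("gen_ai.system", "az.ai.inference")
    span.set_attribute("gen_ai.request.model", "gpt-4o-mini")
    span.set_attribute("gen_ai.request.temperature", 0.7)
```
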
### Spans

Spans are the building blocks of traces. Each span represents a single operation within a trace, capturing the start and end time, and any attributes or metadata associated with the operation. Spans can be nested to represent hierarchical relationships, allowing developers to see the full call stack and understand the sequence of operations. To learn more, see [OpenTelemetry's Spans](https://opentelemetry.io/docs/concepts/signals/traces/#spans).

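To make the parent-child relationship concrete, here's a small, generic OpenTelemetry sketch (the span names and `app.*` attribute are illustrative, not something the SDK emits) in which one span is nested inside another:

```python
# Sketch: nested spans form a parent-child hierarchy within a single trace.
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

with tracer.start_as_current_span("handle_request") as parent:
    parent.set_attribute("app.user_intent", "trip-planning")
    # Started while the parent is current, so this span nests under it.
    with tracer.start_as_current_span("call_model") as child:
        child.set_attribute("gen_ai.request.model", "gpt-4o-mini")
```
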
### Attributes

Attributes are key-value pairs that provide additional information about a trace or span. Attributes can be used to record contextual data such as function parameters, return values, or custom annotations. This metadata enriches the trace data, making it more informative and useful for analysis.

Attributes have the following rules that each language SDK implements:

- Keys must be non-null string values.
- Values must be a non-null string, boolean, floating point value, integer, or an array of these values.

To learn more, see [OpenTelemetry's Attributes](https://opentelemetry.io/docs/concepts/signals/traces/#attributes).

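The following sketch (plain OpenTelemetry, with made-up `app.*` attribute names) shows attributes that follow these rules, one for each allowed value type:

```python
# Sketch: attribute values may be strings, booleans, floats, ints, or arrays of these.
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

with tracer.start_as_current_span("plan_trip") as span:
    span.set_attribute("app.destination", "Mount Rainier")        # string
    span.set_attribute("app.cache_hit", False)                    # boolean
    span.set_attribute("app.relevance_score", 0.87)               # floating point
    span.set_attribute("app.retrieved_documents", 3)              # integer
    span.set_attribute("app.tags", ["hiking", "national-parks"])  # array of strings
```
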
### Trace exporters

Trace exporters are responsible for sending trace data to a backend system for storage and analysis. Azure AI supports exporting traces to various observability platforms, including Azure Monitor and other OpenTelemetry-compatible backends.

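As a minimal sketch (assuming the `opentelemetry-sdk` package; the exporter you pick depends on your backend), configuring an exporter looks roughly like this:

```python
# Sketch: register a tracer provider with an exporter.
# ConsoleSpanExporter writes spans to stdout; swap in an OTLP exporter, or use
# configure_azure_monitor() from the azure-monitor-opentelemetry package, to
# send the same data to a backend such as Azure Monitor.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)
```
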
### Trace visualization

Trace visualization refers to the graphical representation of trace data. Azure AI integrates with visualization tools like Azure AI Studio Tracing, Aspire dashboard, and Prompty Trace viewer to provide developers with an intuitive way to explore and analyze traces, helping them quickly identify issues and understand the behavior of their applications.

## Conclusion

Azure AI's tracing capabilities are designed to give developers the tools they need to gain deep insight into their AI applications. By providing a robust, intuitive, and scalable tracing feature, Azure AI helps reduce debugging time, enhance application reliability, and improve overall performance. With a focus on user experience and system observability, Azure AI's tracing solution aims to change the way developers interact with and understand their Gen AI applications.

## Related content

- [Trace your application with Azure AI Inference SDK](../how-to/develop/trace-local-sdk.md)
- [Visualize your traces](../how-to/develop/visualize-traces.md)

articles/ai-studio/how-to/develop/visualize-traces.md

Lines changed: 108 additions & 0 deletions
@@ -0,0 +1,108 @@
---
title: Visualize your traces
titleSuffix: Azure AI Studio
description: This article provides instructions on how to visualize your traces.
manager: scottpolly
ms.service: azure-ai-studio
ms.topic: how-to
ms.date: 11/19/2024
ms.reviewer: amipatel
ms.author: lagayhar
author: lgayhardt
---

# Visualize your traces

[!INCLUDE [feature-preview](../../includes/feature-preview.md)]

After instrumenting your application to log traces, let's walk through how you can view your traces in both local and cloud solutions to debug your application.

## View your traces for local debugging

To enable traces locally, you have two options:

1. Using **Prompty**, you can trace your application with the **Azure AI Inference SDK**, which offers enhanced visibility and simplified troubleshooting for LLM-based applications. This method follows the OpenTelemetry specification, capturing and visualizing the internal execution details of any AI application, thereby enhancing the overall development experience. To learn more, see [Debugging Prompty](https://prompty.ai/docs/getting-started/debugging-prompty).
2. **Aspire Dashboard**: A free and open-source OpenTelemetry dashboard that gives you deep insights into your apps on your local development machine. To learn more, see [Aspire Dashboard](https://aspiredashboard.com/#start). One possible way to point your application at it is sketched after this list.

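As a sketch only (assuming the `opentelemetry-sdk` and `opentelemetry-exporter-otlp` packages are installed, and that the dashboard's OTLP gRPC endpoint is reachable at `http://localhost:4317`; check the Aspire Dashboard documentation for the ports you actually expose), you can send spans to a locally running dashboard with a standard OTLP exporter:

```python
# Sketch: export spans to a locally running Aspire Dashboard over OTLP.
# The endpoint below is an assumption; use the OTLP port your dashboard exposes.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

provider = TracerProvider()
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4317", insecure=True))
)
trace.set_tracer_provider(provider)

with trace.get_tracer(__name__).start_as_current_span("local-debug-check"):
    pass  # Spans emitted after this setup appear in the dashboard's Traces view.
```
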
## View your traces in Azure AI Foundry portal

Before you can log to Azure AI Foundry portal, attach an Application Insights resource to your project.

1. Navigate to your project in [Azure AI Foundry portal](https://ai.azure.com/).
1. Select the **Tracing** page on the left-hand side.
1. Select **Create New** to attach a new Application Insights resource to your project.
1. Supply a name and select **Create**.

:::image type="content" source="../../media/trace/visualize/tracing-setup-overview.gif" alt-text="Animation of going to tracing and creating an Application Insight resource." lightbox="../../media/trace/visualize/tracing-setup-overview.gif":::

Next, install the Azure Monitor OpenTelemetry package:

```python
%pip install azure-monitor-opentelemetry
```

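The snippets that follow refer to a `project` client and a `chat` client, which this article assumes you already created when you instrumented your application. As a rough sketch only (the package and method names below are assumptions based on the Azure AI Foundry SDKs; follow [Trace your application with Azure AI Inference SDK](./trace-local-sdk.md) for the authoritative setup), that setup might look like:

```python
# Sketch only: see the tracing how-to article for the supported client setup.
# Assumes the azure-ai-projects, azure-ai-inference, and azure-identity packages.
from azure.identity import DefaultAzureCredential
from azure.ai.projects import AIProjectClient

project = AIProjectClient.from_connection_string(
    conn_str="<your-project-connection-string>",  # placeholder
    credential=DefaultAzureCredential(),
)

# Chat completions client used for the inference call later in this article.
chat = project.inference.get_chat_completions_client()
```
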
Now enable tracing and send the trace data to your project's Application Insights resource. The environment variable turns on recording of prompt and completion contents:

```python
import os
from azure.monitor.opentelemetry import configure_azure_monitor

# Record prompt and completion contents in the emitted spans.
os.environ['AZURE_TRACING_GEN_AI_CONTENT_RECORDING_ENABLED'] = 'true'

# Enable Azure Monitor tracing with the connection string of the
# Application Insights resource attached to the project.
application_insights_connection_string = project.telemetry.get_connection_string()
if not application_insights_connection_string:
    print("Application Insights was not enabled for this project.")
    print("Enable it via the 'Tracing' tab in your AI Studio project page.")
    exit()

configure_azure_monitor(connection_string=application_insights_connection_string)
```

Finally, run an inference call. The call is logged to Azure AI Studio, and this code prints a link to the traces.

```python
# Run a chat completion; the instrumented client logs the call as a trace.
response = chat.complete(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "You are an AI assistant that is a travel planning expert especially with National Parks."},
        {"role": "user", "content": "Hey, can you recommend me trails I should go on when I visit Mount Rainier?"},
    ]
)

# Build a direct link to the project's Tracing page in the portal.
print("View traces at:")
print(f"https://ai.azure.com/tracing?wsid=/subscriptions/{project.scope['subscription_id']}/resourceGroups/{project.scope['resource_group_name']}/providers/Microsoft.MachineLearningServices/workspaces/{project.scope['project_name']}")
```

Select the link and begin viewing traces in Azure AI Studio!

### Debug and filter traces

In your project, you can filter your traces as you see fit.

By selecting a trace, you can step through each span and identify issues while observing how your application is responding.

:::image type="content" source="../../media/trace/visualize/debug-filter-tracing.gif" alt-text="Animation of filtering traces in the portal." lightbox="../../media/trace/visualize/debug-filter-tracing.gif":::

### Update your attached Application Insights resource

To update the Application Insights resource that is attached to your project, go to **Manage data source** and select **Edit** to switch to a new Application Insights resource.

:::image type="content" source="../../media/trace/visualize/tracing-manage-data-source.png" alt-text="Screenshot of manage data sources pop-up highlighting the edit button." lightbox="../../media/trace/visualize/tracing-manage-data-source.png":::

## View your traces in Azure Monitor

If you logged traces using the previous code snippet, then you're all set to view your traces in Azure Monitor Application Insights. You can open Application Insights from **Manage data source** and use the **End-to-end transaction details** view to investigate further.

For more information on how to send Azure AI Inference traces to Azure Monitor and create an Azure Monitor resource, see the [Azure Monitor OpenTelemetry documentation](/azure/azure-monitor/app/opentelemetry-enable).

### View your generative AI spans and traces

From your Azure AI Studio project, you can also open a custom dashboard that provides insights specifically to help you monitor your generative AI application.

In this Azure Workbook, you can view your Gen AI spans and jump into the Azure Monitor **End-to-end transaction details** view to deep dive and investigate.

To learn more about using this workbook to monitor your application, see the [Azure Workbook documentation](/azure/azure-monitor/visualize/workbooks-create-workbook).

## Related content

- [Trace your application with Azure AI Inference SDK](./trace-local-sdk.md)
3 binary files changed (19.5 MB, 185 KB, and 12.5 MB); image previews not shown.

articles/ai-studio/toc.yml

Lines changed: 8 additions & 2 deletions
```diff
@@ -305,9 +305,15 @@ items:
   href: how-to/develop/langchain.md
 - name: Develop with LlamaIndex
   href: how-to/develop/llama-index.md
-- name: Trace your application with prompt flow
-  href: how-to/develop/trace-local-sdk.md
   displayName: code,sdk
+- name: Trace generative AI apps
+  items:
+  - name: Tracing overview
+    href: concepts/trace.md
+  - name: Trace your application with Azure AI Inference SDK
+    href: how-to/develop/trace-local-sdk.md
+  - name: Visualize your traces
+    href: how-to/develop/visualize-traces.md
 - name: Evaluate generative AI apps
   items:
   - name: Evaluations concepts
```
