
Commit e853525: Samples for tracing
1 parent c8c4f33

12 files changed: +918 -0 lines changed
Lines changed: 62 additions & 0 deletions
@@ -0,0 +1,62 @@
---
page_type: sample
languages:
- python
products:
- ai-services
- azure-openai
description: Pure LangChain weather assistant with Azure tracing and manual tool-calling loop.
---

## Pure LangChain Weather (Tracing)

### Overview

This sample demonstrates a pure LangChain agent that uses a manual tool-calling loop, instrumented with Azure Application Insights via `langchain-azure-ai`. It calls Azure OpenAI chat models and a simple `get_weather` tool, and emits OpenTelemetry traces locally or to Azure Monitor.

### Objective

- Use `langchain` and `langchain-openai` with Azure OpenAI chat models.
- Add tracing via `langchain-azure-ai` and OpenTelemetry.
- Implement a manual tool-calling loop for clarity and control.

### Programming Languages

- Python

### Estimated Runtime: 10 mins

## Set up

Create and activate a local virtual environment, then install dependencies:

```
python -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate
pip install -r requirements.txt
```

Copy the environment template and set the required variables (a quick sanity check follows the lists below):

```
cp .env.sample .env
```

Required:

- `AZURE_OPENAI_API_KEY`
- `AZURE_OPENAI_ENDPOINT`
- `AZURE_OPENAI_DEPLOYMENT`
- `AZURE_OPENAI_API_VERSION` (optional; defaults to `2024-02-15-preview`)

Optional for tracing:

- `APPLICATION_INSIGHTS_CONNECTION_STRING`
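
Before running, you can confirm the variables are actually visible to Python. This is a quick sanity-check sketch (not part of the sample) that uses `python-dotenv`, which is already listed in `requirements.txt`:

```
from dotenv import load_dotenv
import os

load_dotenv()  # reads .env from the current directory

required = ["AZURE_OPENAI_API_KEY", "AZURE_OPENAI_ENDPOINT", "AZURE_OPENAI_DEPLOYMENT"]
missing = [name for name in required if not os.getenv(name)]
print("Missing:", ", ".join(missing) if missing else "none")
```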

## Run

```
python weather.py
```

You can send traces to an OTLP-compatible backend by setting `OTEL_EXPORTER_OTLP_ENDPOINT`, or to Azure Monitor by setting `APPLICATION_INSIGHTS_CONNECTION_STRING`.
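
The sample wires its own tracing inside `weather.py`; purely as an illustration, a local OTLP exporter set up with the OpenTelemetry SDK packages from `requirements.txt` might look like the sketch below (the exporter honors `OTEL_EXPORTER_OTLP_ENDPOINT`, defaulting to `http://localhost:4317`):

```
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

# Export spans to whatever OTEL_EXPORTER_OTLP_ENDPOINT points at.
provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))
trace.set_tracer_provider(provider)
```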
Lines changed: 9 additions & 0 deletions
@@ -0,0 +1,9 @@
# Full dev setup (optional)
langchain
langchain-openai
langchain-azure-ai
opentelemetry-api
opentelemetry-sdk
opentelemetry-exporter-otlp
python-dotenv
azure-monitor-opentelemetry
Lines changed: 7 additions & 0 deletions
@@ -0,0 +1,7 @@
langchain
langchain-openai
langchain-azure-ai
opentelemetry-api
opentelemetry-sdk
opentelemetry-exporter-otlp
python-dotenv
Lines changed: 217 additions & 0 deletions
@@ -0,0 +1,217 @@
"""
LangChain Weather Assistant with manual tool-calling loop + Azure tracing.

Env vars required:
    AZURE_OPENAI_API_KEY=...
    AZURE_OPENAI_ENDPOINT=https://YOUR-RESOURCE.openai.azure.com
    AZURE_OPENAI_DEPLOYMENT=yourDeploymentName
    AZURE_OPENAI_API_VERSION=2024-02-15-preview (or compatible)

Optional tracing:
    APPLICATION_INSIGHTS_CONNECTION_STRING=InstrumentationKey=...;IngestionEndpoint=...

Run:
    python weather.py
"""

import os
import json
import logging
from datetime import datetime
from typing import List, Any, Optional, Dict

from langchain_core.tools import tool
from langchain_core.messages import (
    SystemMessage,
    HumanMessage,
    AIMessage,
    ToolMessage,
    BaseMessage,
)
from langchain_openai import AzureChatOpenAI

try:
    from langchain_azure_ai.callbacks.tracers import AzureAIInferenceTracer
except ImportError:
    AzureAIInferenceTracer = None

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("langchain_weather")


# -----------------------------------------------------------------------------
# Tracing Setup (cached)
# -----------------------------------------------------------------------------
_TRACERS: Optional[List[Any]] = None


def setup_tracing() -> List[Any]:
    global _TRACERS
    if _TRACERS is not None:
        return _TRACERS
    tracers: List[Any] = []
    conn = os.getenv("APPLICATION_INSIGHTS_CONNECTION_STRING")
    if conn and AzureAIInferenceTracer:
        try:
            tracer = AzureAIInferenceTracer(
                connection_string=conn,
                enable_content_recording=True,
                name="langchain_weather",
                id="weather_agent",
                endpoint="weather_single",
                scope="Pure LangChain Weather",
            )
            tracers.append(tracer)
            logger.info("Azure tracing enabled.")
        except Exception as e:
            logger.warning(f"Failed to init tracer: {e}")
    else:
        logger.info("Tracing not enabled (missing APPLICATION_INSIGHTS_CONNECTION_STRING or dependency).")
    _TRACERS = tracers
    return tracers


def trace_config(agent_name: str, session_id: str) -> Dict[str, Any]:
    tracers = setup_tracing()
    return {
        "callbacks": tracers,
        "tags": [f"agent:{agent_name}", agent_name, "weather-langchain"],
        "metadata": {
            "agent_name": agent_name,
            "agent_type": agent_name,
            "langgraph_node": agent_name,  # kept for parity
            "session_id": session_id,
            "thread_id": session_id,
            "system": "langchain-weather",
        },
    }


# -----------------------------------------------------------------------------
# Tool
# -----------------------------------------------------------------------------
@tool
def get_weather(location: str, date: Optional[str] = None) -> str:
    """
    Return a mock weather forecast as JSON.
    """
    if not date:
        date = datetime.utcnow().strftime("%Y-%m-%d")
    seed = sum(ord(c) for c in location.lower()) % 5
    conditions = ["Sunny", "Partly Cloudy", "Light Rain", "Overcast", "Showers"]
    cond = conditions[seed]
    forecast = {
        "location": location,
        "date": date,
        "condition": cond,
        "temp_high_c": 24 + seed,
        "temp_low_c": 14 + seed,
        "advice": "Great day outside!" if cond == "Sunny" else "Plan for changing conditions.",
    }
    return json.dumps(forecast, indent=2)


TOOLS = [get_weather]
TOOLS_BY_NAME = {t.name: t for t in TOOLS}


# -----------------------------------------------------------------------------
# LLM Factory
# -----------------------------------------------------------------------------
def build_llm(session_id: str) -> AzureChatOpenAI:
    required = [
        "AZURE_OPENAI_API_KEY",
        "AZURE_OPENAI_ENDPOINT",
        "AZURE_OPENAI_DEPLOYMENT",
    ]
    missing = [v for v in required if not os.getenv(v)]
    if missing:
        raise RuntimeError(f"Missing Azure OpenAI env vars: {', '.join(missing)}")
    return AzureChatOpenAI(
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        azure_deployment=os.environ["AZURE_OPENAI_DEPLOYMENT"],
        api_version=os.getenv("AZURE_OPENAI_API_VERSION", "2024-02-15-preview"),
        temperature=0.2,
        callbacks=setup_tracing(),
        tags=["weather_agent", "weather-langchain"],
        metadata={
            "agent_type": "weather_agent",
            "agent_name": "weather_agent",
            "system": "langchain-weather",
            "session_id": session_id,
            "thread_id": session_id,
        },
    )


SYSTEM_PROMPT = """You are a weather assistant.
If user asks about weather, call the get_weather tool with (location, date if given).
If ambiguous date, assume tomorrow.
After tool output, summarize succinctly for the user.
"""


# -----------------------------------------------------------------------------
# Agent Loop (manual)
# -----------------------------------------------------------------------------
def run_weather_conversation(user_query: str, session_id: str) -> str:
    llm = build_llm(session_id)
    # Bind tools for tool-calling (function-calling) capability
    tool_llm = llm.bind_tools(TOOLS)

    messages: List[BaseMessage] = [
        SystemMessage(content=SYSTEM_PROMPT),
        HumanMessage(content=user_query),
    ]

    # We allow up to N reasoning/tool steps (simple guard)
    for step in range(5):
        logger.info(f"LLM step {step + 1}")
        response: AIMessage = tool_llm.invoke(messages, config=trace_config("weather_agent", session_id))
        messages.append(response)

        # If the model decided not to call any tools, we stop
        tool_calls = getattr(response, "tool_calls", None)
        if not tool_calls:
            logger.info("No tool calls; finishing.")
            break

        # Execute each tool call and append ToolMessage
        for tc in tool_calls:
            name = tc["name"]
            args = tc.get("args", {})
            tool_obj = TOOLS_BY_NAME.get(name)
            if not tool_obj:
                tool_output = f"Tool '{name}' not found."
            else:
                try:
                    tool_output = tool_obj.invoke(args)
                except Exception as e:
                    tool_output = f"Error executing tool '{name}': {e}"
            messages.append(
                ToolMessage(
                    content=tool_output,
                    name=name,
                    tool_call_id=tc["id"],
                )
            )

    # Final answer: last AI message with no tool calls OR last AI message overall
    final_ai = next((m for m in reversed(messages) if isinstance(m, AIMessage)), None)
    return final_ai.content if final_ai else "No AI response."


def main() -> None:
    print("Pure LangChain Weather (Instrumented)")
    q = input("Ask a weather question (e.g. 'Weather in Tokyo tomorrow'): ").strip()
    if not q:
        q = "Weather in Paris"
    session_id = f"lc-session-{datetime.utcnow().strftime('%Y%m%d%H%M%S')}"
    answer = run_weather_conversation(q, session_id)
    print("\n--- Answer ---")
    print(answer)


if __name__ == "__main__":
    main()
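
For reference, if you want to drive the loop from another script instead of the interactive prompt, a hypothetical usage sketch (importing the module above as `weather`) could look like this:

```
from datetime import datetime

from weather import run_weather_conversation  # assumes weather.py is on the import path

session_id = f"lc-session-{datetime.utcnow().strftime('%Y%m%d%H%M%S')}"
print(run_weather_conversation("Weather in Tokyo tomorrow", session_id))
```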
Lines changed: 62 additions & 0 deletions
@@ -0,0 +1,62 @@
---
page_type: sample
languages:
- python
products:
- ai-services
- azure-openai
description: Pure LangGraph weather workflow with Azure tracing and single-node tool execution.
---

## Pure LangGraph Weather (Tracing)

### Overview

This sample demonstrates a LangGraph single-node workflow that calls Azure OpenAI chat models and a `get_weather` tool, with traces exported via `langchain-azure-ai` to local OTLP endpoints or Azure Monitor.

### Objective

- Use `langgraph` with `langchain` and Azure OpenAI.
- Instrument with `langchain-azure-ai` and OpenTelemetry for tracing.
- Stream steps and inspect final state in a simple weather flow.

### Programming Languages

- Python

### Estimated Runtime: 10 mins

## Set up

Create and activate a local virtual environment, then install dependencies:

```
python -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate
pip install -r requirements.txt
```

Copy the environment template and set required variables:

```
cp .env.sample .env
```

Required:

- `AZURE_OPENAI_API_KEY`
- `AZURE_OPENAI_ENDPOINT`
- `AZURE_OPENAI_DEPLOYMENT`
- `AZURE_OPENAI_API_VERSION` (optional; defaults to `2024-02-15-preview`)

Optional for tracing:

- `APPLICATION_INSIGHTS_CONNECTION_STRING`

## Run

```
python weather.py
```

Optionally set `OTEL_EXPORTER_OTLP_ENDPOINT` for local OTLP backends, or `APPLICATION_INSIGHTS_CONNECTION_STRING` to send traces to Azure Monitor.
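
The workflow itself lives in this sample's `weather.py` (not shown here). As a rough, hypothetical sketch of what a single-node LangGraph weather flow can look like (names and wiring are illustrative, and it omits the tool-execution details):

```
import os

from langchain_openai import AzureChatOpenAI
from langgraph.graph import StateGraph, MessagesState, START, END

# Assumes AZURE_OPENAI_API_KEY is set; AzureChatOpenAI reads it from the environment.
llm = AzureChatOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    azure_deployment=os.environ["AZURE_OPENAI_DEPLOYMENT"],
    api_version=os.getenv("AZURE_OPENAI_API_VERSION", "2024-02-15-preview"),
)

def weather_node(state: MessagesState) -> dict:
    # Single node: call the model and append its reply to the message list.
    return {"messages": [llm.invoke(state["messages"])]}

builder = StateGraph(MessagesState)
builder.add_node("weather", weather_node)
builder.add_edge(START, "weather")
builder.add_edge("weather", END)
graph = builder.compile()

# Stream the state after each step; the last value holds the final answer.
final_state = {}
for final_state in graph.stream(
    {"messages": [("user", "Weather in Paris tomorrow?")]}, stream_mode="values"
):
    pass

print(final_state["messages"][-1].content)
```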
Lines changed: 10 additions & 0 deletions
@@ -0,0 +1,10 @@
# Full dev setup (optional)
langchain
langchain-openai
langchain-azure-ai
langgraph
opentelemetry-api
opentelemetry-sdk
opentelemetry-exporter-otlp
python-dotenv
azure-monitor-opentelemetry
Lines changed: 8 additions & 0 deletions
@@ -0,0 +1,8 @@
langchain
langchain-openai
langchain-azure-ai
langgraph
opentelemetry-api
opentelemetry-sdk
opentelemetry-exporter-otlp
python-dotenv
