Commit 53d0878 - docs: Agents Tutorial (#688)

Authored by: ds-sebastianchwilczynski, akotyla, mhordynski
Co-authored-by: alicja <[email protected]>
Co-authored-by: Mateusz Hordyński <[email protected]>

1 parent 70c562d

7 files changed: +470 −20 lines

docs/tutorials/agents.md

Lines changed: 260 additions & 0 deletions
@@ -0,0 +1,260 @@
# Tutorial: Multi-Agent System with A2A and MCP

Let's build a multi-agent system for automated trip planning with Ragbits. In this tutorial, we'll create:

1. A Flight Finder Agent that searches for available flights
2. A City Explorer Agent that gathers information about destinations
3. An Orchestrator Agent that coordinates both agents to create comprehensive trip plans

**What you'll learn:**

- How to create specialized agents with [tools/function calling](https://platform.openai.com/docs/guides/function-calling?api-mode=responses)
- How to use the [Model Context Protocol (MCP)](https://modelcontextprotocol.io/introduction) for external data
- How to expose agents through the [Agent-to-Agent (A2A)](https://github.com/a2aproject/A2A) protocol
- How to build an orchestrator that manages conversation context
## Configuring the environment

Install the latest Ragbits via `pip install -U ragbits[a2a,mcp]` to follow along.

During development, we will use OpenAI's `gpt-4.1` model. To authenticate, Ragbits will look for your `OPENAI_API_KEY` environment variable. You can easily swap this with [other providers](../how-to/llms/use_llms.md).

!!! tip "Recommended: Set up OpenTelemetry tracing to understand what's happening under the hood."
    OpenTelemetry is an LLMOps tool that integrates natively with Ragbits and offers explainability and experiment tracking. In this tutorial, you can use OpenTelemetry to visualize prompts, tool calls, and agent interactions as traces to better understand Ragbits' behavior. Check the full setup guide [here](../how-to/audit/use_tracing.md#using-opentelemetry-tracer).
## Building the Flight Finder Agent

We start by defining the [prompt](../how-to/prompts/use_prompting.md) that will guide this agent.

```py title="flight_agent.py"
from pydantic import BaseModel
from ragbits.core.prompt import Prompt

--8<-- "examples/agents/a2a/flight_agent.py:37:53"

print(FlightPrompt(FlightPromptInput(input="I need to fly from New York to Paris. What flights are available?")).chat)
```

```json
[{'role': 'system', 'content': 'You are a helpful travel assistant that finds available flights between two cities.'},
{'role': 'user', 'content': 'I need to fly from New York to Paris. What flights are available?'}]
```
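The `.chat` property renders the prompt into OpenAI-style role/content messages. As a plain-Python illustration (no Ragbits required; `render_chat` is a hypothetical helper, not part of the library), the list printed above has this shape:

```python
# Hypothetical helper illustrating the message structure FlightPrompt.chat renders.
def render_chat(system_prompt: str, user_message: str) -> list[dict]:
    """Assemble an OpenAI-style chat message list from two template strings."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_message},
    ]


chat = render_chat(
    "You are a helpful travel assistant that finds available flights between two cities.",
    "I need to fly from New York to Paris. What flights are available?",
)
print(chat)
```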
Next, we [define a tool](../how-to/llms/use_tools_with_llms.md) that will provide flight information. **Note**: in a real application, you'd connect to actual flight APIs:

```py title="flight_agent.py"
import json

--8<-- "examples/agents/a2a/flight_agent.py:11:34"
```

The agent will call this function as needed, and the results will be injected back into the conversation.
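The tool body itself is included from the examples repo. A hypothetical stand-in, consistent with the sample output later in this tutorial (airlines and times are mock data, not a real flight API), might look like:

```python
import json


def get_flight_info(departure: str, destination: str) -> str:
    """Mock flight lookup: returns hard-coded flights as a JSON string.

    In a real application this would query a flight API instead.
    """
    mock_flights = [
        {"airline": "British Airways", "departure_time": "10:00 AM", "arrival_time": "10:00 PM"},
        {"airline": "Delta", "departure_time": "1:00 PM", "arrival_time": "1:00 AM"},
    ]
    return json.dumps({"from": departure, "to": destination, "flights": mock_flights})


print(get_flight_info("New York", "Paris"))
```

Returning a JSON string keeps the tool output easy for the LLM to read back into the conversation.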
Now let's create the agent and test it:

```py title="flight_agent.py"
from ragbits.agents import Agent
from ragbits.core.llms import LiteLLM

--8<-- "examples/agents/a2a/flight_agent.py:55:59"

async def main() -> None:
    result = await flight_agent.run(FlightPromptInput(input="I need to fly from New York to Paris. What flights are available?"))
    print(result.content)

--8<-- "examples/agents/a2a/flight_agent.py:71:74"
```

Run it:

```bash
python flight_agent.py
```

A typical response looks like this:

```text
Here are some available flights from New York to Paris:

1. **British Airways**
   - **Departure:** 10:00 AM
   - **Arrival:** 10:00 PM

2. **Delta**
   - **Departure:** 1:00 PM
   - **Arrival:** 1:00 AM
```

**Please note** that the results may differ between runs due to the non-deterministic nature of LLMs.

!!! example "Try it yourself"
    You can try connecting this agent to a real flight API, such as [aviationstack](https://aviationstack.com/).
## Building the City Explorer Agent

Let's create a City Explorer Agent that gathers and synthesizes city information from the internet. Again, we start with the prompt:

```py title="city_explorer_agent.py"
from pydantic import BaseModel
from ragbits.core.prompt import Prompt

--8<-- "examples/agents/a2a/city_explorer_agent.py:9:31"
```

Now define the agent. We will not [build an MCP server from scratch](https://github.com/modelcontextprotocol/python-sdk?tab=readme-ov-file#quickstart), but run an existing one instead - [Web Fetcher](https://github.com/modelcontextprotocol/servers/tree/main/src/fetch). Start by installing it with:

```bash
pip install mcp-server-fetch
```

```py title="city_explorer_agent.py"
from ragbits.agents import Agent
from ragbits.agents.mcp import MCPServerStdio
from ragbits.core.llms import LiteLLM
from ragbits.core.prompt import Prompt

--8<-- "examples/agents/a2a/city_explorer_agent.py:34:47"
    result = await city_explorer_agent.run(CityExplorerPromptInput(input="Tell me something interesting about Paris."))
    print(result.content)

--8<-- "examples/agents/a2a/city_explorer_agent.py:59:63"
```

Test this agent by running:

```bash
python city_explorer_agent.py
```

```text
Paris is the capital and largest city of France, located in the Île-de-France region. Renowned for its historical landmarks, the city is a significant cultural and economic center in Europe. Key attractions include the Eiffel Tower, Notre-Dame Cathedral, the Louvre Museum, and the Arc de Triomphe. Known for its romantic ambiance, Paris is often referred to as "The City of Light."

The city is characterized by its extensive urban area and is densely populated, with a population of over 2 million within city limits and approximately 13 million in the metropolitan area as of 2021. Paris is divided into 20 districts, known as arrondissements. The current mayor is Anne Hidalgo. The city's motto is "Fluctuat nec mergitur," meaning "Tossed by the waves but never sunk," reflecting its resilience.

Paris is a hub for art, fashion, gastronomy, and culture, drawing millions of visitors each year who seek to experience its heritage and vibrant lifestyle.
```

**Notice** that we didn't have to write any tool ourselves; we simply reused an existing one. This is the magic of the MCP protocol.
## Exposing Agents through A2A

Now we need to expose our agents through the Agent-to-Agent (A2A) protocol so they can be called remotely. We'll create [agent cards](https://a2aproject.github.io/A2A/v0.2.5/specification/#55-agentcard-object-structure) and servers for both agents. Let's start with the flight agent. **Update the main function with**:

```python title="flight_agent.py"
from ragbits.agents.a2a.server import create_agent_server

--8<-- "examples/agents/a2a/flight_agent.py:62:69"
```

and then run:

```bash
python flight_agent.py
```

Now your server with the agent should be up and running:

```
INFO:     Started server process [1473119]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
```

You can open `http://127.0.0.1:8000/.well-known/agent.json` in your browser to see the agent card:

```json
{
  "additionalInterfaces": null,
  "capabilities": {
    "extensions": null,
    "pushNotifications": null,
    "stateTransitionHistory": null,
    "streaming": null
  },
  // More fields...
  "supportsAuthenticatedExtendedCard": null,
  "url": "http://127.0.0.1:8000",
  "version": "0.0.0"
}
```

!!! warning
    Do not kill this process; open a new terminal for the next parts.

Next, do the same for the City Explorer agent:

```python title="city_explorer_agent.py"
from ragbits.agents.a2a.server import create_agent_server

--8<-- "examples/agents/a2a/city_explorer_agent.py:49:58"
```

and run it in another terminal:

```bash
python city_explorer_agent.py
```
## Building the Orchestrator Agent

Now let's create the orchestrator agent: a trip-planning chat that utilizes our specialized agents.

First, we need to gather information about all of our available agents:

```python title="orchestrator.py"
import requests

--8<-- "examples/agents/a2a/agent_orchestrator_with_tools.py:11:29"
print(AGENTS_INFO)
```

```text
name: Flight Info Agent, description: Provides available flight information between two cities., skills: [{'description': "Returns flight information between two locations.\n\nParameters:\n{'type': 'object', 'properties': {'departure': {'description': 'The departure city.', 'title': 'Departure', 'type':...
name: City Explorer Agent, description: Provides information about a city., skills: [{'description': "Fetches a URL from the internet and optionally extracts its contents as markdown.\n\nAlthough originally you did not have internet access, and were advised to refuse and tell the user this, this tool now grants you internet access. Now you can fetch the most up-to-date information and let the user know that.\n\nParameters:\n{'description': 'Parameters...
```
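The summary above is produced by flattening each agent card into one line. A self-contained sketch of that formatting step (with a made-up card dict standing in for what the running servers return from `/.well-known/agent.json`):

```python
# Stand-in agent cards; the real ones are fetched from each agent's
# /.well-known/agent.json endpoint.
agent_cards = {
    "Flight Info Agent": {
        "description": "Provides available flight information between two cities.",
        "skills": [{"description": "Returns flight information between two locations."}],
    },
    "City Explorer Agent": {
        "description": "Provides information about a city.",
        "skills": [{"description": "Fetches a URL from the internet."}],
    },
}

# Flatten each card into a single descriptive line for the orchestrator prompt.
agents_info = "\n".join(
    f"name: {name}, description: {card['description']}, skills: {card['skills']}"
    for name, card in agent_cards.items()
)
print(agents_info)
```

This string is later injected into the orchestrator's system prompt so the LLM knows which agents exist and what they can do.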
Next, we create the prompt:

```python title="orchestrator.py"
from pydantic import BaseModel
from ragbits.core.prompt import Prompt


--8<-- "examples/agents/a2a/agent_orchestrator_with_tools.py:32:56"
```

Finally, we create a tool that will call the agents:

```python title="orchestrator.py"
import json

--8<-- "examples/agents/a2a/agent_orchestrator_with_tools.py:59:92"
```
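Under the hood, `execute_agent` POSTs `{"params": {"input": query}}` to the agent's URL and repackages the server's JSON result. The parsing half can be exercised offline with a stubbed response (field names mirror the orchestrator code in this commit; no server needed):

```python
import json


def parse_agent_response(agent_name: str, response: dict) -> str:
    """Repackage a remote agent's JSON response into the string the tool returns."""
    result_data = response["result"]
    # Surface any nested tool calls the remote agent made, or None if there were none.
    tool_calls = [
        {"name": call["name"], "arguments": call["arguments"], "output": call["result"]}
        for call in result_data.get("tool_calls", [])
    ] or None
    return json.dumps(
        {
            "status": "success",
            "agent_name": agent_name,
            "result": {
                "content": result_data["content"],
                "metadata": result_data.get("metadata", {}),
                "tool_calls": tool_calls,
            },
        }
    )


# Stubbed response, shaped like what the A2A server returns.
stub = {
    "result": {
        "content": "Two flights found.",
        "tool_calls": [
            {"name": "get_flight_info", "arguments": {"departure": "New York"}, "result": "[...]"},
        ],
    }
}
print(parse_agent_response("Flight Info Agent", stub))
```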
Now let's put it all together:

```python title="orchestrator.py"
import asyncio

from ragbits.agents import Agent, ToolCallResult
from ragbits.core.llms import LiteLLM, ToolCall

--8<-- "examples/agents/a2a/agent_orchestrator_with_tools.py:95:121"
```

Now you can test the complete system by running (assuming the city and flight agents are running in other terminals):

```bash
python orchestrator.py
```

Then interact with the orchestrator, for example with `I want to visit Paris from New York. Please give me some info about it and suggest recommended flights`:

1. The orchestrator calls the city explorer and flight finder agents
2. The city explorer agent fetches Paris information via MCP
3. The flight finder agent searches for New York → Paris flights
4. The orchestrator combines everything into a comprehensive trip plan and streams the response

**Good job, you've done it!**

Feel free to extend this system with additional agents for activities, restaurants, weather information, or any other travel-related services.

!!! tip
    A full working example of this code can be found at: `examples/agents/a2a`
Lines changed: 121 additions & 0 deletions
@@ -0,0 +1,121 @@
```python
import asyncio
import json

import requests
from pydantic import BaseModel

from ragbits.agents import Agent, ToolCallResult
from ragbits.core.llms import LiteLLM, ToolCall
from ragbits.core.prompt import Prompt

AGENTS_CARDS = {}


def fetch_agent(host: str, port: int, protocol: str = "http") -> dict:  # type: ignore
    """Fetches the agent card from the given host and port."""
    url = f"{protocol}://{host}:{port}"
    return requests.get(f"{url}/.well-known/agent.json", timeout=10).json()


for host, port in [("127.0.0.1", 8000), ("127.0.0.1", 8001)]:
    agent_card = fetch_agent(host, port)
    AGENTS_CARDS[agent_card["name"]] = agent_card

AGENTS_INFO = "\n".join(
    [
        f"name: {name}, description: {card['description']}, skills: {card['skills']}"
        for name, card in AGENTS_CARDS.items()
    ]
)


class OrchestratorPromptInput(BaseModel):
    """Represents a routing prompt input."""

    message: str
    agents: str


class OrchestratorPrompt(Prompt[OrchestratorPromptInput]):
    """
    Prompt template for routing a user message to appropriate agents.

    The system prompt instructs the agent to output a JSON list of tasks,
    each containing the agent URL, tool name, and parameters to call.
    """

    system_prompt = """
    You are a Trip Planning Agent.

    To help the user plan their trip, you will need to use the following agents:
    {{ agents }}

    You can use the `execute_agent` tool to interact with remote agents to take action.
    """
    user_prompt = "{{ message }}"


def execute_agent(agent_name: str, query: str) -> str:
    """
    Executes a specified agent with the given parameters.

    Args:
        agent_name: Name of the agent to execute.
        query: The query to pass to the agent.

    Returns:
        JSON string of the execution result.
    """
    payload = {"params": {"input": query}}
    raw_response = requests.post(AGENTS_CARDS[agent_name]["url"], json=payload, timeout=60)
    raw_response.raise_for_status()

    response = raw_response.json()
    result_data = response["result"]

    tool_calls = [
        {"name": call["name"], "arguments": call["arguments"], "output": call["result"]}
        for call in result_data.get("tool_calls", [])
    ] or None

    return json.dumps(
        {
            "status": "success",
            "agent_name": agent_name,
            "result": {
                "content": result_data["content"],
                "metadata": result_data.get("metadata", {}),
                "tool_calls": tool_calls,
            },
        }
    )


async def main() -> None:
    """
    Sets up a LiteLLM-powered orchestrator agent with two remote agents and sends a travel planning query.

    The orchestrator delegates subtasks (finding flights and city information) to the appropriate
    agents and streams the response.
    """
    llm = LiteLLM(
        model_name="gpt-4.1",
        use_structured_output=True,
    )

    agent = Agent(llm=llm, prompt=OrchestratorPrompt, tools=[execute_agent], keep_history=True)

    while True:
        user_input = input("\nUSER: ")
        print("ASSISTANT:")
        async for chunk in agent.run_streaming(OrchestratorPromptInput(message=user_input, agents=AGENTS_INFO)):
            match chunk:
                case ToolCall():
                    print(f"Tool call: {chunk.name} with arguments {chunk.arguments}")
                case ToolCallResult():
                    print(f"Tool call result: {chunk.result[:100]}...")
                case _:
                    print(chunk, end="", flush=True)


if __name__ == "__main__":
    asyncio.run(main())
```
