# ADK Middleware for AG-UI Protocol

This Python middleware enables [Google ADK](https://google.github.io/adk-docs/) agents to be used with the AG-UI Protocol, providing a bridge between the two frameworks.

## Prerequisites

The examples use ADK agents with various Gemini models along with the AG-UI Dojo.

- A [Gemini API key](https://makersuite.google.com/app/apikey). The examples assume it is exported via the `GOOGLE_API_KEY` environment variable.

## Quick Start

To use this integration you need to:

1. Clone the [AG-UI repository](https://github.com/ag-ui-protocol/ag-ui).

   ```bash
   git clone https://github.com/ag-ui-protocol/ag-ui.git
   ```

2. Change to the `typescript-sdk/integrations/adk-middleware` directory.

   ```bash
   cd typescript-sdk/integrations/adk-middleware
   ```

3. Install the `adk-middleware` package from the local directory, for example:

   ```bash
   pip install .
   ```

   or

   ```bash
   uv pip install .
   ```

   This installs the package from the current directory, which contains:

   - `src/adk_middleware/` - The middleware source code
   - `examples/` - Example servers and agents
   - `tests/` - Test suite

4. Install the requirements for the `examples`, for example:

   ```bash
   uv pip install -r requirements.txt
   ```

5. Run the example FastAPI server.

   ```bash
   export GOOGLE_API_KEY=<your API key>
   cd examples
   uv sync
   uv run dev
   ```

6. Open another terminal in the root directory of the `ag-ui` repository clone.

7. Start the integration AG-UI Dojo:

   ```bash
   cd typescript-sdk
   pnpm install && pnpm run dev
   ```

8. Visit [http://localhost:3000/adk-middleware](http://localhost:3000/adk-middleware).

9. Select the `ADK Middleware` view from the sidebar.

### Development Setup

If you want to contribute to ADK Middleware development, you'll need to take some additional steps. You can either use the following script or follow the manual development setup.

```bash
# From the adk-middleware directory
chmod +x setup_dev.sh
./setup_dev.sh
```

### Manual Development Setup

```bash
# Create virtual environment
python -m venv venv
source venv/bin/activate

# Install this package in editable mode
pip install -e .

# For development (includes testing and linting tools)
pip install -e ".[dev]"
# OR
pip install -r requirements-dev.txt
```

This installs the ADK middleware in editable mode for development.

## Testing

```bash
# Run tests (271 comprehensive tests)
pytest

# With coverage
pytest --cov=src/adk_middleware

# Specific test file
pytest tests/test_adk_agent.py
```

## Usage Options

### Option 1: Direct Usage

```python
from adk_middleware import ADKAgent
from google.adk.agents import Agent

# 1. Create your ADK agent
my_agent = Agent(
    name="assistant",
    instruction="You are a helpful assistant."
)

# 2. Create the middleware with direct agent embedding
agent = ADKAgent(
    adk_agent=my_agent,
    app_name="my_app",
    user_id="user123"
)

# 3. Use directly with an AG-UI RunAgentInput
# (see the Simple Conversation example below for building `input_data`)
async for event in agent.run(input_data):
    print(f"Event: {event.type}")
```
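
The snippet above leaves model selection to ADK's defaults. To pin one of the Gemini models mentioned in the prerequisites, the ADK `Agent` constructor accepts a `model` parameter; the exact model id below is only an illustration:

```python
from google.adk.agents import Agent

# Pin a specific Gemini model instead of relying on the ADK default.
my_agent = Agent(
    name="assistant",
    model="gemini-2.0-flash",  # illustrative model id
    instruction="You are a helpful assistant."
)
```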

### Option 2: FastAPI Server

```python
from fastapi import FastAPI
from adk_middleware import ADKAgent, add_adk_fastapi_endpoint
from google.adk.agents import Agent

# 1. Create your ADK agent
my_agent = Agent(
    name="assistant",
    instruction="You are a helpful assistant."
)

# 2. Create the middleware with direct agent embedding
agent = ADKAgent(
    adk_agent=my_agent,
    app_name="my_app",
    user_id="user123"
)

# 3. Create FastAPI app
app = FastAPI()
add_adk_fastapi_endpoint(app, agent, path="/chat")

# Run with: uvicorn your_module:app --host 0.0.0.0 --port 8000
```
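
Once the server is running, any AG-UI client can talk to the endpoint. The sketch below is a minimal hand-rolled client, assuming the endpoint follows the usual AG-UI transport (a JSON-encoded `RunAgentInput` in the request body, events streamed back as Server-Sent Events); `httpx` and the camelCase field spellings are assumptions, not part of this package:

```python
import asyncio
import json

import httpx  # assumed third-party HTTP client

async def main():
    # Minimal run request; AG-UI serializes RunAgentInput fields in camelCase.
    payload = {
        "threadId": "thread_001",
        "runId": "run_001",
        "messages": [{"id": "1", "role": "user", "content": "Hello!"}],
        "context": [],
        "state": {},
        "tools": [],
        "forwardedProps": {},
    }
    async with httpx.AsyncClient(timeout=None) as client:
        async with client.stream("POST", "http://localhost:8000/chat", json=payload) as response:
            async for line in response.aiter_lines():
                # Each SSE frame carries one AG-UI event on a `data:` line.
                if line.startswith("data:"):
                    print(json.loads(line[len("data:"):]))

asyncio.run(main())
```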

For detailed configuration options, see [CONFIGURATION.md](./CONFIGURATION.md).

## Running the ADK Backend Server for the Dojo App

To run the ADK backend server that works with the Dojo app, use the following command:

```bash
python -m examples.fastapi_server
```

This will start a FastAPI server that connects your ADK middleware to the Dojo application.

## Examples

### Simple Conversation

```python
import asyncio
from adk_middleware import ADKAgent
from google.adk.agents import Agent
from ag_ui.core import RunAgentInput, UserMessage

async def main():
    # Setup
    my_agent = Agent(name="assistant", instruction="You are a helpful assistant.")

    agent = ADKAgent(
        adk_agent=my_agent,
        app_name="demo_app",
        user_id="demo"
    )

    # Create input
    run_input = RunAgentInput(
        thread_id="thread_001",
        run_id="run_001",
        messages=[
            UserMessage(id="1", role="user", content="Hello!")
        ],
        context=[],
        state={},
        tools=[],
        forwarded_props={}
    )

    # Run and handle events
    async for event in agent.run(run_input):
        print(f"Event: {event.type}")
        if hasattr(event, 'delta'):
            print(f"Content: {event.delta}")

asyncio.run(main())
```

### Multi-Agent Setup

```python
from fastapi import FastAPI
from adk_middleware import ADKAgent, add_adk_fastapi_endpoint
from google.adk.agents import Agent

# Define ADK agents with different specialties (example definitions)
general_agent = Agent(name="general", instruction="You are a helpful general assistant.")
technical_agent = Agent(name="technical", instruction="You answer technical questions.")
creative_agent = Agent(name="creative", instruction="You help with creative writing.")

# Create multiple middleware instances, one per ADK agent
general_agent_wrapper = ADKAgent(
    adk_agent=general_agent,
    app_name="demo_app",
    user_id="demo"
)

technical_agent_wrapper = ADKAgent(
    adk_agent=technical_agent,
    app_name="demo_app",
    user_id="demo"
)

creative_agent_wrapper = ADKAgent(
    adk_agent=creative_agent,
    app_name="demo_app",
    user_id="demo"
)

# Use different endpoints for each agent
app = FastAPI()
add_adk_fastapi_endpoint(app, general_agent_wrapper, path="/agents/general")
add_adk_fastapi_endpoint(app, technical_agent_wrapper, path="/agents/technical")
add_adk_fastapi_endpoint(app, creative_agent_wrapper, path="/agents/creative")
```

## Tool Support

The middleware provides complete bidirectional tool support, enabling AG-UI Protocol tools to execute within Google ADK agents. All tools supplied by the client are currently implemented as long-running tools that emit events to the client for execution, and they can be combined with backend tools provided by the agent to form a hybrid toolset.
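
For instance, a client-side tool can be declared on the `RunAgentInput`; the middleware surfaces it to the ADK agent and, when the agent calls it, emits tool-call events for the client to execute rather than running it server-side. A minimal sketch, assuming the `Tool` type exported by `ag_ui.core`:

```python
import asyncio
from adk_middleware import ADKAgent
from google.adk.agents import Agent
from ag_ui.core import RunAgentInput, Tool, UserMessage

async def main():
    my_agent = Agent(name="assistant", instruction="You are a helpful assistant.")
    agent = ADKAgent(adk_agent=my_agent, app_name="demo_app", user_id="demo")

    # Declare a client-side tool; the client, not the server, executes it.
    change_background = Tool(
        name="change_background",
        description="Change the page background color.",
        parameters={
            "type": "object",
            "properties": {"color": {"type": "string"}},
            "required": ["color"],
        },
    )

    run_input = RunAgentInput(
        thread_id="thread_002",
        run_id="run_002",
        messages=[UserMessage(id="1", role="user", content="Make the background blue.")],
        context=[],
        state={},
        tools=[change_background],  # forwarded to the ADK agent as a long-running tool
        forwarded_props={},
    )

    # When the agent decides to call the tool, the event stream carries
    # tool-call events (start/args/end) that the client acts on locally.
    async for event in agent.run(run_input):
        print(event.type)

asyncio.run(main())
```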

For detailed information about tool support, see [TOOLS.md](./TOOLS.md).

## Additional Documentation

- **[CONFIGURATION.md](./CONFIGURATION.md)** - Complete configuration guide
- **[TOOLS.md](./TOOLS.md)** - Tool support documentation
- **[USAGE.md](./USAGE.md)** - Usage examples and patterns
- **[ARCHITECTURE.md](./ARCHITECTURE.md)** - Technical architecture and design details