Commit 04f6250 (parent 1ab6803), "doc: Add README.MD": 2 files changed, +558 −0

# Example: Using a2a-python SDK Without an LLM Framework

This repository demonstrates how to set up and use the [a2a-python SDK](https://github.com/google/a2a-python) to create a simple server and client, without relying on any large language model (LLM) framework.

## Overview

- **A2A (Agent-to-Agent):** A protocol and SDK for building interoperable AI agents.
- **This example:** Shows how to run a basic A2A server and client, exchange messages, and view the response.
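
Under the hood, A2A messages travel as JSON-RPC 2.0 requests over HTTP. The SDK builds these payloads for you; the sketch below is a rough illustration based on the public A2A protocol draft, not the SDK's exact API, and field names may differ between protocol versions:

```python
import json
import uuid


def build_message_send_request(question: str) -> dict:
    """Build an A2A-style JSON-RPC 2.0 request carrying a user message.

    Illustrative only: the a2a-python SDK constructs an equivalent payload
    internally, and the exact field names depend on the protocol version.
    """
    return {
        "jsonrpc": "2.0",
        "id": str(uuid.uuid4()),
        "method": "message/send",
        "params": {
            "message": {
                "role": "user",
                "parts": [{"kind": "text", "text": question}],
                "messageId": str(uuid.uuid4()),
            }
        },
    }


if __name__ == "__main__":
    print(json.dumps(build_message_send_request("What is A2A protocol?"), indent=2))
```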

## Prerequisites

- Python 3.13+
- [uv](https://github.com/astral-sh/uv) for fast dependency management and running
- A Gemini API key, set as the `GEMINI_API_KEY` environment variable

## Installation

1. **Clone the repository:**

   ```bash
   git clone <this-repo-url>
   cd <repo-directory>
   ```

2. **Install dependencies:**

   ```bash
   uv pip install -e .
   ```

3. **Set environment variables:**

   ```bash
   export GEMINI_API_KEY=your-gemini-api-key
   ```

   Or create a `.env` file containing:

   ```
   GEMINI_API_KEY=your-gemini-api-key
   ```
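
If you want your own code to fail fast when the key is missing, a small helper like the one below works; `require_env` is a hypothetical utility written for this README, not part of the repository:

```python
import os


def require_env(name: str) -> str:
    """Return the value of an environment variable, or raise with a clear hint."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(
            f"{name} is not set; export it in your shell or add it to your .env file"
        )
    return value
```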

## Running the Example

### 1. Start the Server

```bash
uv run --env-file .env python -m src.no_llm_framework.server.__main__
```

- The server will start on port 9999.
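
To confirm from Python that the server is actually accepting connections, a generic TCP probe like this (not part of the repo) is enough:

```python
import socket


def is_listening(host: str = "localhost", port: int = 9999, timeout: float = 1.0) -> bool:
    """Return True if something is accepting TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```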

### 2. Run the Client

In a new terminal:

```bash
uv run --env-file .env python -m src.no_llm_framework.client --question "What is A2A protocol?"
```

- The client will connect to the server and send a request.

### 3. View the Response

- The response from the client will be saved to [response.xml](./response.xml).
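
The exact structure of `response.xml` depends on the client; once you have inspected its tags, the standard library is enough to read it. A sketch, using a made-up `<answer>` element rather than the file's real schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical response shape; adapt the tag names after inspecting
# the real response.xml produced by the client.
sample = "<response><answer>A2A is an agent-to-agent protocol.</answer></response>"
root = ET.fromstring(sample)
print(root.find("answer").text)
```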

## File Structure

- `src/no_llm_framework/server/`: Server implementation.
- `src/no_llm_framework/client/`: Client implementation.
- `response.xml`: Example response from the client.

## Troubleshooting

- **Missing dependencies:** Make sure `uv` is installed, then rerun `uv pip install -e .`.
- **API key errors:** Ensure `GEMINI_API_KEY` is set correctly in your shell or `.env` file.
- **Port conflicts:** Make sure port 9999 is free before starting the server.
