
Commit 3722064

Rana Umar Majeed authored and committed
Update readme and add setup to run-dojo
1 parent 07c3c6a commit 3722064

File tree

5 files changed (+324 −128 lines)


apps/dojo/scripts/prep-dojo-everything.js

Lines changed: 5 additions & 0 deletions
@@ -97,6 +97,11 @@ const ALL_TARGETS = {
     name: "Pydantic AI",
     cwd: path.join(integrationsRoot, "pydantic-ai/python/examples"),
   },
+  "aws-strands": {
+    command: "poetry install",
+    name: "AWS Strands",
+    cwd: path.join(integrationsRoot, "aws-strands/python/examples"),
+  },
   "adk-middleware": {
     command: "uv sync",
     name: "ADK Middleware",

apps/dojo/scripts/run-dojo-everything.js

Lines changed: 7 additions & 0 deletions
@@ -111,6 +111,12 @@ const ALL_SERVICES = {
     cwd: path.join(integrationsRoot, 'pydantic-ai/python/examples'),
     env: { PORT: 8009 },
   }],
+  'aws-strands': [{
+    command: 'poetry run dev',
+    name: 'AWS Strands',
+    cwd: path.join(integrationsRoot, 'aws-strands/python/examples'),
+    env: { PORT: 8017 },
+  }],
   'adk-middleware': [{
     command: 'uv run dev',
     name: 'ADK Middleware',
@@ -174,6 +180,7 @@ const ALL_SERVICES = {
       A2A_MIDDLEWARE_FINANCE_URL: 'http://localhost:8012',
       A2A_MIDDLEWARE_IT_URL: 'http://localhost:8013',
       A2A_MIDDLEWARE_ORCHESTRATOR_URL: 'http://localhost:8014',
+      AWS_STRANDS_URL: 'http://localhost:8017',
       NEXT_PUBLIC_CUSTOM_DOMAIN_TITLE: 'cpkdojo.local___CopilotKit Feature Viewer',
     },
   }],
Lines changed: 56 additions & 43 deletions
@@ -1,69 +1,82 @@
-# Strands Integration (OpenAI)
+# AWS Strands Example Server
 
-This integration demonstrates how to use Strands Agents SDK with OpenAI models and AG-UI protocol.
+Demo FastAPI server that wires the Strands Agents SDK (Gemini models) into the
+AG-UI protocol. Each route mounts a ready-made agent that showcases different UI
+patterns (vanilla chat, backend tool rendering, shared state, and generative UI).
 
-## Prerequisites
+## Requirements
 
-- Python 3.12 or later
-- Poetry for dependency management
-- OpenAI API key
-- Strands Agents SDK with OpenAI support installed
+- Python 3.12 or 3.13 (the project is pinned to `<3.14`)
+- Poetry 1.8+ (ships with the repo via `curl -sSL https://install.python-poetry.org | python3 -`)
+- Google API key with access to Gemini 2.5 Flash (set as `GOOGLE_API_KEY`)
+- (Optional) AG-UI repo running locally so you can point the Dojo at these routes
 
-## Setup
+## Quick start
 
-1. Install Strands SDK with OpenAI support:
 ```bash
-pip install 'strands-agents[openai]'
+cd integrations/aws-strands/python/examples
+
+# pick a supported interpreter if your global default is 3.14
+poetry env use python3.13
+
+poetry install
 ```
 
-2. Configure OpenAI API key:
+Create a `.env` file in this folder (same dir as `pyproject.toml`) so every
+example can load credentials automatically:
+
 ```bash
-# Set your OpenAI API key (required)
-export OPENAI_API_KEY=your-api-key-here
+GOOGLE_API_KEY=your-gemini-key
+# Optional overrides
+PORT=8000  # FastAPI listen port
 ```
 
-3. Optional: Configure OpenAI model settings:
-```bash
-# Set the OpenAI model to use (default: gpt-4o)
-export OPENAI_MODEL=gpt-4o
+> The sample agents default to `gemini-2.5-flash` and already set sensible
+> temperature/token parameters; override only if you need a different tier.
 
-# Set max tokens (default: 2000)
-export OPENAI_MAX_TOKENS=2000
+## Running the demo server
 
-# Set temperature (default: 0.7)
-export OPENAI_TEMPERATURE=0.7
-```
+Either command exposes all mounted apps on `http://localhost:${PORT:-8000}`:
 
-4. Install dependencies:
 ```bash
-cd integrations/aws-strands-integration/python/examples
-poetry install
+poetry run dev  # uses the Poetry script entry point (server:main)
+# or
+poetry run python -m server
```
 
-## Running the server
+The root route lists the available demos:
 
-To run the server:
+| Route | Description |
+| --- | --- |
+| `/agentic-chat` | Simple chat agent with a frontend-only `change_background` tool |
+| `/backend-tool-rendering` | Backend-executed tools (charts, faux weather) rendered in AG-UI |
+| `/agentic-generative-ui` | Demonstrates `PredictState` + delta streaming for plan tracking |
+| `/shared-state` | Recipe builder showing shared JSON state + tool arguments |
 
-```bash
-cd integrations/aws-strands-integration/python/examples
+Point the AG-UI Dojo (or any AG-UI client) at these SSE endpoints to see the
+Strands wrapper translate Gemini events into protocol-native messages.
 
-poetry install && poetry run dev
-```
+## Environment reference
+
+| Variable | Required | Purpose |
+| --- | --- | --- |
+| `GOOGLE_API_KEY` | Yes | Auth for the Gemini SDK (`strands.models.gemini.GeminiModel`) |
+| `PORT` | No | Overrides the default `8000` uvicorn port |
 
-The server will start on `http://localhost:8000` by default. You can change the port by setting the `PORT` environment variable.
+All OpenTelemetry exporters are disabled by default in code (`OTEL_SDK_DISABLED`
+and `OTEL_PYTHON_DISABLED_INSTRUMENTATIONS`), so you do not need to set those
+manually.
 
-## Integration Details
+## How it works
 
-This integration uses the Strands Agents SDK with OpenAI models. The server:
-- Accepts AG-UI protocol requests
-- Connects to OpenAI models via Strands SDK
-- Streams responses back as AG-UI events
-- Handles tool calls and state management
+- Each `server/api/*.py` file constructs a Strands `Agent`, registers any tools,
+  and wraps it with `ag_ui_strands.StrandsAgent`.
+- `server/__init__.py` mounts the four FastAPI apps under a single router and
+  exposes the `main()` entrypoint that `poetry run dev` calls.
+- The project depends on `ag_ui_strands` via a path dependency (`..`) so you can
+  develop the integration and server side-by-side without publishing a wheel.
+- Want a different Gemini tier? Update the `model_id` argument in the agent
+  definitions inside `server/api/*.py`.
 
-## Notes
 
-- The integration uses OpenAI models (default: gpt-4o)
-- Ensure your OpenAI API key is valid and has access to the specified model
-- The integration supports streaming responses when available in the Strands SDK
-- You can customize the model, max_tokens, and temperature via environment variables
 

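The new README's "How it works" bullets describe the pattern each `server/api/*.py` module follows: build a Strands `Agent` on a Gemini model, register any tools, and wrap it with `ag_ui_strands.StrandsAgent`. Below is a minimal, hypothetical sketch of that pattern; the demo tool, the `GeminiModel` keyword arguments, and the `StrandsAgent` call signature are assumptions, not the repository's actual code.

```python
# Hypothetical sketch of one server/api/*.py module; the tool, constructor
# arguments, and exported name are assumptions based on the README above.
import os

from strands import Agent, tool
from strands.models.gemini import GeminiModel  # model class named in the README
from ag_ui_strands import StrandsAgent         # wrapper named in the README


@tool
def get_weather(city: str) -> str:
    """Illustrative backend-executed tool (the real demos return richer data)."""
    return f"It is sunny in {city}."


# gemini-2.5-flash is the default tier the README mentions; model_id is the
# argument it says to change for a different tier. The api_key kwarg is assumed.
model = GeminiModel(
    model_id="gemini-2.5-flash",
    client_args={"api_key": os.environ["GOOGLE_API_KEY"]},
)

agent = Agent(model=model, tools=[get_weather])

# Wrap the Strands agent for AG-UI; server/__init__.py is described as mounting
# the result as a FastAPI app, so it is exposed under a conventional name here.
app = StrandsAgent(agent)
```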
integrations/aws-strands/python/examples/pyproject.toml

Lines changed: 2 additions & 2 deletions
@@ -4,7 +4,7 @@ version = "0.1.0"
 description = "Strands integration server for AG-UI using OpenAI models"
 authors = ["AG-UI Contributors"]
 readme = "README.md"
-package-mode = false
+packages = [{ include = "server" }]
 
 [tool.poetry.dependencies]
 python = "<3.14,>=3.12"
@@ -20,4 +20,4 @@ requires = ["poetry-core"]
 build-backend = "poetry.core.masonry.api"
 
 [tool.poetry.scripts]
-dev = "aws_strands_server:main"
+dev = "server:main"

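With `packages = [{ include = "server" }]` and `dev = "server:main"`, Poetry installs `server` as a package and runs its `main()` function for `poetry run dev`. A hedged sketch of what such a `server/__init__.py` could look like, tying together the README's four routes and the `PORT=8017` that `run-dojo-everything.js` sets; the sub-module names, their `app` attributes, and the uvicorn call are assumptions, not the repository's actual code.

```python
# Hypothetical sketch of server/__init__.py; module names and app attributes
# are assumptions drawn from the README's route table, not the repo's code.
import os

import uvicorn
from fastapi import FastAPI

from .api import (  # assumed sub-modules, one per demo route
    agentic_chat,
    agentic_generative_ui,
    backend_tool_rendering,
    shared_state,
)

app = FastAPI(title="AWS Strands Example Server")
app.mount("/agentic-chat", agentic_chat.app)
app.mount("/backend-tool-rendering", backend_tool_rendering.app)
app.mount("/agentic-generative-ui", agentic_generative_ui.app)
app.mount("/shared-state", shared_state.app)


def main() -> None:
    """Entry point behind `poetry run dev` (dev = "server:main")."""
    # run-dojo-everything.js sets PORT=8017; standalone runs default to 8000.
    port = int(os.environ.get("PORT", "8000"))
    uvicorn.run(app, host="0.0.0.0", port=port)
```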