> [!CAUTION]
> View this only as a WORK IN PROGRESS and an educational exercise.
Project (`compose.yaml`) includes:
- `agentgateway` - general purpose, to gain know-how
  - ✅ got it working with AWS Bedrock
  - TODOs:
    - learn more about how to use MCPs
    - route plain HTTP (`echo`) through the gateway - try an OpenAPI server
    - ... find time ⏳️
- `tzolov/mcp-everything-server:v2` - for testing with an external MCP ✅
- `ghcr.io/modelcontextprotocol/inspector` - for inspecting and debugging MCP interactions ✅
- `echo` - for debugging and inspecting HTTP interactions via the gateway
- ☝️ `mcp-server` (source code & dedicated `compose.yaml`) - a custom MCP Server example in Python, to be called over `agentgateway`
  - 👀 into the `./src` folder: it includes the custom MCP Server example, a standalone tool (`prefix`) for testing and debugging, as well as an example Graph using LangGraph (see the sketch after this list)
- LLMs, in two variants:
  - `ollama` to host locally
  - `aws` Bedrock to use cloud-hosted models
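To make the `./src` contents more concrete, here is a minimal, hypothetical sketch of how a debug tool like `prefix` could be exposed with the MCP Python SDK's `FastMCP` server; tool name and behavior are assumptions for illustration, not the repo's actual code:

```python
# Hypothetical sketch only - not the actual ./src implementation.
# Assumes the official MCP Python SDK (`pip install mcp`).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("example-server")

@mcp.tool()
def prefix(text: str) -> str:
    """Prepend a fixed marker to the given text (debug helper)."""
    return f"prefix: {text}"

if __name__ == "__main__":
    # streamable-http serves on port 8000 under /mcp by default,
    # which matches the http://localhost:8000/mcp URL used below
    mcp.run(transport="streamable-http")
```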
In LangGraph, you are building a graph (the application). This graph is composed of nodes (the steps); the nodes are connected by edges (the transitions between steps). One or more of these nodes can themselves be agents (intelligent, looping subsystems). The graph is the boss that manages the agents and other workers.

If a node contains an LLM that is empowered to make decisions in a loop (typically using tools), it is an agent node. If the node uses an LLM for a single, deterministic task, it is just an "LLM" node. If it has no LLM at all, it is a function node.
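As a minimal sketch of that vocabulary (assuming `langgraph` is installed; the state shape and node name are made up for illustration), a graph with a single function node looks like this:

```python
from typing import TypedDict

from langgraph.graph import END, START, StateGraph


class State(TypedDict):
    text: str


def prefix_node(state: State) -> State:
    # a function node: a plain step, no LLM involved
    return {"text": "prefix: " + state["text"]}


builder = StateGraph(State)
builder.add_node("prefix", prefix_node)   # node = step
builder.add_edge(START, "prefix")         # edge = transition
builder.add_edge("prefix", END)

graph = builder.compile()                 # the application
print(graph.invoke({"text": "hello"}))    # {'text': 'prefix: hello'}
```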
- with `aws bedrock`
  - fill in all `AWS_*` vars from `.env.example` & rename it to `.env`
  - pick a matching `agentgateway` `config.yaml`
    - reference the config in the `compose.yaml`
  - `docker compose up`
- with local LLM `ollama`
  - fetch a model for tools (`llama3.1`)
    - get the server up: `docker compose run -d ollama`
    - enter the container: `docker compose exec ollama bash`
    - inside, pull a model: `# ollama pull llama3.1` ⏳️ then `# exit`
    - `docker compose down ollama --remove-orphans`
  - adapt vars from `.env.example` & rename it to `.env`
  - `docker compose --profile ollama up`
- http://localhost:6274/ (the MCP Inspector)
  - as URL you can use:
    - http://localhost:3000/graph (over the gateway)
    - http://localhost:8000/mcp (calling the MCP Server directly)
  - press ▶️ `Connect`
  - go to http://localhost:6274/#tools & click "List Tools"
  - 👀 at the tools
- http://localhost:15000/ui (agentgateway UI)
  - go to http://localhost:15000/ui/playground/
    - select a Route (MCP) & connect
    - see all Tools available
Two of the listed tools are not plain Tools but Graphs/Agents you can run/invoke. Use either http://localhost:15000/ui/playground/ or http://localhost:6274/ (URL: http://localhost:8000/mcp or http://localhost:3000/graph):

- `simple_langgraph` - process text using a LangGraph workflow
- `execute_graph` - run the example Graph over MCP
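You can also invoke them programmatically. A hedged sketch using the official `mcp` Python SDK's streamable HTTP client (the URL and tool names come from this README, but the tools' input schemas are not documented here, so inspect them before a real call):

```python
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def main() -> None:
    # direct MCP Server endpoint; http://localhost:3000/graph should work too
    async with streamablehttp_client("http://localhost:8000/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, tool.inputSchema)  # check the real schemas here
            # empty arguments are an assumption - adjust to the actual inputSchema
            result = await session.call_tool("execute_graph", {})
            print(result.content)


asyncio.run(main())
```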
OR 👇️

- connect the VSC debugger "Python: Remote Attach to Graph Client" to the `graph` service (`./src/compose.yaml`)
  - set breakpoints
  - watch the Graph being executed
- uses `agentgateway` for LLM calls over `aws Bedrock`
- uses the local MCP-Server examples for testing and debugging
```
...
graph-1  | Tool calls: None
graph-1  | Response content: 8
...
```
Test the LLM calls with `agentgateway` vs. without, and `aws bedrock` vs. the bundled local `ollama`:

- add config
- OR/AND enable this config in the `compose.yaml` file for the `aws`-credentials:
```bash
curl 'http://0.0.0.0:3000/bedrock' --header 'Content-Type: application/json' --data '{
  "model": "anthropic.claude-3-5-sonnet-20240620-v1:0",
  "messages": [
    {
      "role": "user",
      "content": "Tell me a story"
    }
  ]
}'
```
It still works even with an empty model 🤷‍♂️:
```bash
curl 'http://0.0.0.0:3000/bedrock' --header 'Content-Type: application/json' --data '{
  "model": "",
  "messages": [
    {
      "role": "user",
      "content": "Tell me a story"
    }
  ]
}'
```
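The same request as a small Python sketch (assumes the `requests` package; endpoint and payload are taken verbatim from the curl above):

```python
import requests

# POST the chat payload to the gateway's bedrock route
resp = requests.post(
    "http://0.0.0.0:3000/bedrock",
    json={
        "model": "anthropic.claude-3-5-sonnet-20240620-v1:0",
        "messages": [{"role": "user", "content": "Tell me a story"}],
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json())
```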
Chat with the bundled `ollama` directly:

```bash
curl http://localhost:7869/api/chat \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3.1",
    "messages": [
      {"role": "user", "content": "Which LLM Model are you?"}
    ],
    "stream": false
  }'
```
The same chat over the gateway (`/ollama/api/chat`):

```bash
curl http://localhost:3000/ollama/api/chat \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3.1",
    "messages": [
      {"role": "user", "content": "Which LLM Model are you?"}
    ],
    "stream": false
  }'
```
And over the gateway via Ollama's OpenAI-compatible route (`/ollama/v1/chat/completions`):

```bash
curl http://localhost:3000/ollama/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3.1",
    "messages": [
      {"role": "user", "content": "Which LLM Model are you?"}
    ],
    "stream": false
  }'
```
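Since that last route is OpenAI-compatible, the standard `openai` Python client should also work against it. A sketch (the dummy `api_key` is an assumption, as Ollama does not check it):

```python
from openai import OpenAI

# point the OpenAI client at the gateway's /ollama/v1 route
client = OpenAI(base_url="http://localhost:3000/ollama/v1", api_key="unused")

chat = client.chat.completions.create(
    model="llama3.1",
    messages=[{"role": "user", "content": "Which LLM Model are you?"}],
)
print(chat.choices[0].message.content)
```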
> [!NOTE]
> I did not yet succeed with MCP over `agentgateway`, so for debugging I just used `inspector` (needs to be in host network mode) <-> `everything`-server.

Browser: http://localhost:6274/ (host networking mode, otherwise it doesn't work)

- http://localhost:3001/mcp (streamable)
- http://localhost:3001/sse (sse)
```bash
curl 'http://localhost:6277/mcp?url=http%3A%2F%2Flocalhost%3A3001%2Fmcp&transportType=streamable-http' \
  -H 'accept: application/json, text/event-stream' \
  -H 'content-type: application/json' \
  --data-raw '{"method":"initialize","params":{"protocolVersion":"2025-06-18","capabilities":{"sampling":{},"elicitation":{},"roots":{"listChanged":true}},"clientInfo":{"name":"mcp-inspector","version":"0.16.5"}},"jsonrpc":"2.0","id":0}'
```
```
event: message
data: {"jsonrpc":"2.0","id":0,"result":{"protocolVersion":"2025-03-26","capabilities":{"prompts":{},"resources":{"subscribe":true},"tools":{},"logging":{},"completions":{}},"serverInfo":{"name":"example-servers/everything","version":"1.0.0"}}}
```

```
HTTP/1.1 200 OK
X-Powered-By: Express
Access-Control-Allow-Origin: *
Access-Control-Expose-Headers: mcp-session-id
Content-Type: text/event-stream
Cache-Control: no-cache
Connection: keep-alive
mcp-session-id: 5f130863-32e2-420a-a747-9fa654e774ef
Date: Thu, 04 Sep 2025 07:09:28 GMT
Transfer-Encoding: chunked
```
Request all MCP Tools from the (custom) `everything` server (service: `mcp-everything`), reusing the `mcp-session-id` you got returned from the initial request:
```bash
curl 'http://localhost:6277/mcp?url=http%3A%2F%2Flocalhost%3A3001%2Fmcp&transportType=streamable-http' \
  -H 'accept: application/json, text/event-stream' \
  -H 'content-type: application/json' \
  -H 'mcp-session-id: 5f130863-32e2-420a-a747-9fa654e774ef' \
  --data-raw '{"method":"tools/list","params":{"_meta":{"progressToken":2}},"jsonrpc":"2.0","id":2}'
```