A comprehensive demonstration of the Model Context Protocol (MCP) featuring an AWS Bedrock LLM as an intelligent agent that uses MCP servers as tools. Built for AWS re:Invent 2025 conference planning with real data and live API integrations.
┌─────────────────────────────────────────────────────────────────────────────┐
│                                 MCP Client                                   │
│  ┌─────────────────┐     ┌──────────────────────────────────────────────┐   │
│  │ React Frontend  │     │                Strands Agent                 │   │
│  │   (Chat UI)     │────▶│  ┌───────────────┐   ┌─────────────────────┐ │   │
│  └─────────────────┘     │  │ Strands Agent │   │    Fallback MCP     │ │   │
│                          │  │               │   │   Tool Execution    │ │   │
│                          │  └───────────────┘   └─────────────────────┘ │   │
│                          └──────────────────────────────────────────────┘   │
└─────────────────────────────────────────────────────────────────────────────┘
                                       │
                          ┌────────────┴────────────┐
                          │      MCP Protocol       │
                          │       (JSON-RPC)        │
                          └────────────┬────────────┘
                                       │
         ┌───────────────────┬─────────┴─────────┬───────────────────┐
         │                   │                   │                   │
┌────────▼────────┐ ┌────────▼────────┐ ┌────────▼────────┐ ┌────────▼────────┐
│   re:Invent     │ │ Weather Server  │ │  Stock Server   │ │    AWS Docs     │
│   MCP Server    │ │   (Live APIs)   │ │   (Live APIs)   │ │   MCP Server    │
│    (Local)      │ │     (Local)     │ │     (Local)     │ │    (Remote)     │
│                 │ │                 │ │                 │ │                 │
│ Strands Tools:  │ │ Strands Tools:  │ │ Strands Tools:  │ │ Strands Tools:  │
│ • searchSessions│ │ • getWeather    │ │ • getStock      │ │ • searchAWS     │
│ • recommend     │ │ • getForecast   │ │ • getMarket     │ │ • getGuides     │
│ • getDetails    │ │                 │ │                 │ │                 │
│                 │ │ Resources:      │ │ Resources:      │ │ Resources:      │
│ Resources:      │ │ • wttr.in       │ │ • Yahoo Finance │ │ • Official      │
│ • SQLite DB     │ │ • OpenWeather   │ │ • Alpha Vantage │ │   AWS Docs      │
│ • 2254 Sessions │ │                 │ │                 │ │ • uvx runtime   │
└─────────────────┘ └─────────────────┘ └─────────────────┘ └─────────────────┘
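The MCP Protocol layer above carries plain JSON-RPC 2.0 messages. As a rough illustration (the argument values are made up, and the tool name follows the server sections below), a `tools/call` request from the client to the re:Invent server looks like this:

```typescript
// Illustrative MCP tools/call request sent over JSON-RPC 2.0.
// Tool name matches the re:Invent server; argument values are examples only.
const toolCallRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "search_sessions",
    arguments: { query: "generative AI", day: 2 },
  },
};

// The server replies on the same channel with the tool output, e.g.:
// { "jsonrpc": "2.0", "id": 1, "result": { "content": [{ "type": "text", "text": "..." }] } }
```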
- AWS Bedrock Agent with Strands for intelligent tool orchestration
- Graceful fallback to direct MCP tool execution when the full Bedrock agent setup is unavailable
- Claude 3 Haiku as the reasoning engine
- Automatically selects appropriate MCP tools based on user queries
- Natural language understanding and contextual responses
- Multi-tool orchestration for complex requests
- 2254 actual AWS re:Invent 2025 sessions parsed from the official conference catalog
- Full-text search across session titles, descriptions, and speakers
- Smart recommendations based on interests and experience level
- Complete session details with times, locations, and tracks
- Weather: Real-time data from wttr.in API with OpenWeatherMap fallback
- Stock Market: Live prices from Yahoo Finance API
- Travel: Mock flight data for Las Vegas routes (LAS airport)
- Tools: Functions that MCP servers expose to the LLM
- Resources: Data sources (databases, APIs, files)
- Prompt Templates: Structured prompts for consistent interactions
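As a minimal sketch of those three concepts, here is roughly how a server could expose a tool, a resource, and a prompt template using the @modelcontextprotocol/sdk TypeScript API. The names, schemas, and handler bodies are placeholders, not this demo's actual server code:

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "reinvent-demo", version: "1.0.0" });

// Tool: a function the LLM can call with structured arguments.
server.tool(
  "search_sessions",
  { query: z.string(), day: z.number().optional() },
  async ({ query }) => ({
    content: [{ type: "text", text: `Sessions matching "${query}" ...` }],
  })
);

// Resource: a readable data source addressed by URI.
server.resource("schedule", "schedule://reinvent/2025", async (uri) => ({
  contents: [{ uri: uri.href, text: "Session catalog goes here" }],
}));

// Prompt template: a reusable, structured prompt the client can request.
server.prompt("plan-my-day", { interests: z.string() }, ({ interests }) => ({
  messages: [
    { role: "user", content: { type: "text", text: `Plan a re:Invent day around: ${interests}` } },
  ],
}));

await server.connect(new StdioServerTransport());
```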
- Node.js 18+ and npm
- AWS Account with Bedrock access
- AWS CLI installed
- Python and uv package manager (for remote MCP servers)
# 1. Install dependencies
npm install
# 2. Install uv (Python package manager for remote MCP servers)
# macOS/Linux:
curl -LsSf https://astral.sh/uv/install.sh | sh
# Or via pip: pip install uv
# 3. Configure AWS credentials (recommended)
aws configure
# Enter your AWS Access Key ID, Secret Access Key, and region (us-east-1)
# 4. Optional: Create .env file for custom configuration
cp .env.example .env
# Edit .env with AWS_REGION or API keys if needed
# 5. Start the demo
chmod +x start-llm-demo.sh
./start-llm-demo.sh
# OR start manually:
npm start # Backend with strands agent
npm run start:demo # Both backend and frontend
# Note: The SQLite database with 2254 re:Invent sessions will be automatically
# created in servers/reinvent_schedule.db on first run
Option 1: AWS CLI (Recommended)
aws configure
# AWS Access Key ID: your_key_here
# AWS Secret Access Key: your_secret_here
# Default region: us-east-1
# Default output format: json
Option 2: Environment Variables
export AWS_ACCESS_KEY_ID=your_key_here
export AWS_SECRET_ACCESS_KEY=your_secret_here
export AWS_REGION=us-east-1
Option 3: .env File (Optional)
cp .env.example .env
# Edit .env with AWS_REGION (credentials optional if using aws configure)
# Optionally add API keys for enhanced live data:
# WEATHER_API_KEY=your_openweathermap_key
# ALPHA_VANTAGE_API_KEY=your_alpha_vantage_key
re:Invent MCP Server (Local)
Tools:
- search_sessions - Find sessions by keywords, topics, or speakers
- get_session_details - Get complete session information
- recommend_sessions - AI-powered personalized recommendations
Resources:
- SQLite database with 2254 real AWS re:Invent sessions
- Full-text search capabilities
- Session metadata (tracks, levels, days, locations)
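For reference, this is roughly how a client connects to the local server and invokes these tools over stdio with the MCP TypeScript SDK. The server path and argument names below are assumptions for illustration:

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const client = new Client({ name: "reinvent-demo-client", version: "1.0.0" });

// Assumed entry point for the local re:Invent server; adjust to this repo's actual path.
await client.connect(
  new StdioClientTransport({ command: "node", args: ["servers/reinvent-server.js"] })
);

const { tools } = await client.listTools(); // search_sessions, get_session_details, recommend_sessions
console.log(tools.map((t) => t.name));

const result = await client.callTool({
  name: "search_sessions",
  arguments: { query: "serverless", day: 2 }, // hypothetical argument names
});
console.log(result.content);
```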
Weather MCP Server (Local)
Tools:
- get_weather - Current weather conditions
- get_forecast - 3-day weather forecast
Resources:
- wttr.in API (primary, no key required)
- OpenWeatherMap API (fallback)
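The primary/fallback pattern here is straightforward: query wttr.in's keyless JSON endpoint first and fall back to OpenWeatherMap only if that fails. A simplified sketch, with abbreviated response handling (not the demo's exact implementation):

```typescript
// Sketch of the primary/fallback weather lookup (Node 18+ global fetch).
async function getWeather(city: string): Promise<string> {
  try {
    // Primary: wttr.in, no API key required, JSON via ?format=j1
    const res = await fetch(`https://wttr.in/${encodeURIComponent(city)}?format=j1`);
    if (!res.ok) throw new Error(`wttr.in returned ${res.status}`);
    const data = await res.json();
    const current = data.current_condition[0];
    return `${current.temp_F}°F, ${current.weatherDesc[0].value}`;
  } catch {
    // Fallback: OpenWeatherMap, requires WEATHER_API_KEY from .env
    const key = process.env.WEATHER_API_KEY;
    const res = await fetch(
      `https://api.openweathermap.org/data/2.5/weather?q=${encodeURIComponent(city)}&appid=${key}&units=imperial`
    );
    const data = await res.json();
    return `${data.main.temp}°F, ${data.weather[0].description}`;
  }
}
```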
Stock Market MCP Server (Local)
Tools:
- get_stock_price - Real-time stock quotes
- get_market_status - Trading hours and market state
Resources:
- Yahoo Finance API (primary, no key required)
- Alpha Vantage API (fallback)
Travel MCP Server (Local)
Tools:
- search_flights - Find flights to Las Vegas for re:Invent
Resources:
- Mock flight database with realistic data
- Las Vegas (LAS) focused routes
AWS Documentation MCP Server (Remote)
Tools:
- search_docs - Search AWS service documentation using the official AWS MCP server
- Powered by uvx awslabs.aws-documentation-mcp-server@latest
Resources:
- Official AWS service documentation
- Real-time access to AWS docs via remote MCP server
- Comprehensive coverage of all AWS services
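Because the AWS documentation server ships as a Python package, the client simply spawns it through uvx and speaks MCP over stdio. A sketch of that wiring (the client name and argument shape are assumptions):

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// The remote server is launched on demand via uvx and spoken to over stdio.
const transport = new StdioClientTransport({
  command: "uvx",
  args: ["awslabs.aws-documentation-mcp-server@latest"],
});

const awsDocs = new Client({ name: "aws-docs-client", version: "1.0.0" });
await awsDocs.connect(transport);

// Tool name as described above; the argument shape is an assumption.
const docs = await awsDocs.callTool({
  name: "search_docs",
  arguments: { query: "Bedrock with Lambda" },
});
console.log(docs.content);
```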
The LLM agent understands complex, conversational requests:
"I'm interested in AI and serverless. What re:Invent sessions would you recommend,
and what's the weather like in Las Vegas?"
The agent automatically:
- Calls recommend_sessions with interests: ["AI", "serverless"]
- Calls get_weather with location: "Las Vegas"
- Combines the results into a single natural-language response
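Under the hood this is the standard Bedrock Converse tool-use loop: the MCP tools are advertised to Claude 3 Haiku through toolConfig, and any toolUse blocks in the reply are executed against the matching MCP server before the conversation continues. A simplified sketch; the tool schemas, message text, and control flow here are illustrative, not the demo's actual code:

```typescript
import { BedrockRuntimeClient, ConverseCommand } from "@aws-sdk/client-bedrock-runtime";

const bedrock = new BedrockRuntimeClient({ region: "us-east-1" });

// MCP tools re-described in Bedrock's toolSpec format (schemas abbreviated).
const toolConfig = {
  tools: [
    {
      toolSpec: {
        name: "recommend_sessions",
        description: "Recommend re:Invent sessions for given interests",
        inputSchema: { json: { type: "object", properties: { interests: { type: "array" } } } },
      },
    },
    {
      toolSpec: {
        name: "get_weather",
        description: "Current weather for a location",
        inputSchema: { json: { type: "object", properties: { location: { type: "string" } } } },
      },
    },
  ],
};

const response = await bedrock.send(
  new ConverseCommand({
    modelId: "anthropic.claude-3-haiku-20240307-v1:0",
    messages: [
      { role: "user", content: [{ text: "AI and serverless sessions, plus Las Vegas weather?" }] },
    ],
    toolConfig,
  })
);

// When the model stops to use tools, run each request via MCP, append the
// toolResult blocks to the conversation, and call Converse again.
if (response.stopReason === "tool_use") {
  for (const block of response.output?.message?.content ?? []) {
    if (block.toolUse) {
      // e.g. await client.callTool({ name: block.toolUse.name, arguments: block.toolUse.input })
    }
  }
}
```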
- Session Search: "Show me cybersecurity sessions on Day 2"
- Weather: "What's the weather forecast for Las Vegas?"
- Travel: "Find flights from San Francisco to Las Vegas"
- Recommendations: "Suggest sessions for a beginner in data science"
- AWS Documentation: "How do I use Bedrock with Lambda functions?"
- MCP Protocol: "What are MCP tools and how do they work?"
- Complex: "Plan my re:Invent trip: weather, flights from NYC, AI sessions, and AWS Bedrock documentation"
Automatic Initialization:
The SQLite database (servers/reinvent_schedule.db) is automatically created and populated when you first run the server. It contains:
- 2254 real AWS re:Invent 2025 sessions parsed from the official conference catalog
- Full session metadata: titles, speakers, descriptions, times, locations, tracks
- Searchable fields: All text fields are indexed for full-text search
Database Schema:
- sessions table: 2254 real AWS re:Invent sessions with complete metadata
- Indexed fields: title, description, speaker, tags, track, level, day
- Search capabilities: Full-text search with relevance scoring across all fields
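A minimal sketch of that first-run initialization, assuming better-sqlite3 and the column names listed above (the demo's real schema and indexing may differ):

```typescript
import Database from "better-sqlite3";

// Opens (or creates) the database file on first run.
const db = new Database("servers/reinvent_schedule.db");

// Column names follow the indexed fields listed above; assumed, not verified.
db.exec(`
  CREATE TABLE IF NOT EXISTS sessions (
    id          TEXT PRIMARY KEY,
    title       TEXT,
    description TEXT,
    speaker     TEXT,
    tags        TEXT,
    track       TEXT,
    level       TEXT,
    day         TEXT,
    location    TEXT
  );
  CREATE INDEX IF NOT EXISTS idx_sessions_track ON sessions(track);
  CREATE INDEX IF NOT EXISTS idx_sessions_day   ON sessions(day);
`);

console.log(db.prepare("SELECT COUNT(*) AS n FROM sessions").get());
```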
Verify Database:
# Check if database exists and has data
sqlite3 servers/reinvent_schedule.db "SELECT COUNT(*) FROM sessions;"
# Should return: 2254
# View sample sessions
sqlite3 servers/reinvent_schedule.db "SELECT title, type, level FROM sessions LIMIT 3;"
Database Recreation (if needed):
If the database file is missing or corrupted, it is automatically recreated when the server starts. The full 2254-session dataset is embedded in the application and loaded automatically.
- Primary: AWS Bedrock Agent with strands architecture
- Fallback: Direct MCP tool execution with Claude 3 Haiku
- Tool Planning: Intelligent selection of appropriate MCP tools
- Parameter Extraction: Smart parsing of user queries
- Response Generation: Natural language formatting of results
- Graceful Degradation: Always functional regardless of AWS setup complexity
- LLM failure → Rule-based query processing
- API failures → Simulated data with clear indicators
- Graceful degradation ensures demo always works
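In code terms the degradation chain is just nested fallbacks: try the Bedrock agent, fall back to direct MCP tool execution, and finally to rule-based handling, while failed live APIs return clearly labelled simulated data. A sketch with stubbed placeholder handlers (not the demo's actual functions):

```typescript
// Stubs standing in for the real handlers, so the fallback chain can be shown end to end.
async function runBedrockAgent(query: string): Promise<string> {
  throw new Error("Bedrock agent not configured"); // simulate the agent being unavailable
}

async function runDirectMcpTools(query: string): Promise<string> {
  return `Direct MCP tool results for: ${query}`;
}

function ruleBasedResponse(query: string): string {
  return `Keyword-matched response for: ${query}`;
}

async function answer(query: string): Promise<string> {
  try {
    return await runBedrockAgent(query);      // primary: Strands agent on Bedrock
  } catch {
    try {
      return await runDirectMcpTools(query);  // fallback: direct MCP tool execution
    } catch {
      return ruleBasedResponse(query);        // last resort: rule-based processing
    }
  }
}

answer("Weather in Las Vegas and AI sessions?").then(console.log);
```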
See DEMO_SCRIPT.md for a complete 5-minute walkthrough including:
- Setup instructions
- Key talking points
- Expected responses
- Troubleshooting tips
- Architecture explanations
This demo showcases the future of AI applications: Strands agents using MCP as a standard protocol to intelligently orchestrate multiple data sources and services.
# Test the remote AWS MCP server directly
curl -X POST "http://localhost:3001/api/chat" \
-H "Content-Type: application/json" \
-d '{"message": "How do I get started with AWS Bedrock?"}'
# Expected: Real AWS documentation with official links

# Test the local SQLite database
curl -X POST "http://localhost:3001/api/chat" \
-H "Content-Type: application/json" \
-d '{"message": "Show me AI sessions at re:Invent"}'
# Expected: Real conference sessions from the 2254-session database

# Test weather API
curl -X POST "http://localhost:3001/api/chat" \
-H "Content-Type: application/json" \
-d '{"message": "What is the weather in Seattle?"}'
# Expected: Live weather data from wttr.in

# Test complex query using multiple MCP servers
curl -X POST "http://localhost:3001/api/chat" \
-H "Content-Type: application/json" \
-d '{"message": "Plan my re:Invent trip: weather in Las Vegas, AI sessions, and AWS Bedrock documentation"}'
# Expected: Combined results from weather, sessions, and AWS docs

# Check overall system health including database
curl "http://localhost:3001/api/health"
# Look for:
# - "database": "connected"
# - "sessionsCount": 290
# - "llmAgent": "enabled"
# Check database info specifically
curl "http://localhost:3001/api/database/info"
# Should show:
# - totalSessions: 2254
# - Available tracks, levels, and days
- Open http://localhost:3000
- Click "MCP Architecture" tab to see strands architecture
- Test queries in chat interface:
- "How do I use Bedrock with Lambda?" (AWS docs)
- "Show me security sessions" (re:Invent data)
- "Weather in Las Vegas" (live data)
- AWS docs not working: Check uvx installation (uvx --version)
- Agent working normally: The system uses direct MCP tool orchestration
- No responses: Verify AWS credentials (aws sts get-caller-identity)