AI debugging agent backend for OpenBug CLI
OpenBug Server powers intelligent debugging for running applications. It connects to the CLI's local cluster via WebSocket, processes debugging queries using AI, and coordinates tool execution for log analysis and code search.
🚧 Beta - OpenBug is actively developed and maintained. We ship updates regularly and welcome feedback.
Try the hosted version:
Install the OpenBug CLI, which connects to our hosted server by default:
```
npm install -g @openbug/cli
debug
```

Run your own server:
```
git clone https://github.com/openbug-ai/server.git
cd server
npm install

# Create .env file (see Configuration section)
cp .env.example .env

npm run dev
```

Server runs on http://localhost:3000. Point the CLI to your server:
```
export WEB_SOCKET_URL=ws://localhost:3000/v2/ws
export API_BASE_URL=http://localhost:3000/v2/api
```

```
┌─────────────────────────────────────────────────────┐
│                  CLI Local Cluster                  │
│  • Captures logs from services                      │
│  • Executes tool calls (read_file, grep, etc.)      │
└──────────────────────────┬──────────────────────────┘
                           │
                           ↕ WebSocket
                           │
             ┌─────────────▼───────────────┐
             │                             │
             │      OpenBug AI Server      │
             │      (Fastify + ai-sdk)     │
             │                             │
             │    ┌───────────────────┐    │
             │    │    Agent Graph    │    │
             │    └───────────────────┘    │
             │                             │
             │    ┌───────────────────┐    │
             │    │       Redis       │    │
             │    │  (Coordination)   │    │
             │    └───────────────────┘    │
             │                             │
             └─────────────┬───────────────┘
                           │
                           ↕
             ┌─────────────▼───────────────┐
             │         OpenAI API          │
             └─────────────────────────────┘
```
How it works:
1. The CLI local cluster connects via WebSocket
2. The server receives a debugging query
3. The server sends tool call requests (read files, search logs, grep code)
4. The CLI executes tools locally and returns results via Redis
5. The server processes results with OpenAI using the AI SDK
6. The response streams back to the CLI through WebSocket
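The per-message flow above can be sketched as a small dispatcher on the server side. The message shapes here (`query`, `tool_call`, `tool_result`) are illustrative assumptions, not the actual wire protocol:

```typescript
// Hypothetical wire-message shapes -- the real protocol may differ.
type ClientMessage =
  | { type: "query"; sessionId: string; text: string }
  | { type: "tool_result"; sessionId: string; callId: string; output: string };

type ServerMessage =
  | { type: "tool_call"; sessionId: string; callId: string; tool: string; args: Record<string, string> }
  | { type: "answer_chunk"; sessionId: string; text: string };

// Route one incoming CLI message to the right handler. A query may trigger
// an outgoing tool call; a tool result is consumed without a direct reply.
function dispatch(
  msg: ClientMessage,
  handlers: {
    onQuery: (sessionId: string, text: string) => ServerMessage;
    onToolResult: (sessionId: string, callId: string, output: string) => void;
  }
): ServerMessage | undefined {
  switch (msg.type) {
    case "query":
      return handlers.onQuery(msg.sessionId, msg.text);
    case "tool_result":
      handlers.onToolResult(msg.sessionId, msg.callId, msg.output);
      return undefined;
  }
}
```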
- AI-powered analysis - Uses Vercel AI SDK with OpenAI for debugging assistance
- Tool coordination via Redis - Manages async tool execution between server and CLI
- WebSocket-based debugging - Real-time bidirectional communication with CLI
Create a .env file:
```
# Node Environment
NODE_ENV=development

# Server
PORT=3000

# Redis (required for tool coordination)
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_USERNAME=
REDIS_PASSWORD=

# OpenAI (required for AI debugging)
OPENAI_API_KEY=sk-...

# CLI Version Check
MINIMUM_CLI_VERSION=1.0.12
```

Redis - Coordinates tool execution between server and CLI. Install locally:
```
# macOS
brew install redis
brew services start redis

# Linux
sudo apt-get install redis-server
sudo systemctl start redis

# Docker
docker run -d -p 6379:6379 redis
```

OpenAI API Key - Get one at platform.openai.com
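A minimal sketch of how the server might read the variables above into a typed config object. The variable names come from the `.env` example; the defaults and structure are assumptions, not the repo's actual loader:

```typescript
interface ServerConfig {
  port: number;
  redis: { host: string; port: number; username?: string; password?: string };
  openaiApiKey: string;
}

// Build a config from an environment map (process.env in practice).
// OPENAI_API_KEY is required; Redis settings fall back to local defaults.
function loadConfig(env: Record<string, string | undefined>): ServerConfig {
  const openaiApiKey = env.OPENAI_API_KEY;
  if (!openaiApiKey) throw new Error("OPENAI_API_KEY is required");
  return {
    port: Number(env.PORT ?? 3000),
    redis: {
      host: env.REDIS_HOST ?? "localhost",
      port: Number(env.REDIS_PORT ?? 6379),
      username: env.REDIS_USERNAME || undefined,
      password: env.REDIS_PASSWORD || undefined,
    },
    openaiApiKey,
  };
}
```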
Swagger UI:
```
http://localhost:3000/v2/api/docs
```
WebSocket Endpoint:
```
ws://localhost:3000/v2/ws
```
The CLI connects here to establish debugging sessions.
HTTP Endpoints:
- `GET /v2/api/health` - Health check
- `POST /v2/api/graph` - Execute agent graph (for testing)
- `POST /v2/api/tool` - Tool execution endpoint
Start with hot reload:
```
npm run dev
```

Build for production:

```
npm run build    # Compiles to dist/ using SWC
npm start        # Runs from build/
```

Available scripts:

- `npm run dev` - Development server with hot reload
- `npm run build` - Production build (SWC)
- `npm run build:tsc` - TypeScript compilation to build/
- `npm start` - Start production server
- `npm run lint` - Run ESLint
- `npm run lint:fix` - Auto-fix linting errors
```
server/
├── src/
│   ├── apis/              # HTTP route handlers
│   │   ├── graph.ts       # Agent graph endpoint
│   │   ├── health.ts      # Health check
│   │   └── tool.ts        # Tool execution
│   ├── config/            # App configuration & Swagger
│   ├── errors/            # Error definitions
│   ├── exception/         # Exception handling
│   ├── interface/         # TypeScript interfaces
│   ├── plugins/           # Fastify plugins (WebSocket, etc.)
│   ├── services/          # Business logic
│   │   ├── v2/            # Agent graph implementation
│   │   └── tools/         # Tool execution services
│   ├── utils/             # Redis, socket manager, helpers
│   ├── app.ts             # Fastify app setup
│   └── server.ts          # Entry point
└── package.json
```
Code stays on your machine
The server requests file access through tool calls executed by the CLI. Only specific files or snippets queried by the AI are sent back to the server.
No built-in authentication
This open-source server doesn't include auth. The hosted service handles API key authentication. For production deployments, add your own authentication layer.
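One minimal way to add that layer is a bearer-token check invoked from a Fastify `onRequest` hook (replying 401 when it fails). The header parsing and token store below are assumptions for illustration, not part of this codebase:

```typescript
// Hypothetical auth helper: validate an Authorization header against a
// set of known tokens. Wire it into Fastify via an onRequest hook.
function isAuthorized(
  authorizationHeader: string | undefined,
  validTokens: Set<string>
): boolean {
  if (!authorizationHeader?.startsWith("Bearer ")) return false;
  const token = authorizationHeader.slice("Bearer ".length).trim();
  return validTokens.has(token);
}
```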
CORS Configuration
Configure allowed origins in `src/app.ts`:

```ts
fastify.register(cors, {
  origin: ['http://localhost:3000'],
  credentials: true
})
```

Environment isolation
- OpenAI keys stay on the server
- CLI users never see or need OpenAI credentials
- Redis used only for ephemeral tool coordination
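The coordination pattern itself — the server awaits a tool result that arrives asynchronously from the CLI — can be sketched with an in-memory broker. The real server uses Redis for this; the class and method names below are illustrative:

```typescript
// In-memory stand-in for the Redis-backed tool coordination: each
// outstanding tool call gets a promise that resolves when the CLI
// delivers its result, or rejects on timeout.
class ToolCallBroker {
  private pending = new Map<string, (output: string) => void>();

  // Called when the server issues a tool call to the CLI.
  waitForResult(callId: string, timeoutMs = 30_000): Promise<string> {
    return new Promise((resolve, reject) => {
      const timer = setTimeout(() => {
        this.pending.delete(callId);
        reject(new Error(`tool call ${callId} timed out`));
      }, timeoutMs);
      this.pending.set(callId, (output) => {
        clearTimeout(timer);
        this.pending.delete(callId);
        resolve(output);
      });
    });
  }

  // Called when the CLI reports a result; false if nothing was waiting.
  resolveResult(callId: string, output: string): boolean {
    const resolver = this.pending.get(callId);
    if (!resolver) return false;
    resolver(output);
    return true;
  }
}
```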
Environment Variables:
Set all required variables from the Configuration section.
Production checklist:
- Set `NODE_ENV=production`
- Use managed Redis (e.g., Redis Cloud, AWS ElastiCache)
- Add authentication middleware
- Configure CORS for your domains
- Set up monitoring and logging
- Use process manager (PM2, systemd)
Docker:
```
# Build
docker build -t openbug-server .

# Run
docker run -p 3000:3000 \
  -e OPENAI_API_KEY=sk-... \
  -e REDIS_HOST=redis \
  openbug-server
```

- Node.js 18+
- Redis 6+
- OpenAI API key
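For local deployments, the Docker commands above plus the Redis requirement can be combined in a compose file. This is a sketch; the service names and image tag are assumptions, not files shipped with the repo:

```yaml
services:
  redis:
    image: redis:7
    ports:
      - "6379:6379"

  server:
    build: .
    ports:
      - "3000:3000"
    environment:
      OPENAI_API_KEY: ${OPENAI_API_KEY}
      REDIS_HOST: redis
      REDIS_PORT: "6379"
    depends_on:
      - redis
```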
Contributions welcome! Please feel free to submit a Pull Request.
The codebase is designed to be hackable:
- Core services in `src/services/v2/`
- Tool coordination in `src/utils/redis.ts`
- WebSocket handling in `src/plugins/websockets.ts`
MIT