OpenBug CLI is an intelligent command-line tool that helps you debug your applications in real-time using AI assistance. Run your commands, view logs, and get AI-powered insights all in one interactive terminal interface.

OpenBug Server

OpenBug Logo

AI debugging agent backend for OpenBug CLI

License: MIT

OpenBug Server powers intelligent debugging for running applications. It connects to the CLI's local cluster via WebSocket, processes debugging queries using AI, and coordinates tool execution for log analysis and code search.


Status

🚧 Beta - OpenBug is actively developed and maintained. We ship updates regularly and welcome feedback.


Quick Start

Try the hosted version:

Install the OpenBug CLI, which connects to our hosted server by default:

npm install -g @openbug/cli
debug

Run your own server:

git clone https://github.com/openbug-ai/server.git
cd server
npm install

# Create .env file (see Configuration section)
cp .env.example .env

npm run dev

The server runs on http://localhost:3000. Point the CLI at your server:

export WEB_SOCKET_URL=ws://localhost:3000/v2/ws
export API_BASE_URL=http://localhost:3000/v2/api

Architecture

┌─────────────────────────────────────────────────────┐
│  CLI Local Cluster                                  │
│  • Captures logs from services                      │
│  • Executes tool calls (read_file, grep, etc.)      │
└──────────────────────┬──────────────────────────────┘
                       │
                       ↕ WebSocket
                       │
         ┌─────────────▼───────────────┐
         │                             │
         │     OpenBug AI Server       │
         │     (Fastify + ai-sdk)      │
         │                             │
         │   ┌───────────────────┐     │
         │   │   Agent Graph     │     │
         │   └───────────────────┘     │
         │                             │
         │   ┌───────────────────┐     │
         │   │   Redis           │     │
         │   │   (Coordination)  │     │
         │   └───────────────────┘     │
         │                             │
         └─────────────┬───────────────┘
                       │
                       ↕
         ┌─────────────▼───────────────┐
         │         OpenAI API          │
         └─────────────────────────────┘

How it works:

  1. CLI local cluster connects via WebSocket
  2. Server receives debugging query
  3. Server sends tool call requests (read files, search logs, grep code)
  4. CLI executes tools locally and returns results via Redis
  5. Server processes results with OpenAI using AI SDK
  6. Response streams back to CLI through WebSocket
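The round trip above can be sketched as a small message envelope correlated by a call ID. The message shapes, tool names, and Redis key scheme below are illustrative assumptions for the sketch, not the server's actual wire protocol:

```typescript
// Hypothetical sketch of the tool-call round trip described above.
// Field names and the Redis key layout are assumptions, not the
// server's real protocol.

interface ToolCallRequest {
  callId: string; // correlates the request with its eventual result
  tool: "read_file" | "grep" | "search_logs";
  args: Record<string, string>;
}

interface ToolCallResult {
  callId: string;
  output: string;
}

// Server side: after sending a ToolCallRequest over the WebSocket,
// the agent waits for the result to appear under a key derived from
// the session and call IDs.
function redisResultKey(sessionId: string, callId: string): string {
  return `tool:result:${sessionId}:${callId}`;
}

// CLI side: run the tool locally, then publish the result under the
// same callId so the server can resume the agent graph.
function makeResult(req: ToolCallRequest, output: string): ToolCallResult {
  return { callId: req.callId, output };
}
```

The call ID is what lets the server match an asynchronous result arriving through Redis back to the pending step in the agent graph.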

Core Features

  • AI-powered analysis - Uses Vercel AI SDK with OpenAI for debugging assistance
  • Tool coordination via Redis - Manages async tool execution between server and CLI
  • WebSocket-based debugging - Real-time bidirectional communication with CLI

Configuration

Create a .env file:

# Node Environment
NODE_ENV=development

# Server
PORT=3000

# Redis (required for tool coordination)
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_USERNAME=
REDIS_PASSWORD=

# OpenAI (required for AI debugging)
OPENAI_API_KEY=sk-...

# CLI Version Check
MINIMUM_CLI_VERSION=1.0.12
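The server needs the required variables above at startup. A minimal validation sketch (the loader function and its shape are illustrative, not the server's actual config code; only the variable names come from the .env example):

```typescript
// Illustrative startup validation for the .env variables above.
// Pass in process.env (or any plain object) and fail fast on gaps.

interface ServerConfig {
  port: number;
  redisHost: string;
  redisPort: number;
  openaiApiKey: string;
}

function loadConfig(env: Record<string, string | undefined>): ServerConfig {
  // REDIS_HOST and OPENAI_API_KEY are marked required in the example
  const required = ["REDIS_HOST", "OPENAI_API_KEY"] as const;
  for (const name of required) {
    if (!env[name]) throw new Error(`Missing required env var: ${name}`);
  }
  return {
    port: Number(env.PORT ?? 3000),           // defaults mirror the example
    redisHost: env.REDIS_HOST!,
    redisPort: Number(env.REDIS_PORT ?? 6379),
    openaiApiKey: env.OPENAI_API_KEY!,
  };
}
```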

Required Services

Redis - Coordinates tool execution between server and CLI. Install locally:

# macOS
brew install redis
brew services start redis

# Linux
sudo apt-get install redis-server
sudo systemctl start redis

# Docker
docker run -d -p 6379:6379 redis

OpenAI API Key - Get one at platform.openai.com


API Documentation

Swagger UI:

http://localhost:3000/v2/api/docs

WebSocket Endpoint:

ws://localhost:3000/v2/ws

The CLI connects here to establish debugging sessions.

HTTP Endpoints:

  • GET /v2/api/health - Health check
  • POST /v2/api/graph - Execute agent graph (for testing)
  • POST /v2/api/tool - Tool execution endpoint

Development

Start with hot reload:

npm run dev

Build for production:

npm run build    # Compiles to dist/ using SWC
npm start        # Runs the compiled output

Available scripts:

  • npm run dev - Development server with hot reload
  • npm run build - Production build (SWC)
  • npm run build:tsc - TypeScript compilation to build/
  • npm start - Start production server
  • npm run lint - Run ESLint
  • npm run lint:fix - Auto-fix linting errors

Project Structure

server/
├── src/
│   ├── apis/           # HTTP route handlers
│   │   ├── graph.ts    # Agent graph endpoint
│   │   ├── health.ts   # Health check
│   │   └── tool.ts     # Tool execution
│   ├── config/         # App configuration & Swagger
│   ├── errors/         # Error definitions
│   ├── exception/      # Exception handling
│   ├── interface/      # TypeScript interfaces
│   ├── plugins/        # Fastify plugins (WebSocket, etc.)
│   ├── services/       # Business logic
│   │   ├── v2/         # Agent graph implementation
│   │   └── tools/      # Tool execution services
│   ├── utils/          # Redis, socket manager, helpers
│   ├── app.ts          # Fastify app setup
│   └── server.ts       # Entry point
└── package.json

Privacy & Security

Code stays on your machine

The server requests file access through tool calls executed by the CLI. Only specific files or snippets queried by the AI are sent back to the server.

No built-in authentication

This open-source server doesn't include auth. The hosted service handles API key authentication. For production deployments, add your own authentication layer.
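One way to add such a layer is a bearer-token check registered as a Fastify preHandler hook. The function below is a generic sketch; the header-parsing logic is all it implements, and the hook wiring in the comment is an assumption about where it would plug in:

```typescript
// Illustrative bearer-token check for a self-hosted deployment.
// Not part of the server's codebase.

function isAuthorized(authHeader: string | undefined, apiKey: string): boolean {
  if (!authHeader || !authHeader.startsWith("Bearer ")) return false;
  return authHeader.slice("Bearer ".length) === apiKey;
}

// In src/app.ts you might register it roughly like:
//
//   fastify.addHook("preHandler", async (req, reply) => {
//     if (!isAuthorized(req.headers.authorization, process.env.API_KEY!)) {
//       reply.code(401).send({ error: "unauthorized" });
//     }
//   });
```

For real deployments, prefer a constant-time comparison (e.g. Node's crypto.timingSafeEqual) over `===` to avoid leaking key length and prefix information through timing.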

CORS Configuration

Configure allowed origins in src/app.ts:

fastify.register(cors, {
  origin: ['http://localhost:3000'],
  credentials: true
})

Environment isolation

  • OpenAI keys stay on the server
  • CLI users never see or need OpenAI credentials
  • Redis used only for ephemeral tool coordination

Deployment

Environment Variables:

Set all required variables from the Configuration section.

Production checklist:

  • Set NODE_ENV=production
  • Use managed Redis (e.g., Redis Cloud, AWS ElastiCache)
  • Add authentication middleware
  • Configure CORS for your domains
  • Set up monitoring and logging
  • Use process manager (PM2, systemd)

Docker:

# Build
docker build -t openbug-server .

# Run
docker run -p 3000:3000 \
  -e OPENAI_API_KEY=sk-... \
  -e REDIS_HOST=redis \
  openbug-server

Requirements

  • Node.js 18+
  • Redis 6+
  • OpenAI API key

Contributing

Contributions welcome! Please feel free to submit a Pull Request.

The codebase is designed to be hackable:

  • Core services in src/services/v2/
  • Tool coordination in src/utils/redis.ts
  • WebSocket handling in src/plugins/websockets.ts

License

MIT
