1 change: 1 addition & 0 deletions .gitignore
@@ -25,6 +25,7 @@ venv/
ENV/
config.local.js
*.local.json
package-lock.json

# Logs
logs
93 changes: 80 additions & 13 deletions CLAUDE.md
@@ -8,13 +8,15 @@ DBHub is a Universal Database Gateway implementing the Model Context Protocol (M

## Commands

- Build: `pnpm run build` - Compiles TypeScript to JavaScript using tsup
- Start: `pnpm run start` - Runs the compiled server
- Dev: `pnpm run dev` - Runs server with tsx (no compilation needed)
- Test: `pnpm test` - Run all tests
- Test Watch: `pnpm test:watch` - Run tests in watch mode
- Integration Tests: `pnpm test:integration` - Run database integration tests (requires Docker)
- Pre-commit: `./scripts/setup-husky.sh` - Setup git hooks for automated testing
- **Build**: `pnpm run build` - Compiles TypeScript to JavaScript using tsup
- **Start**: `pnpm run start` - Runs the compiled server
- **Dev**: `pnpm run dev` - Runs server with tsx (no compilation needed)
- **Cross-platform Dev**: `pnpm run crossdev` - Cross-platform development with tsx
- **Test**: `pnpm test` - Run all tests with Vitest
- **Test Watch**: `pnpm test:watch` - Run tests in watch mode
- **Integration Tests**: `pnpm test:integration` - Run database integration tests (requires Docker)
- **Pre-commit**: `./scripts/setup-husky.sh` - Set up git hooks for automated testing
- **Pre-commit Hook**: `pnpm run pre-commit` - Run lint-staged checks

## Architecture Overview

@@ -49,6 +51,7 @@
- **Connector Registry**: Dynamic registration system for database connectors
- **Transport Abstraction**: Support for both stdio (desktop tools) and HTTP (network clients)
- **Resource/Tool/Prompt Handlers**: Clean separation of MCP protocol concerns
- **Multi-Database Support**: Simultaneous connections to multiple databases with isolated contexts
- **Integration Test Base**: Shared test utilities for consistent connector testing

## Environment
@@ -63,6 +66,43 @@
- Demo mode: Use `--demo` flag for bundled SQLite employee database
- Read-only mode: Use `--readonly` flag to restrict to read-only SQL operations

## Multi-Database Support

DBHub supports connecting to multiple databases simultaneously:

### Configuration
- **Single Database**: Use `DSN` environment variable or `--dsn` command line argument
- **Multiple Databases**: Use `DSN_dev`, `DSN_test`, etc. environment variables

### Usage Examples

```bash
# Single database (backward compatible)
export DSN="postgres://user:pass@localhost:5432/mydb"

# Multiple databases
export DSN_dev="postgres://user:pass@localhost:5432/db1"
export DSN_test="mysql://user:pass@localhost:3306/db2"
export DSN_prod="sqlite:///path/to/database.db"
```

### HTTP Transport Endpoints
When using HTTP transport (`--transport=http`), multiple endpoints are available:

- `http://localhost:8080/message` - Default database (first configured)
- `http://localhost:8080/message/{databaseId}` - Specific database (e.g., `http://localhost:8080/message/db1`)
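
A minimal sketch of calling a database-specific endpoint, assuming the standard MCP JSON-RPC wire format over streamable HTTP; depending on the server's session handling, an `initialize` request may need to precede this call:

```typescript
// Query the "dev" endpoint for its available tools (illustrative only).
const response = await fetch("http://localhost:8080/message/dev", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Accept: "application/json, text/event-stream",
  },
  body: JSON.stringify({ jsonrpc: "2.0", id: 1, method: "tools/list" }),
});
console.log(await response.text());
```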

### STDIO Transport
- STDIO transport uses the default database
- Available databases are listed in startup messages
- Use HTTP transport for full multi-database access

### Database Context
All MCP tools, resources, and prompts support database-specific operations:
- Tools: `execute_sql_{databaseId}`
- Resources: Database-specific schema exploration
- Prompts: `generate_sql_{databaseId}`, `explain_db_{databaseId}`
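
For example, a `tools/call` request against a database configured as `DSN_dev` uses the suffixed tool name. This is an illustrative payload that assumes the tool accepts a `sql` argument; the authoritative schema is whatever `tools/list` reports:

```typescript
// Hypothetical JSON-RPC body for running a query against the "dev" database.
const executeSqlRequest = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call",
  params: {
    name: "execute_sql_dev",
    arguments: { sql: "SELECT 1" },
  },
};
```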

## Database Connectors

- Add new connectors in `src/connectors/{db-type}/index.ts`
Expand All @@ -78,12 +118,19 @@ Key architectural patterns:

## Testing Approach

- Unit tests for individual components and utilities
- Integration tests using Testcontainers for real database testing
- All connectors have comprehensive integration test coverage
- Pre-commit hooks run related tests automatically
- Test specific databases: `pnpm test src/connectors/__tests__/{db-type}.integration.test.ts`
- SSH tunnel tests: `pnpm test postgres-ssh-simple.integration.test.ts`
- **Unit Tests**: Individual components and utilities using Vitest
- **Integration Tests**: Real database testing using Testcontainers with Docker
- **Test Coverage**: All connectors have comprehensive integration test coverage
- **Pre-commit Hooks**: Automatic test execution via lint-staged
- **Test Specific Databases**:
- PostgreSQL: `pnpm test src/connectors/__tests__/postgres.integration.test.ts`
- MySQL: `pnpm test src/connectors/__tests__/mysql.integration.test.ts`
- MariaDB: `pnpm test src/connectors/__tests__/mariadb.integration.test.ts`
- SQL Server: `pnpm test src/connectors/__tests__/sqlserver.integration.test.ts`
- SQLite: `pnpm test src/connectors/__tests__/sqlite.integration.test.ts`
- SSH Tunnel: `pnpm test src/connectors/__tests__/postgres-ssh.integration.test.ts`
- JSON RPC: `pnpm test src/__tests__/json-rpc-integration.test.ts`
- **Test Utilities**: Shared integration test base in `src/connectors/__tests__/shared/integration-test-base.ts`

## SSH Tunnel Support

@@ -99,6 +146,26 @@ DBHub supports SSH tunnels for secure database connections through bastion hosts
- Default SSH key detection (tries `~/.ssh/id_rsa`, `~/.ssh/id_ed25519`, etc.)
- Tunnel lifecycle managed by `ConnectorManager`

## Development Environment

- **TypeScript**: Strict mode enabled with ES2020 target
- **Module System**: ES modules with `.js` extension in imports
- **Package Manager**: pnpm for dependency management
- **Build Tool**: tsup for TypeScript compilation
- **Test Framework**: Vitest for unit and integration testing
- **Development Runtime**: tsx for development without compilation

## Key Architectural Patterns

- **Connector Registry**: Dynamic registration system for database connectors with automatic DSN detection
- **Transport Abstraction**: Support for both stdio (desktop tools) and HTTP (network clients) with CORS protection
- **Resource/Tool/Prompt Handlers**: Clean separation of MCP protocol concerns
- **Multi-Database Management**: Simultaneous connections to multiple databases with database ID-based routing
- **Database Context Propagation**: Consistent database ID flow through all MCP handlers
- **SSH Tunnel Integration**: Automatic tunnel establishment when SSH config detected
- **Singleton Manager**: `ConnectorManager` provides unified interface across all database operations
- **Integration Test Base**: Shared test utilities for consistent connector testing
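
The singleton manager and ID-based routing described above can be pictured with a minimal sketch; this is not the actual `ConnectorManager` implementation (which also handles tunnels, lifecycle, and error reporting), and the import path is assumed:

```typescript
import type { Connector, DatabaseConnection } from "./connectors/interface.js";

// Illustrative singleton that routes lookups by database ID, falling back to
// the default database when no ID is supplied.
class ConnectorManagerSketch {
  private static readonly instance = new ConnectorManagerSketch();
  private readonly connections = new Map<string, DatabaseConnection>();

  static getInstance(): ConnectorManagerSketch {
    return ConnectorManagerSketch.instance;
  }

  register(connection: DatabaseConnection): void {
    this.connections.set(connection.id, connection);
  }

  getConnector(databaseId = "default"): Connector {
    const connection = this.connections.get(databaseId);
    if (!connection) {
      throw new Error(`Unknown database id: ${databaseId}`);
    }
    return connection.connector;
  }
}
```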

## Code Style

- TypeScript with strict mode enabled
29 changes: 28 additions & 1 deletion README.md
@@ -1,4 +1,4 @@
> [!NOTE]
> [!NOTE]
> Brought to you by [Bytebase](https://www.bytebase.com/), open-source database DevSecOps platform.

<p align="center">
@@ -118,6 +118,33 @@ dbhub:
- database
```

**Multiple Database Support:**

DBHub supports connecting to multiple databases simultaneously using a `.env` file mounted into the container:

```bash
# Create .env file with multiple database configurations
cat > .env << EOF
DSN_dev=postgres://user:password@localhost:5432/db1
DSN_test=mysql://user:password@localhost:3306/db2
DSN_prod=sqlite:///path/to/database.db
EOF

# Run container with .env file mounted
docker run --rm --init \
--name dbhub \
--publish 8080:8080 \
--volume "$(pwd)/.env:/app/.env" \
bytebase/dbhub \
--transport http \
--port 8080
```

Available endpoints when using multiple databases:

- `http://localhost:8080/message` - Default database (first configured)
- `http://localhost:8080/message/{databaseId}` - Specific database (e.g., `http://localhost:8080/message/dev`, `http://localhost:8080/message/test`)

### NPM

```bash
99 changes: 99 additions & 0 deletions src/config/env.ts
@@ -235,6 +235,105 @@ export function resolveDSN(): { dsn: string; source: string; isDemo?: boolean }
return null;
}

/**
* Resolve multiple DSN configurations from environment variables
* Supports both single DSN (backward compatible) and multiple DSN_* formats
* Returns a map of database IDs to DSN strings
*/
export function resolveMultiDSN(): Map<string, { dsn: string; source: string; isDemo?: boolean }> {
const multiDSN = new Map<string, { dsn: string; source: string; isDemo?: boolean }>();

// Get command line arguments
const args = parseCommandLineArgs();

// Check for demo mode first (highest priority)
if (isDemoMode()) {
multiDSN.set("default", {
dsn: "sqlite:///:memory:",
source: "demo mode",
isDemo: true,
});
return multiDSN;
}

// 1. Check command line arguments for single DSN
if (args.dsn) {
multiDSN.set("default", { dsn: args.dsn, source: "command line argument" });
return multiDSN;
}

// 2. Check for multiple DSN configurations from environment variables
const dsnPattern = /^DSN_(\w+)$/;
let foundMultiDSN = false;

// Check environment variables before loading .env
for (const [key, value] of Object.entries(process.env)) {
if (key === "DSN" && value) {
// Single DSN format (backward compatible)
multiDSN.set("default", { dsn: value, source: "environment variable" });
} else if (dsnPattern.test(key) && value) {
// Multiple DSN format: DSN_dev=postgres://...
const match = key.match(dsnPattern);
if (match) {
const id = match[1];
multiDSN.set(id, { dsn: value, source: "environment variable" });
foundMultiDSN = true;
}
}
}

// 3. If no DSN has been found yet, check for individual DB parameters
if (multiDSN.size === 0) {
const envParamsResult = buildDSNFromEnvParams();
if (envParamsResult) {
multiDSN.set("default", envParamsResult);
}
}

// 4. Try loading from .env files if no DSNs found yet
if (multiDSN.size === 0) {
const loadedEnvFile = loadEnvFiles();

if (loadedEnvFile) {
// Check for single DSN in .env file
if (process.env.DSN) {
multiDSN.set("default", { dsn: process.env.DSN, source: `${loadedEnvFile} file` });
}

// Check for multiple DSN configurations in .env file
for (const [key, value] of Object.entries(process.env)) {
if (dsnPattern.test(key) && value) {
const match = key.match(dsnPattern);
if (match) {
const id = match[1];
multiDSN.set(id, { dsn: value, source: `${loadedEnvFile} file` });
foundMultiDSN = true;
}
}
}

// Check for individual DB parameters from .env file
if (multiDSN.size === 0) {
const envFileParamsResult = buildDSNFromEnvParams();
if (envFileParamsResult) {
multiDSN.set("default", {
dsn: envFileParamsResult.dsn,
source: `${loadedEnvFile} file (individual parameters)`
});
}
}
}
}

// If we found multiple DSN configurations but no default, use the first one as default
if (foundMultiDSN && !multiDSN.has("default") && multiDSN.size > 0) {
const firstEntry = Array.from(multiDSN.entries())[0];
multiDSN.set("default", { ...firstEntry[1], source: `${firstEntry[1].source} (as default)` });
}

return multiDSN;
}
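
// Illustrative usage sketch (not part of this change): iterate the resolved
// configurations and report where each one came from.
//
//   const databases = resolveMultiDSN();
//   for (const [id, { dsn, source, isDemo }] of databases) {
//     console.log(`[${id}] ${dsn} (from ${source}${isDemo ? ", demo" : ""})`);
//   }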

/**
* Resolve transport type from command line args or environment variables
* Returns 'stdio' or 'http' (streamable HTTP), with 'stdio' as the default
18 changes: 18 additions & 0 deletions src/connectors/interface.ts
@@ -149,6 +149,24 @@
executeSQL(sql: string, options: ExecuteOptions): Promise<SQLResult>;
}

/**
* Database connection configuration
*/
export interface DatabaseConnection {
/** Unique identifier for this database connection */
id: string;
/** Database connector instance */
connector: Connector;
/** Database connection string */
dsn: string;
/** Source of the DSN configuration */
source: string;
/** Whether this is a demo database */
isDemo?: boolean;
/** SSH tunnel configuration if applicable */
sshConfig?: any;
}
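
// Illustrative shape (assumes a connector instance already obtained from the
// registry; not part of this change):
//
//   const devConnection: DatabaseConnection = {
//     id: "dev",
//     connector: postgresConnector,
//     dsn: "postgres://user:pass@localhost:5432/db1",
//     source: "environment variable",
//   };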

/**
* Registry for available database connectors
*/