DBFuse AI


DBFuse AI is a simple web UI, distributed as an npm package, to connect to your databases, run SQL, and generate SQL with AI. It works with MySQL, PostgreSQL, SQL Server, Oracle, and SQLite, with support for non-relational databases planned for the future.


Features

  • Connect to Local/Remote Databases
    Easily connect to databases on your local machine or remote servers.

  • CRUD Operations
    Perform Create, Read, Update, and Delete actions on databases and tables.

  • Multi-Tab Support
    Work across multiple databases or queries simultaneously, with a smooth and responsive interface.

  • Query Editor with Autocompletion
    Write queries faster with intelligent autocompletion and syntax highlighting.

  • Improved Pagination
    Navigate large datasets with optimized pagination.

  • Rich User Interface
    A dynamic and user-friendly interface designed to enhance productivity.

  • Dynamic Result Grid
    Visualize query results with a responsive, grid-based layout.

  • Basic Authentication
    Optional authentication for added security when running on remote servers.

  • Clipboard Copy
    Quickly copy cell data with a single click.

  • AI Integration
    Leverage OpenAI and Google Gemini to generate intelligent SQL queries and talk to your selected database.

Prerequisites

  • Node.js 16.0.0 or above
    Install Node.js from the official downloads page. Alternatively, manage versions dynamically with tools like nvm on macOS, Linux, or WSL.

  • npm 8.0.0 or above
    npm is bundled with Node.js. To upgrade, use:

    npm i -g npm
  • A running database you can connect to (or a SQLite file)
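Before installing, you can sanity-check the prerequisites with a small snippet (a sketch assuming a POSIX shell; the version minimums are the ones stated above):

```shell
# Verify Node.js >= 16 and npm >= 8 by parsing the major version numbers.
node_version=$(node -v 2>/dev/null || true)
node_version=${node_version#v}          # strip the leading "v" from e.g. v18.19.0
npm_version=$(npm -v 2>/dev/null || true)

if [ "${node_version%%.*}:-0" != ":-0" ] && [ "${node_version%%.*}" -ge 16 ] 2>/dev/null \
   && [ "${npm_version%%.*}" -ge 8 ] 2>/dev/null; then
  echo "Prerequisites OK (node $node_version, npm $npm_version)"
else
  echo "Need Node.js >= 16 and npm >= 8 (found node: ${node_version:-none}, npm: ${npm_version:-none})" >&2
fi
```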

Ways to run

Pick the option that fits your setup. All commands below assume a Bash-compatible shell (Windows users can use Git Bash).

  1. Global CLI (optional)
  • The command name is dbfuse-ai. Install it globally from npm, then run it with optional flags:

    npm install -g dbfuse-ai
    dbfuse-ai              # starts interactively and asks for options
    
    # or non-interactive with arguments
    dbfuse-ai -p 5000 --model gemini-2.5-flash --apikey <YOUR_API_KEY>

    Command-line options:

    • -p, --port <number>: Server port (default 5000)
    • --dbuser <username> and --dbpass <password>: Set Basic Auth credentials for the web UI
    • --model <name> and --apikey <key>: Enable AI with the selected model and API key
    • -v, --verbose: Show detailed prompts and info in the CLI

    Supported AI providers include: Gemini, OpenAI, Anthropic, Mistral, Cohere, Hugging Face, and Perplexity. Without --model and --apikey, the CLI will ask whether to enable AI and guide you interactively.

    Then open http://localhost:5000.

  2. Docker
  • Use the prebuilt image for a quick start. Create a docker-compose.yml like:

    version: "3.8"
    services:
      dbfuse-ai:
        container_name: dbfuse-ai
        image: shashikumarkasturi/dbfuse-ai:latest
        restart: unless-stopped
        ports:
          - "5000:5000"
        environment:
          - PORT=5000
          # Optional basic auth for UI (set both to enable)
          - DBFUSE_USERNAME=admin
          - DBFUSE_PASSWORD=admin
          # AI configuration (optional)
          - AI_PROVIDER=gemini
          - AI_MODEL=gemini-2.5-flash
          - AI_API_KEY=
        extra_hosts:
          - "host.docker.internal:host-gateway"
        tty: true

    Then run:

    docker compose up -d

    Open http://localhost:5000 and log in. To stop:

    docker compose down
  3. Docker (development, hot reload)
  • Run straight from your source tree with live reload using the provided docker-compose-dev.yml:

    docker compose -f docker-compose-dev.yml up

    Notes:

    • The container mounts your working folder and runs npm install && npm run start (nodemon) for the server.
    • It includes extra_hosts: host.docker.internal:host-gateway so the app can reach databases running on your host.
    • Leave AI_API_KEY empty in the YAML; export it locally instead of committing a real key.
    • You can copy .env.example to .env and customize values for local development.
  4. Local development (server + client)
  • Install dependencies at the repo root:

    npm install
  • Start both backend and frontend together (concurrently):

    npm run dev

    This starts the backend (port 5000) and the Angular dev server (port 4200) concurrently.

  5. Local development (server only)
  • Build frontend assets once and serve them from the backend:

    cd client/dbfuse-ai-client
    npm install
    npm run clean-build-compress
    cd ../../
    npm run start

    The backend serves the built UI from src/public at http://localhost:5000.

AI Integration

DBFuse AI integrates OpenAI and Google Gemini to generate intelligent SQL queries from natural language prompts.

Setting Up AI Integration

To enable AI-powered prompt querying, you need to set up the API keys for OpenAI or Google Gemini:

  1. Obtain an API Key
    • For OpenAI: Visit OpenAI's platform and generate an API key.
    • For Google Gemini: Follow Google's instructions to obtain an API key (the free tier provides 15 requests per minute).
  2. Add the API Key to Your Environment Variables
    In the root directory of your project, create a .env file (or set env vars in Docker):
AI_PROVIDER=gemini
AI_MODEL=gemini-2.5-flash
AI_API_KEY=<YOUR_API_KEY>
  3. Restart DBFuse AI
    Restart the application to activate AI integration.

Using AI Prompt Querying

  • Enable AI from the user interface using the AI toggle button.
  • Write a natural-language query in the input box and click the AI Prompt button; the AI will generate the corresponding SQL query.
  • The AI can join tables, generate aggregated queries, and suggest optimal SQL syntax based on the database you selected.

Example Workflow:

  1. Write a query prompt: Find the average salary of employees in each department.
  2. The AI generates the SQL:
SELECT Department, AVG(Salary) AS AvgSalary FROM employeerecords GROUP BY Department;

Supported AI Models

Below is the current list of model IDs you can use with --model (CLI), the Config UI, or via environment variables. Keep provider + model aligned (provider casing is normalized automatically).

Provider Models
Gemini gemini-2.5-flash, gemini-2.5-pro
OpenAI gpt-5, gpt-5-mini, gpt-5-nano, gpt-4.1, gpt-4o
Anthropic claude-opus-4-1, claude-opus-4, claude-sonnet-4, claude-3-7-sonnet, claude-3-5-haiku
Mistral mistral-medium-2508, mistral-large-2411, mistral-small-2407, codestral-2508
Cohere command-a-03-2025, command-a-reasoning-08-2025, command-a-vision-07-2025, command-r7b-12-2024
HuggingFace microsoft/DialoGPT-medium, facebook/blenderbot-400M-distill, microsoft/DialoGPT-large
Perplexity sonar, sonar-pro, sonar-reasoning, sonar-reasoning-pro, sonar-deep-research

Notes:

  1. HuggingFace models are examples; you can substitute any compatible chat/text model available to your account.
  2. Perplexity models use the OpenAI-compatible API surface; the app sets the correct base URL automatically.
  3. Additional models can be added by editing src/models/model.js (server) and cli.js (CLI list) — keep both in sync for best UX.
  4. If you only set AI_MODEL, the provider is inferred automatically (e.g. any claude-* → Anthropic).
  5. The generic AI_API_KEY is mirrored to provider-specific variables (OPENAI_API_KEY, GOOGLE_API_KEY, etc.) internally.
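The provider inference mentioned in note 4 can be sketched as a simple prefix match on the model ID. The prefix table below is illustrative, not the actual src/models/model.js implementation:

```javascript
// Map a model-ID prefix to its provider; first match wins.
// Prefixes are assumptions based on the model table above.
const prefixToProvider = [
  ["gemini", "gemini"],
  ["gpt-", "openai"],
  ["claude", "anthropic"],
  ["mistral", "mistral"],
  ["codestral", "mistral"],
  ["command", "cohere"],
  ["sonar", "perplexity"],
];

function inferProvider(model) {
  const m = String(model).toLowerCase();
  const hit = prefixToProvider.find(([prefix]) => m.startsWith(prefix));
  // Fall back to HuggingFace, whose model IDs use an org/name form.
  return hit ? hit[1] : "huggingface";
}

console.log(inferProvider("claude-3-5-haiku")); // anthropic
```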

Security Note

  • Confidentiality: Never expose your .env file containing the API key.
  • Secure Connections: Ensure your database connections and API keys are handled securely, especially in production.

Environment Variables

  • PORT: Server port (default 5000)
  • DBFUSE_USERNAME / DBFUSE_PASSWORD: enable Basic Auth for the web UI (optional)
  • AI_PROVIDER, AI_MODEL, AI_API_KEY: AI settings (optional)
  • BODY_SIZE: request body size limit (default 50mb)
  • NODE_ENV: set to production in containers for best performance

Note: Database connection details are entered via the UI; no DB URL environment variables are used by the server.
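How these variables combine with their documented defaults can be sketched as follows (an illustrative function, not the actual DBFuse AI server code):

```javascript
// Resolve the documented environment variables, applying defaults.
function resolveConfig(env) {
  return {
    port: Number(env.PORT) || 5000,      // PORT, default 5000
    bodySize: env.BODY_SIZE || "50mb",   // BODY_SIZE, default 50mb
    // Basic Auth is enabled only when both credentials are set
    basicAuth: Boolean(env.DBFUSE_USERNAME && env.DBFUSE_PASSWORD),
    // AI is configured only when an API key is present
    ai: env.AI_API_KEY
      ? { provider: env.AI_PROVIDER, model: env.AI_MODEL, apiKey: env.AI_API_KEY }
      : null,
  };
}

console.log(resolveConfig({ PORT: "8080" }).port); // 8080
```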

Tips:

  • When running inside Docker, use host.docker.internal to connect to databases on your host machine (we add extra_hosts for Linux compatibility).
  • Prefer exporting secrets (like AI_API_KEY) in your shell or using Docker secrets; avoid committing real keys to version control.
  • For a quick start, copy .env.example to .env and adjust values. The app reads .env automatically.

Supported Databases

  • MySQL
  • PostgreSQL
  • Microsoft SQL Server
  • Oracle Database
  • SQLite

CLI (optional)

If you installed globally with npm, you can start with:

dbfuse-ai -p 5000 --model gemini-2.5-flash --apikey <YOUR_API_KEY>

Then open http://localhost:5000.

Testing (optional)

Basic connectivity test suites are available:

npm run test:all       # run all DB connectivity tests (requires databases available)
npm run test:mysql     # MySQL
npm run test:postgres  # PostgreSQL
npm run test:mssql     # SQL Server
npm run test:oracle    # Oracle

These tests expect databases reachable at the configured defaults; adjust environment variables as needed.

Basic Authentication (Optional)

Protect data with basic authentication when running on remote servers.

  1. Create a .env file in the root directory.
  2. Add the following variables:
DBFUSE_USERNAME=<your_username>
DBFUSE_PASSWORD=<your_password>
  3. Restart the server.

To disable authentication, remove these variables from .env and restart the server.
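The credential check behind this feature can be sketched as a standard HTTP Basic Auth comparison (an illustrative Node.js function, not the actual DBFuse AI middleware):

```javascript
// Validate an Authorization header of the form "Basic base64(user:pass)"
// against the configured DBFUSE_USERNAME / DBFUSE_PASSWORD values.
function checkBasicAuth(authHeader, username, password) {
  if (!authHeader || !authHeader.startsWith("Basic ")) return false;
  const decoded = Buffer.from(authHeader.slice(6), "base64").toString("utf8");
  const idx = decoded.indexOf(":");
  if (idx === -1) return false;
  return decoded.slice(0, idx) === username && decoded.slice(idx + 1) === password;
}

const header = "Basic " + Buffer.from("admin:secret").toString("base64");
console.log(checkBasicAuth(header, "admin", "secret")); // true
```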

Upcoming Features

  • Additional databases: MariaDB, NoSQL and caches via strategy adapters (MongoDB, Redis)
  • SSH tunneling and client certificate auth for secure remote connections
  • Query history, saved connections/snippets, and export to CSV/JSON/Excel
  • Schema explorer improvements (indexes/constraints), and ER diagram view
  • AI: Explain/optimize queries and suggest indexes in addition to SQL generation
  • Charts and visual analysis for query results (line/bar/pie), with quick pivots
  • Pluggable driver/extension SDK to add new databases and tools
  • MCP servers implementation to connect to different databases

Contributions

DBFuse AI is open for contributions! If you have ideas for features, improvements, or bug fixes, feel free to submit a pull request or open an issue.

Contributors quickstart

  • Fork and clone the repository.
  • Install deps at repo root: npm install.
  • Dev server + client: npm run dev (backend on 5000, Angular dev server on 4200).
  • Server only with built UI: build via cd client/dbfuse-ai-client && npm run clean-build-compress, then run npm run start from the repo root.
  • Configuration:
    • .env is hot-reloaded (most changes apply instantly). PORT change triggers an automatic restart.
    • You can override where .env lives by setting DBFUSE_CONFIG_DIR.
    • Prefer setting AI keys via your shell variables (do not commit real keys).

Adding a new AI model/provider

  1. Backend: Update src/models/model.js to include the new model ID and provider; wire the provider’s key name if needed.
  2. CLI: Add the model to supportedModels in cli.js so it appears in interactive selection.
  3. UI: Update the list in client/dbfuse-ai-client/src/app/lib/components/config/config.component.ts.
  4. Keep README’s Supported AI Models table in sync (or ask maintainers to update).
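The registry that steps 1 and 2 keep in sync could look roughly like this (a hypothetical sketch; the real lists live in src/models/model.js and cli.js, and these entries are only a subset for illustration):

```javascript
// Hypothetical provider -> model-ID registry, mirroring the README table.
const supportedModels = {
  gemini: ["gemini-2.5-flash", "gemini-2.5-pro"],
  openai: ["gpt-5", "gpt-5-mini", "gpt-4o"],
  anthropic: ["claude-opus-4-1", "claude-sonnet-4"],
};

// Provider casing is normalized, matching the behavior noted in the
// Supported AI Models section.
function isSupported(provider, model) {
  const models = supportedModels[String(provider).toLowerCase()] || [];
  return models.includes(model);
}

console.log(isSupported("Gemini", "gemini-2.5-flash")); // true
```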

Adding a new database strategy

  1. Create a strategy in src/config/db_strategies/ (follow patterns from existing drivers).
  2. Register it in the connection manager.
  3. Add a minimal connectivity test under src/config/tests/.
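A new strategy following those steps might be outlined like this (class and method names are illustrative, not the actual src/config/db_strategies interface):

```javascript
// Hypothetical skeleton of a database strategy adapter.
class MariaDBStrategy {
  constructor(config) {
    this.config = config;   // host, port, user, password, database
    this.connected = false;
  }
  async connect() {
    // A real driver would open a connection pool here (e.g. mariadb.createPool).
    this.connected = true;
  }
  async query(sql, params = []) {
    if (!this.connected) throw new Error("not connected");
    // A real driver would execute sql with params and return the result rows.
    return { sql, params, rows: [] };
  }
  async disconnect() {
    // A real driver would drain and close the pool here.
    this.connected = false;
  }
}
```

The connection manager would then instantiate the strategy from the connection details entered in the UI and dispatch queries through this common interface.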

Tests & formatting

  • Run all DB connectivity tests: npm run test:all (requires accessible databases).
  • Lint/format: npm run check (eslint + prettier) or npm run lint:fix / npm run format:fix.

Releases & publishing (maintainers)

  • Version bump in package.json, then create a git tag vX.Y.Z and push the tag.
  • CI builds the Angular client, publishes the npm package, builds/pushes Docker image, and syncs Docker Hub README.
  • Publishing is gated to tag builds; main runs build/verify only.

Demo

Demo video: basic_working.mp4

License

DBFuse AI is distributed under the MIT License. This license permits commercial use, modification, distribution, and private use, with the requirement to include the original copyright and license notice.
