DBFuse AI is a simple web UI to connect to your databases, run SQL, and generate SQL with AI. It works with MySQL, PostgreSQL, SQL Server, Oracle, and SQLite.
Quick links:
- NPM: https://www.npmjs.com/package/dbfuse-ai
- Docker Hub: https://hub.docker.com/r/shashikumarkasturi/dbfuse-ai
- Issues: https://github.com/kshashikumar/dbfuse-ai/issues
- Pull Requests: https://github.com/kshashikumar/dbfuse-ai/pulls
Features:
- **Connect to Local/Remote Databases**: Easily connect to databases on your local machine or remote servers.
- **CRUD Operations**: Perform Create, Read, Update, and Delete actions on databases and tables.
- **Multi-Tab Support**: Work across multiple databases or queries simultaneously, with a smooth and responsive interface.
- **Query Editor with Autocompletion**: Write queries faster with intelligent autocompletion and syntax highlighting.
- **Improved Pagination**: Navigate large datasets with optimized pagination.
- **Rich User Interface**: A dynamic, user-friendly interface designed to enhance productivity.
- **Dynamic Result Grid**: Visualize query results in a responsive, grid-based layout.
- **Basic Authentication**: Optional authentication for added security when running on remote servers.
- **Clipboard Copy**: Quickly copy cell data with a single click.
- **AI Integration**: Leverage OpenAI and Google Gemini to generate intelligent SQL queries; talk to your selected database.
Prerequisites:
- **Node.js 16.0.0 or above**: Install Node.js from the official downloads page, or manage versions with a tool like nvm on macOS, Linux, or WSL.
- **npm 8.0.0 or above**: npm is bundled with Node.js. To upgrade, run `npm i -g npm`.
- **A running database you can connect to (or a SQLite file)**

Pick the option that fits your setup. All commands below assume a Bash-compatible shell (Windows users can use Git Bash).
- Global CLI (optional)
The command name is `dbfuse-ai`. Install it globally from npm, then run it with optional flags:

```shell
npm install -g dbfuse-ai

dbfuse-ai   # starts interactively and asks for options
# or non-interactive with arguments
dbfuse-ai -p 5000 --model gemini-2.5-flash --apikey <YOUR_API_KEY>
```
Command-line options:
- `-p, --port <number>`: Server port (default 5000)
- `--dbuser <username>` and `--dbpass <password>`: Set Basic Auth credentials for the web UI
- `--model <name>` and `--apikey <key>`: Enable AI with the selected model and API key
- `-v, --verbose`: Show detailed prompts and info in the CLI
Supported AI providers include: Gemini, OpenAI, Anthropic, Mistral, Cohere, Hugging Face, and Perplexity. Without `--model` and `--apikey`, the CLI will ask whether to enable AI and guide you interactively. Then open http://localhost:5000.
- Docker
Use the prebuilt image for a quick start. Create a `docker-compose.yml` like:

```yaml
version: "3.8"
services:
  dbfuse-ai:
    container_name: dbfuse-ai
    image: shashikumarkasturi/dbfuse-ai:latest
    restart: unless-stopped
    ports:
      - "5000:5000"
    environment:
      - PORT=5000
      # Optional basic auth for UI (set both to enable)
      - DBFUSE_USERNAME=admin
      - DBFUSE_PASSWORD=admin
      # AI configuration (optional)
      - AI_PROVIDER=gemini
      - AI_MODEL=gemini-2.5-flash
      - AI_API_KEY=
    extra_hosts:
      - "host.docker.internal:host-gateway"
    tty: true
```
Then run:

```shell
docker compose up -d
```

Open http://localhost:5000 and log in. To stop:

```shell
docker compose down
```
- Docker (development, hot reload)
Run straight from your source tree with live reload using the provided `docker-compose-dev.yml`:

```shell
docker compose -f docker-compose-dev.yml up
```
Notes:
- The container mounts your working folder and runs `npm install && npm run start` (nodemon) for the server.
- It includes `extra_hosts: host.docker.internal:host-gateway` so the app can reach databases running on your host.
- Leave `AI_API_KEY` empty in the YAML; export it locally instead of committing a real key.
- You can copy `.env.example` to `.env` and customize values for local development.
- Local development (server + client)
Install dependencies at the repo root:

```shell
npm install
```

Start both backend and frontend together (concurrently):

```shell
npm run dev
```
This runs:
- Backend (Express) with hot reload at http://localhost:5000
- Frontend (Angular dev server) at http://localhost:4200
- Local development (server only)
Build frontend assets once and serve them from the backend:

```shell
cd client/dbfuse-ai-client
npm install
npm run clean-build-compress
cd ../../
npm run start
```

The backend serves the built UI from `src/public` at http://localhost:5000.
DBFuse AI integrates AI providers, including OpenAI and Google Gemini, to generate intelligent SQL queries.
To enable AI-powered prompt querying, you need to set up the API keys for OpenAI or Google Gemini:
- Obtain an API Key
- For OpenAI: Visit OpenAI's platform and generate an API key.
- For Google Gemini: obtain an API key from Google's platform (the free tier provides 15 requests per minute).
- Add the API key to your environment variables. In the root directory of your project, create a `.env` file (or set env vars in Docker):

  ```
  AI_PROVIDER=gemini
  AI_MODEL=gemini-2.5-flash
  AI_API_KEY=<YOUR_API_KEY>
  ```

- Restart DBFuse to activate AI integration.
- Enable AI from the user interface using the AI toggle button.
- Write a natural-language query in the input box and click the AI Prompt button; the AI will generate the corresponding SQL query.
- The AI can join tables, generate aggregated queries, and suggest optimal SQL syntax based on the database you selected.
Example workflow:
- Write a query prompt: "Find the average salary of employees in each department."
- The AI generates the SQL:

```sql
SELECT Department, AVG(Salary) AS AvgSalary FROM employeerecords GROUP BY Department;
```

Below is the current list of model IDs you can use with `--model` (CLI), the Config UI, or environment variables. Keep provider and model aligned (provider casing is normalized automatically).
| Provider | Models |
|---|---|
| Gemini | gemini-2.5-flash, gemini-2.5-pro |
| OpenAI | gpt-5, gpt-5-mini, gpt-5-nano, gpt-4.1, gpt-4o |
| Anthropic | claude-opus-4-1, claude-opus-4, claude-sonnet-4, claude-3-7-sonnet, claude-3-5-haiku |
| Mistral | mistral-medium-2508, mistral-large-2411, mistral-small-2407, codestral-2508 |
| Cohere | command-a-03-2025, command-a-reasoning-08-2025, command-a-vision-07-2025, command-r7b-12-2024 |
| HuggingFace | microsoft/DialoGPT-medium, facebook/blenderbot-400M-distill, microsoft/DialoGPT-large |
| Perplexity | sonar, sonar-pro, sonar-reasoning, sonar-reasoning-pro, sonar-deep-research |
Notes:
- HuggingFace models are examples; you can substitute any compatible chat/text model available to your account.
- Perplexity models use the OpenAI-compatible API surface; the app sets the correct base URL automatically.
- Additional models can be added by editing `src/models/model.js` (server) and `cli.js` (CLI list); keep both in sync for best UX.
- If you only set `AI_MODEL`, the provider is inferred automatically (e.g. any `claude-*` model maps to Anthropic).
- The generic `AI_API_KEY` is mirrored to provider-specific variables (`OPENAI_API_KEY`, `GOOGLE_API_KEY`, etc.) internally.
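As an illustrative sketch, provider inference from a model ID can work by matching prefixes from the table above. This is not the actual code in `src/models/model.js`; the function name and mapping are assumptions:

```javascript
// Illustrative sketch only: infer the AI provider from a model ID prefix.
// The prefixes mirror the model table above; the real logic lives in
// src/models/model.js and may differ.
function inferProvider(model) {
  if (model.startsWith("gemini-")) return "gemini";
  if (model.startsWith("gpt-")) return "openai";
  if (model.startsWith("claude-")) return "anthropic";
  if (model.startsWith("mistral-") || model.startsWith("codestral-")) return "mistral";
  if (model.startsWith("command-")) return "cohere";
  if (model.startsWith("sonar")) return "perplexity";
  if (model.includes("/")) return "huggingface"; // e.g. microsoft/DialoGPT-medium
  return null; // unknown: fall back to asking the user for a provider
}
```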
- Confidentiality: never expose your `.env` file containing the API key.
- Secure connections: ensure your database connections and API keys are used securely, especially in production.
- PORT: Server port (default 5000)
- DBFUSE_USERNAME / DBFUSE_PASSWORD: enable Basic Auth for the web UI (optional)
- AI_PROVIDER, AI_MODEL, AI_API_KEY: AI settings (optional)
- BODY_SIZE: request body size limit (default 50mb)
- NODE_ENV: set to `production` in containers for best performance

Note: Database connection details are entered via the UI; no DB URL environment variables are used by the server.
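Putting the variables above together, a minimal `.env` might look like the following (all values are placeholders, not required defaults):

```
PORT=5000
DBFUSE_USERNAME=admin
DBFUSE_PASSWORD=change-me
AI_PROVIDER=gemini
AI_MODEL=gemini-2.5-flash
AI_API_KEY=<YOUR_API_KEY>
BODY_SIZE=50mb
NODE_ENV=production
```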
Tips:
- When running inside Docker, use `host.docker.internal` to connect to databases on your host machine (we add `extra_hosts` for Linux compatibility).
- Prefer exporting secrets (like `AI_API_KEY`) in your shell or using Docker secrets; avoid committing real keys to version control.
- For a quick start, copy `.env.example` to `.env` and adjust values. The app reads `.env` automatically.
Supported databases:
- MySQL
- PostgreSQL
- Microsoft SQL Server
- Oracle Database
- SQLite
If you installed globally with npm, you can start with:
```shell
dbfuse-ai -p 5000 --model gemini-2.5-flash --apikey <YOUR_API_KEY>
```

Then open http://localhost:5000.
Basic connectivity test suites are available:
```shell
npm run test:all       # run all DB connectivity tests (requires databases available)
npm run test:mysql     # MySQL
npm run test:postgres  # PostgreSQL
npm run test:mssql     # SQL Server
npm run test:oracle    # Oracle
```

These tests expect databases reachable at the configured defaults; adjust environment variables as needed.
Protect data with basic authentication when running on remote servers.
- Create a `.env` file in the root directory.
- Add the following variables:

  ```
  DBFUSE_USERNAME=<your_username>
  DBFUSE_PASSWORD=<your_password>
  ```

- Restart the server.
To disable authentication, remove these variables from `.env` and restart the server.
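For intuition, HTTP Basic Auth sends `Authorization: Basic base64(username:password)`, and the server compares the decoded pair against the configured credentials. A minimal sketch, assuming Node.js (this is not DBFuse AI's actual middleware; the function name is made up):

```javascript
// Sketch of a Basic Auth credential check. Shown only to illustrate how
// DBFUSE_USERNAME / DBFUSE_PASSWORD would be compared against the header.
function checkBasicAuth(header, username, password) {
  if (!header || !header.startsWith("Basic ")) return false;
  // Header format: "Basic base64(username:password)"
  const decoded = Buffer.from(header.slice(6), "base64").toString("utf8");
  const sep = decoded.indexOf(":");
  if (sep === -1) return false;
  return decoded.slice(0, sep) === username && decoded.slice(sep + 1) === password;
}
```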
- Additional databases: MariaDB, plus NoSQL stores and caches via strategy adapters (MongoDB, Redis)
- SSH tunneling and client certificate auth for secure remote connections
- Query history, saved connections/snippets, and export to CSV/JSON/Excel
- Schema explorer improvements (indexes/constraints), and ER diagram view
- AI: Explain/optimize queries and suggest indexes in addition to SQL generation
- Charts and visual analysis for query results (line/bar/pie), with quick pivots
- Pluggable driver/extension SDK to add new databases and tools
- MCP servers implementation to connect to different databases
DBFuse AI is open for contributions! If you have ideas for features, improvements, or bug fixes, feel free to submit a pull request or open an issue.
- Fork and clone the repository.
- Install deps at repo root: `npm install`.
- Dev server + client: `npm run dev` (backend on 5000, Angular dev server on 4200).
- Server only with built UI: build in `client/dbfuse-ai-client` via `npm run clean-build-compress`, then `npm run start`.
- Configuration: `.env` is hot-reloaded (most changes apply instantly); a PORT change triggers an automatic restart.
  - You can override where `.env` lives by setting `DBFUSE_CONFIG_DIR`.
- Prefer setting AI keys via your shell variables (do not commit real keys).
- Backend: update `src/models/model.js` to include the new model ID and provider; wire the provider's key name if needed.
- CLI: add the model to `supportedModels` in `cli.js` so it appears in interactive selection.
- UI: update the list in `client/dbfuse-ai-client/src/app/lib/components/config/config.component.ts`.
- Keep the README's Supported AI Models table in sync (or ask maintainers to update it).
- Create a strategy in `src/config/db_strategies/` (follow the patterns from existing drivers).
- Register it in the connection manager.
- Add a minimal connectivity test under `src/config/tests/`.
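The shape of such a strategy might look like the following sketch. The class and method names here are hypothetical; the real interface is whatever the existing drivers in `src/config/db_strategies/` use:

```javascript
// Hypothetical strategy adapter sketch: connect, query, disconnect.
// A real driver would wrap an actual database client library.
class ExampleDbStrategy {
  constructor(config) {
    this.config = config;   // host, port, user, password, database, ...
    this.connected = false;
  }
  async connect() {
    // A real driver would open a connection or pool using this.config.
    this.connected = true;
  }
  async query(sql) {
    if (!this.connected) throw new Error("not connected");
    // A real driver would execute sql and return rows; we echo for the sketch.
    return { rows: [], sql };
  }
  async disconnect() {
    this.connected = false;
  }
}
```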
- Run all DB connectivity tests: `npm run test:all` (requires accessible databases).
- Lint/format: `npm run check` (eslint + prettier), or `npm run lint:fix` / `npm run format:fix`.
- Bump the version in package.json, then create a git tag `vX.Y.Z` and push the tag.
- CI builds the Angular client, publishes the npm package, builds and pushes the Docker image, and syncs the Docker Hub README.
- Publishing is gated to tag builds; main runs build/verify only.
Demo video: basic_working.mp4
DBFuse AI is distributed under the MIT License. This license permits commercial use, modification, distribution, and private use, with the requirement to include the original copyright and license notice.
