Commit f6c43b7 (parent 19cabc8)

feat: update docs

2 files changed: 111 additions, 106 deletions


`.github/release-please/config.json` (4 additions, 1 deletion):

````diff
@@ -14,5 +14,8 @@
   ],
   "packages": {
     ".": {}
-  }
+  },
+  "extra-files": [
+    "README.md"
+  ]
 }
````
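The new `extra-files` entry asks release-please to bump version strings inside `README.md` on each release as well. With the generic updater, only text between marker comments is rewritten, which is why the README wraps its version-pinned snippets in markers. A sketch of the pattern (the image tag shown is illustrative):

````markdown
[//]: # (x-release-please-start-version)
```bash
docker run ghcr.io/firebolt-db/mcp-server:0.2.0
```
[//]: # (x-release-please-end)
````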

`README.md` (107 additions, 105 deletions):
````diff
@@ -10,135 +10,156 @@
   A Model Context Protocol implementation that connects your LLM to Firebolt Data Warehouse
 </h4>
 
-<p align="center">
-  <a href="https://github.com/firebolt-db/mcp-server/releases">
-    <img src="https://img.shields.io/github/v/release/firebolt-db/mcp-server" alt="Release">
-  </a>
-  <a href="https://github.com/firebolt-db/mcp-server/blob/main/LICENSE">
-    <img src="https://img.shields.io/github/license/firebolt-db/mcp-server" alt="License">
-  </a>
-  <a href="https://go.dev">
-    <img src="https://img.shields.io/badge/go-1.24.1-blue" alt="Go Version">
-  </a>
-  <a href="https://github.com/firebolt-db/mcp-server/actions/workflows/build.yml">
-    <img src="https://img.shields.io/github/actions/workflow/status/firebolt-db/mcp-server/build.yml" alt="Build Status">
-  </a>
-</p>
-
 <p align="center">
   <a href="#key-features">Key Features</a> |
   <a href="#how-to-use">How To Use</a> |
-  <a href="#requirements">Requirements</a> |
+  <a href="#connecting-your-llm">Connecting Your LLM</a> |
   <a href="#architecture">Architecture</a> |
-  <a href="#development">Development</a> |
-  <a href="#license">License</a>
+  <a href="#development">Development</a>
 </p>
 
-![screenshot](https://img.example.firebolt.io/mcp-server-demo.gif)
-
 ## Key Features
 
-* **LLM Integration with Firebolt** - Connect your AI assistants directly to your data warehouse
-  - Enable AI agents to autonomously query your data and build analytics solutions
-  - Provide LLMs with specialized knowledge of Firebolt's capabilities and features
+**LLM Integration with Firebolt**
+
+- Connect your AI assistants directly to your data warehouse
+- Enable AI agents to autonomously query data and generate insights
+- Provide LLMs with deep knowledge of Firebolt SQL, features, and documentation
+
+**SQL Query Execution**
+
+- Support for multiple query types and execution modes
+- Direct access to Firebolt databases
+
+**Documentation Access**
+
+- Grant LLMs access to comprehensive Firebolt docs, SQL reference, function lists, and more
 
-* **SQL Query Execution**
-  - Direct query execution against Firebolt databases
-  - Support for multiple query types and execution modes
+**Account Management**
 
-* **Documentation Access**
-  - Comprehensive Firebolt documentation available to the LLM
-  - SQL reference, function reference, and more
+- Seamless authentication with Firebolt service accounts
+- Connect to different engines and workspaces
 
-* **Account Management**
-  - Connect to different accounts and engines
-  - Manage authentication seamlessly
+**Multi-platform Support**
 
-* **Multi-platform Support**
-  - Run on any platform supporting Go binaries
-  - Docker container support for easy deployment
+- Runs anywhere Go binaries are supported
+- Official Docker image available for easy deployment
 
 ## How To Use
 
-To get started with the Firebolt MCP Server, you'll need a Firebolt service account. If you don't have a Firebolt account yet, [sign up here](https://www.firebolt.io/signup).
+Before you start, ensure you have a Firebolt [service account](https://docs.firebolt.io/Guides/managing-your-organization/service-accounts.html) with a client ID and client secret.
 
-### Option 1: Use the Docker image
+### Installing the MCP Server
 
+You can run the Firebolt MCP Server either via Docker or by downloading the binary.
+
+#### Option 1: Run with Docker
+
+[//]: # (x-release-please-start-version)
 ```bash
-# Run with Docker
-docker run -p 8080:8080 \
+docker run \
+  --rm \
   -e FIREBOLT_MCP_CLIENT_ID=your-client-id \
   -e FIREBOLT_MCP_CLIENT_SECRET=your-client-secret \
-  -e FIREBOLT_MCP_TRANSPORT=sse \
-  firebolt/mcp-server:latest
+  ghcr.io/firebolt-db/mcp-server:0.2.0
 ```
+[//]: # (x-release-please-end)
 
-### Option 2: Download and run the binary
+#### Option 2: Run the Binary
 
+[//]: # (x-release-please-start-version)
 ```bash
-# Download the latest release for your platform from:
-# https://github.com/firebolt-db/mcp-server/releases
+# Download the binary for your OS from:
+# https://github.com/firebolt-db/mcp-server/releases/tag/v0.2.0
 
-# Run the server
 ./firebolt-mcp-server \
   --client-id your-client-id \
-  --client-secret your-client-secret \
-  --transport sse
+  --client-secret your-client-secret
 ```
+[//]: # (x-release-please-end)
 
-### Connecting your LLM
+### Connecting Your LLM
 
-Once the server is running, you can connect to it using any MCP-compatible client. For example:
+Once the MCP Server is installed, you can connect various LLM clients.
 
-```bash
-# Using the OpenAI API with MCP extension
-curl -X POST https://api.openai.com/v1/chat/completions \
-  -H "Content-Type: application/json" \
-  -H "Authorization: Bearer $OPENAI_API_KEY" \
-  -d '{
-    "model": "gpt-4",
-    "messages": [
-      {"role": "system", "content": "You are a data analyst working with Firebolt."},
-      {"role": "user", "content": "How many users did we have last month?"}
-    ],
-    "tools": [
-      {
-        "type": "mcp",
-        "mcp": {
-          "endpoint": "http://localhost:8080",
-          "auth": {
-            "type": "bearer",
-            "token": "YOUR_TOKEN"
+Below are integration examples for **Claude**.
+For other clients like **GitHub Copilot Chat** and **Cursor**, please refer to their official documentation.
+
+#### Claude Desktop
+
+To integrate with Claude Desktop using **Docker**:
+
+1. Open the Claude menu and select **Settings…**.
+2. Navigate to **Developer** > **Edit Config**.
+3. Update the configuration file (`claude_desktop_config.json`) to include:
+
+[//]: # (x-release-please-start-version)
+```json
+{
+  "mcpServers": {
+    "firebolt": {
+      "command": "docker",
+      "args": [
+        "run",
+        "-i",
+        "--rm",
+        "-e", "FIREBOLT_MCP_CLIENT_ID=your-client-id",
+        "-e", "FIREBOLT_MCP_CLIENT_SECRET=your-client-secret",
+        "ghcr.io/firebolt-db/mcp-server:0.2.0"
+      ]
+    }
+  }
+}
+```
+[//]: # (x-release-please-end)
+
+To use the **binary** instead of Docker:
+
+```json
+{
+  "mcpServers": {
+    "firebolt": {
+      "command": "/path/to/firebolt-mcp-server",
+      "env": {
+        "FIREBOLT_MCP_CLIENT_ID": "your-client-id",
+        "FIREBOLT_MCP_CLIENT_SECRET": "your-client-secret"
       }
     }
   }
-  ]
-}'
-```
+}
+```
+
+4. Save the config and restart Claude Desktop.
+
+More details: [Claude MCP Quickstart Guide](https://modelcontextprotocol.io/quickstart/user)
+
+#### GitHub Copilot Chat (VSCode)
+
+To integrate MCP with Copilot Chat in VSCode, refer to the official documentation:
 
-## Requirements
+👉 [Extending Copilot Chat with the Model Context Protocol](https://docs.github.com/en/copilot/customizing-copilot/extending-copilot-chat-with-mcp)
 
-- Firebolt service account credentials (client ID and client secret)
-- For development: Go 1.24.1 or later
-- For deployment: Docker (optional)
+#### Cursor Editor
+
+To set up MCP in Cursor, follow their guide:
+
+👉 [Cursor Documentation on Model Context Protocol](https://docs.cursor.com/context/model-context-protocol)
 
 ## Architecture
 
-The Firebolt MCP Server implements the [Model Context Protocol](https://github.com/anthropics/anthropic-cookbook/tree/main/model_context_protocol) specification, providing:
+Firebolt MCP Server implements the [Model Context Protocol](https://modelcontextprotocol.io/introduction), providing:
 
 1. **Tools** - Task-specific capabilities provided to the LLM:
-   - `Connect`: Establish connections to Firebolt engines and databases
-   - `Docs`: Access Firebolt documentation
-   - `Query`: Execute SQL queries against Firebolt
+   - `firebolt_docs`: Access Firebolt documentation
+   - `firebolt_connect`: Establish connections to Firebolt engines and databases
+   - `firebolt_query`: Execute SQL queries against Firebolt
 
 2. **Resources** - Data that can be referenced by the LLM:
-   - Documentation articles
-   - Account information
-   - Database schema
-   - Engine statistics
+   - Documentation articles
+   - Lists of Accounts, Databases, Engines
 
 3. **Prompts** - Predefined instructions for the LLM:
-   - Firebolt Expert: Prompts the model to act as a Firebolt specialist
+   - Firebolt Expert: Prompts the model to act as a Firebolt specialist
 
 ## Development
 
````
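To make the tool names in the Architecture section concrete, here is a sketch of the JSON-RPC 2.0 messages an MCP client sends to such a server over the stdio transport: an `initialize` handshake followed by a `tools/call` on `firebolt_query`. The tool name comes from the README; the protocol version string and the `"query"` argument key are assumptions for illustration.

```python
import itertools
import json

_ids = itertools.count(1)

def jsonrpc_request(method: str, params: dict) -> str:
    """Serialize one JSON-RPC 2.0 request as a single line; the MCP stdio
    transport exchanges newline-delimited JSON messages."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": method,
        "params": params,
    })

# 1. Handshake: the client introduces itself and negotiates capabilities.
init = jsonrpc_request("initialize", {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "example-client", "version": "0.0.1"},
})

# 2. Invoke the firebolt_query tool (the argument key is illustrative).
call = jsonrpc_request("tools/call", {
    "name": "firebolt_query",
    "arguments": {"query": "SELECT 1"},
})

print(init)
print(call)
```

In practice a client such as Claude Desktop performs this exchange for you; the sketch only shows the wire format the configuration above sets up.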

````diff
@@ -159,26 +180,7 @@ task mod
 
 # Build the application
 task build
-```
-
-### Running tests
 
-```bash
-go test ./...
-```
-
-### Building Docker image
-
-```bash
-docker build -t firebolt-mcp-server .
+# Run the tests
+task test
 ```
-
-## License
-
-MIT
-
----
-
-> [firebolt.io](https://www.firebolt.io) &nbsp;&middot;&nbsp;
-> GitHub [@firebolt-db](https://github.com/firebolt-db) &nbsp;&middot;&nbsp;
-> Twitter [@FireboltDB](https://twitter.com/FireboltDB)
````
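The Development section drives builds through [Task](https://taskfile.dev), a Make-like runner configured in YAML. For orientation, a minimal `Taskfile.yml` sketch defining the targets used in the snippet above (hypothetical; the repository's actual Taskfile likely defines more):

```yaml
version: '3'

tasks:
  mod:
    desc: Download Go module dependencies
    cmds:
      - go mod download
  build:
    desc: Build the application binary
    cmds:
      - go build -o firebolt-mcp-server .
  test:
    desc: Run the tests
    cmds:
      - go test ./...
```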
