
Commit 088cc1b

committed
add LibreChat MCP guide
1 parent 3fbd0b4 commit 088cc1b

4 files changed: +258 −1 lines changed

docs/use-cases/AI_ML/MCP/librechat.md

Lines changed: 244 additions & 0 deletions
@@ -0,0 +1,244 @@
---
slug: /use-cases/AI/MCP/librechat
sidebar_label: 'LibreChat and ClickHouse MCP'
title: 'Set Up ClickHouse MCP Server with LibreChat and ClickHouse Cloud'
pagination_prev: null
pagination_next: null
description: 'This guide explains how to set up LibreChat with a ClickHouse MCP server using Docker.'
keywords: ['AI', 'Librechat', 'MCP']
show_related_blogs: true
---

import {CardHorizontal} from '@clickhouse/click-ui/bundled'
import Link from '@docusaurus/Link';
import Image from '@theme/IdealImage';
import LibreInterface from '@site/static/images/use-cases/AI_ML/MCP/librechat.png';

# Using ClickHouse MCP server with LibreChat

> This guide explains how to set up LibreChat with a ClickHouse MCP server using Docker
> and connect it to the ClickHouse example datasets.

<VerticalStepper headerLevel="h2">
## Install Docker {#install-docker}

You will need Docker to run LibreChat and the MCP server. To get Docker:

1. Visit [docker.com](https://www.docker.com/products/docker-desktop)
2. Download Docker Desktop for your operating system
3. Install Docker by following the instructions for your operating system
4. Open Docker Desktop and ensure it is running
<br/>
For more information, see the [Docker documentation](https://docs.docker.com/get-docker/).
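To confirm that Docker and the Compose plugin are available before continuing, you can run:

```bash
# Both commands should print a version number if Docker Desktop is running
docker --version
docker compose version
```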
## Clone LibreChat repository {#clone-librechat-repo}

Open a terminal (Command Prompt, Terminal, or PowerShell) and clone the
LibreChat repository using the following command:

```bash
git clone https://github.com/danny-avila/LibreChat.git
cd LibreChat
```
## Create and edit the .env file {#create-and-edit-env-file}

Copy the example configuration file from `.env.example` to `.env`:

```bash
cp .env.example .env
```

Open the `.env` file in your favorite text editor. You will see sections for
many popular LLM providers, including OpenAI, Anthropic, and AWS Bedrock, for
example:

```text title=".env"
#============#
# Anthropic #
#============#
#highlight-next-line
ANTHROPIC_API_KEY=user_provided
# ANTHROPIC_MODELS=claude-opus-4-20250514,claude-sonnet-4-20250514,claude-3-7-sonnet-20250219,claude-3-5-sonnet-20241022,claude-3-5-haiku-20241022,claude-3-opus-20240229,claude-3-sonnet-20240229,claude-3-haiku-20240307
# ANTHROPIC_REVERSE_PROXY=
```

Replace `user_provided` with your API key for the LLM provider you want to use,
making sure it is quoted, e.g. `"A2bC3d4e5f6g7h8i9j0k1l2m3n4o5p6q7r8s9t0u1v2w3x4y5z"`.
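For example, using the placeholder key above, the edited Anthropic line would look like this (substitute your real key):

```text title=".env"
ANTHROPIC_API_KEY="A2bC3d4e5f6g7h8i9j0k1l2m3n4o5p6q7r8s9t0u1v2w3x4y5z"
```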
:::note Using a local LLM
If you don't have an API key, you can use a local LLM like Ollama. You'll see how
to do this later in the ["Install Ollama"](#add-local-llm-using-ollama) step. For now,
don't modify the `.env` file and continue with the next steps.
:::
## Create a librechat.yaml file {#create-librechat-yaml-file}

Run the following command to create a new `librechat.yaml` file:

```bash
cp librechat.example.yaml librechat.yaml
```

This creates the main [configuration file](https://www.librechat.ai/docs/configuration/librechat_yaml) for LibreChat.
## Add ClickHouse MCP server to Docker compose {#add-clickhouse-mcp-server-to-docker-compose}

Next we'll add the ClickHouse MCP server to the LibreChat Docker Compose file,
`docker-compose.yml`, so that the LLM can interact with the
[ClickHouse SQL playground](https://sql.clickhouse.com/).

Find the `services` section in the `docker-compose.yml` file and add
`mcp-clickhouse` as a new service with the following configuration:

```yml title="docker-compose.yml"
  mcp-clickhouse:
    image: mcp/clickhouse
    container_name: mcp-clickhouse
    ports:
      - 8001:8000
    extra_hosts:
      - "host.docker.internal:host-gateway"
    environment:
      - CLICKHOUSE_HOST=sql-clickhouse.clickhouse.com
      - CLICKHOUSE_USER=demo
      - CLICKHOUSE_PASSWORD=
      - CLICKHOUSE_MCP_SERVER_TRANSPORT=sse
      - CLICKHOUSE_MCP_BIND_HOST=0.0.0.0
```

If you want to explore your own data, you can do so by
using the [host, username and password](https://clickhouse.com/docs/getting-started/quick-start/cloud#connect-with-your-app)
of your own ClickHouse Cloud service.
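A minimal sketch of what the `environment` section might look like for your own service (the host, user, and password below are placeholders, not real credentials):

```yml title="docker-compose.yml"
    environment:
      # Replace with the connection details of your ClickHouse Cloud service
      - CLICKHOUSE_HOST=your-service.clickhouse.cloud
      - CLICKHOUSE_USER=default
      - CLICKHOUSE_PASSWORD=your_password
      - CLICKHOUSE_MCP_SERVER_TRANSPORT=sse
      - CLICKHOUSE_MCP_BIND_HOST=0.0.0.0
```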
<Link to="https://cloud.clickhouse.com/">
  <CardHorizontal
    badgeIcon="cloud"
    badgeIconDir=""
    badgeState="default"
    badgeText=""
    description="
    If you don't have a Cloud account yet, get started with ClickHouse Cloud today and
    receive $300 in credits. At the end of your 30-day free trial, continue with a
    pay-as-you-go plan, or contact us to learn more about our volume-based discounts.
    Visit our pricing page for details.
    "
    icon="cloud"
    infoText=""
    infoUrl=""
    title="Get started with ClickHouse Cloud"
    isSelected={true}
  />
</Link>
## Mount the librechat.yaml file {#mount-librechat-yaml-file}

In `docker-compose.yml`, find the `api` service:

```yml title="docker-compose.yml"
services:
  #highlight-next-line
  api:
    container_name: LibreChat
    ports:
      - "${PORT}:${PORT}"

    #highlight-next-line
    volumes:
      - type: bind
        source: ./.env
        target: /app/.env
      - ./images:/app/client/public/images
      - ./uploads:/app/uploads
      - ./logs:/app/api/logs
```

Under the `volumes` section, add the following line:

```yml
      - ./librechat.yaml:/librechat/librechat.yaml
```

This will make the configuration file available to the backend service.
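After this change, the `volumes` section of the `api` service should look similar to this:

```yml title="docker-compose.yml"
    volumes:
      - type: bind
        source: ./.env
        target: /app/.env
      - ./images:/app/client/public/images
      - ./uploads:/app/uploads
      - ./logs:/app/api/logs
      #highlight-next-line
      - ./librechat.yaml:/librechat/librechat.yaml
```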
## Configure MCP server in librechat.yaml {#configure-mcp-server-in-librechat-yaml}

Open `librechat.yaml` and place the following configuration at the end of the file:

```yml
mcpServers:
  clickhouse-playground:
    type: sse
    url: http://host.docker.internal:8001/sse
```

This configures LibreChat to connect to the MCP server running in Docker.
## Add a local LLM using Ollama (optional) {#add-local-llm-using-ollama}

### Install Ollama {#install-ollama}

Go to the [Ollama website](https://ollama.com/download) and install Ollama for your system.

Once installed, you can run a model like this:

```bash
ollama run qwen3:32b
```

This will pull the model to your local machine if it is not already present.

For a list of available models, see the [Ollama library](https://ollama.com/library).
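You can check which models have already been downloaded, or fetch the model without starting an interactive chat session:

```bash
# List models available locally
ollama list

# Download the model without opening a chat session
ollama pull qwen3:32b
```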
### Configure Ollama in librechat.yaml {#configure-ollama-in-librechat-yaml}

Once the model has downloaded, configure it in `librechat.yaml`:

```yml title="librechat.yaml"
custom:
  - name: "Ollama"
    apiKey: "ollama"
    baseURL: "http://host.docker.internal:11434/v1/"
    models:
      default:
        [
          "qwen3:32b"
        ]
      fetch: false
    titleConvo: true
    titleModel: "current_model"
    summarize: false
    summaryModel: "current_model"
    forcePrompt: false
    modelDisplayLabel: "Ollama"
```
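Depending on the version of the `librechat.yaml` schema you are using, custom providers may need to sit under the top-level `endpoints` key rather than under a bare `custom` key. If LibreChat does not pick up the Ollama endpoint, compare with the `endpoints` section already present in `librechat.example.yaml` and nest the block accordingly; a sketch of that nested form (same values, only the structure differs) looks like this:

```yml title="librechat.yaml"
# Assumed nesting; verify against the endpoints section in librechat.example.yaml
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"
      baseURL: "http://host.docker.internal:11434/v1/"
      models:
        default: ["qwen3:32b"]
        fetch: false
      titleConvo: true
      titleModel: "current_model"
      modelDisplayLabel: "Ollama"
```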
## Start all services {#start-all-services}

From the root of the LibreChat project folder, run the following command to start the services:

```bash
docker compose -f docker-compose.yml up
```

Wait until all services are fully running.
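To check on the stack from another terminal, you can list the container status and probe the MCP server's SSE endpoint on the mapped port (the exact response body will vary):

```bash
# Show the state of the LibreChat and mcp-clickhouse containers
docker compose -f docker-compose.yml ps

# The MCP server should accept connections on the mapped port 8001
# (this streams server-sent events; press Ctrl+C to stop)
curl -N http://localhost:8001/sse
```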
## Open LibreChat in your browser {#open-librechat-in-browser}

Once all services are up and running, open your browser and go to `http://localhost:3080/`.

Create a free LibreChat account if you don't yet have one, and sign in. You should
now see the LibreChat interface connected to the ClickHouse MCP server and, optionally,
your local LLM.

From the chat interface, select `clickhouse-playground` as your MCP server:

<Image img={LibreInterface} alt="Select your MCP server" size="md"/>

You can now prompt the LLM to explore the ClickHouse example datasets. Give it a go:

```text title="Prompt"
What datasets do you have access to?
```

</VerticalStepper>

docs/use-cases/AI_ML/index.md

Whitespace-only changes.

sidebars.js

Lines changed: 14 additions & 1 deletion
@@ -169,7 +169,20 @@ const sidebars = {
       "use-cases/data_lake/glue_catalog",
       "use-cases/data_lake/unity_catalog"
     ]
-  }
+  },
+  {
+    type: "category",
+    label: "AI/ML",
+    collapsed: true,
+    collapsible: true,
+    link: { type: "doc", id: "use-cases/AI_ML/index" },
+    items: [
+      {
+        type: "autogenerated",
+        dirName: "use-cases/AI_ML",
+      }
+    ]
+  },
   ]
 },
 {
static/images/use-cases/AI_ML/MCP/librechat.png

Binary image file added (69 KB).
