Commit e5a124a

Merge pull request #4034 from Blargian/mcp_docs
Docs: add LibreChat MCP guide
2 parents b1a3289 + d50b755 commit e5a124a

8 files changed: +336 -69 lines changed

.gitignore

Lines changed: 1 addition & 0 deletions
@@ -62,6 +62,7 @@ docs/getting-started/index.md
 docs/data-modeling/projections/index.md
 docs/cloud/manage/jan2025_faq/index.md
 docs/chdb/guides/index.md
+docs/use-cases/AI_ML/index.md
 
 .vscode
 .aspell.en.prepl

docs/use-cases/AI_ML/MCP/librechat.md

Lines changed: 228 additions & 0 deletions
@@ -0,0 +1,228 @@
---
slug: /use-cases/AI/MCP/librechat
sidebar_label: 'LibreChat and ClickHouse MCP'
title: 'Set Up ClickHouse MCP Server with LibreChat and ClickHouse Cloud'
pagination_prev: null
pagination_next: null
description: 'This guide explains how to set up LibreChat with a ClickHouse MCP server using Docker.'
keywords: ['AI', 'Librechat', 'MCP']
show_related_blogs: true
---

import {CardHorizontal} from '@clickhouse/click-ui/bundled'
import Link from '@docusaurus/Link';
import Image from '@theme/IdealImage';
import LibreInterface from '@site/static/images/use-cases/AI_ML/MCP/librechat.png';

# Using ClickHouse MCP server with LibreChat

> This guide explains how to set up LibreChat with a ClickHouse MCP server using Docker
> and connect it to the ClickHouse example datasets.

<VerticalStepper headerLevel="h2">

## Install Docker {#install-docker}

You will need Docker to run LibreChat and the MCP server. To get Docker:
1. Visit [docker.com](https://www.docker.com/products/docker-desktop)
2. Download Docker Desktop for your operating system
3. Install Docker by following the instructions for your operating system
4. Open Docker Desktop and ensure it is running
<br/>
For more information, see the [Docker documentation](https://docs.docker.com/get-docker/).
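
To confirm that Docker is installed and the daemon is running, you can run a quick check
from your terminal:

```bash
# Prints the installed Docker client version
docker --version
# Errors out if the Docker daemon is not running
docker info
```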

## Clone the LibreChat repository {#clone-librechat-repo}

Open a terminal (Command Prompt, Terminal, or PowerShell) and clone the
LibreChat repository using the following command:

```bash
git clone https://github.com/danny-avila/LibreChat.git
cd LibreChat
```

## Create and edit the .env file {#create-and-edit-env-file}

Copy the example configuration file from `.env.example` to `.env`:

```bash
cp .env.example .env
```

Open the `.env` file in your favorite text editor. You will see sections for
many popular LLM providers, including OpenAI, Anthropic, AWS Bedrock, and others.
For example:

```text title=".env"
#============#
# Anthropic  #
#============#
#highlight-next-line
ANTHROPIC_API_KEY=user_provided
# ANTHROPIC_MODELS=claude-opus-4-20250514,claude-sonnet-4-20250514,claude-3-7-sonnet-20250219,claude-3-5-sonnet-20241022,claude-3-5-haiku-20241022,claude-3-opus-20240229,claude-3-sonnet-20240229,claude-3-haiku-20240307
# ANTHROPIC_REVERSE_PROXY=
```

Replace `user_provided` with your API key for the LLM provider you want to use.
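
For example, for Anthropic the edited line might look something like this
(the value shown is just a placeholder for your own key):

```text title=".env"
ANTHROPIC_API_KEY=<your-anthropic-api-key>
```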

:::note Using a local LLM
If you don't have an API key, you can use a local LLM like Ollama. You'll see how
to do this later in the ["Add a local LLM using Ollama"](#add-local-llm-using-ollama) step. For now,
don't modify the `.env` file and continue with the next steps.
:::

## Create a librechat.yaml file {#create-librechat-yaml-file}

Run the following command to create a new `librechat.yaml` file:

```bash
cp librechat.example.yaml librechat.yaml
```

This creates the main [configuration file](https://www.librechat.ai/docs/configuration/librechat_yaml) for LibreChat.

## Add ClickHouse MCP server to Docker Compose {#add-clickhouse-mcp-server-to-docker-compose}

Next we'll add the ClickHouse MCP server to the LibreChat Docker Compose file
so that the LLM can interact with the
[ClickHouse SQL playground](https://sql.clickhouse.com/).

Create a file called `docker-compose.override.yml` and add the following configuration to it:

```yml title="docker-compose.override.yml"
services:
  api:
    volumes:
      - ./librechat.yaml:/app/librechat.yaml
  mcp-clickhouse:
    image: mcp/clickhouse
    container_name: mcp-clickhouse
    ports:
      - 8001:8000
    extra_hosts:
      - "host.docker.internal:host-gateway"
    environment:
      - CLICKHOUSE_HOST=sql-clickhouse.clickhouse.com
      - CLICKHOUSE_USER=demo
      - CLICKHOUSE_PASSWORD=
      - CLICKHOUSE_MCP_SERVER_TRANSPORT=sse
      - CLICKHOUSE_MCP_BIND_HOST=0.0.0.0
```

If you want to explore your own data, you can do so by
using the [host, username and password](https://clickhouse.com/docs/getting-started/quick-start/cloud#connect-with-your-app)
of your own ClickHouse Cloud service.
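
In that case, the `environment` section of the `mcp-clickhouse` service might look
something like the sketch below, where the host, user and password values are
placeholders for your own service's connection details:

```yml title="docker-compose.override.yml"
    environment:
      # Placeholders - use the connection details shown for your ClickHouse Cloud service
      - CLICKHOUSE_HOST=<your-service-hostname>.clickhouse.cloud
      - CLICKHOUSE_USER=default
      - CLICKHOUSE_PASSWORD=<your-password>
      - CLICKHOUSE_MCP_SERVER_TRANSPORT=sse
      - CLICKHOUSE_MCP_BIND_HOST=0.0.0.0
```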

<Link to="https://cloud.clickhouse.com/">
  <CardHorizontal
    badgeIcon="cloud"
    badgeIconDir=""
    badgeState="default"
    badgeText=""
    description="
      If you don't have a Cloud account yet, get started with ClickHouse Cloud today and
      receive $300 in credits. At the end of your 30-day free trial, continue with a
      pay-as-you-go plan, or contact us to learn more about our volume-based discounts.
      Visit our pricing page for details.
    "
    icon="cloud"
    infoText=""
    infoUrl=""
    title="Get started with ClickHouse Cloud"
    isSelected={true}
  />
</Link>

## Configure MCP server in librechat.yaml {#configure-mcp-server-in-librechat-yaml}

Open `librechat.yaml` and place the following configuration at the end of the file:

```yml
mcpServers:
  clickhouse-playground:
    type: sse
    url: http://host.docker.internal:8001/sse
```

This configures LibreChat to connect to the MCP server running in Docker.

Find the following line:

```text title="librechat.yaml"
socialLogins: ['github', 'google', 'discord', 'openid', 'facebook', 'apple', 'saml']
```

For simplicity, we'll disable the social login options for now:

```text title="librechat.yaml"
socialLogins: []
```

## Add a local LLM using Ollama (optional) {#add-local-llm-using-ollama}

### Install Ollama {#install-ollama}

Go to the [Ollama website](https://ollama.com/download) and install Ollama for your system.

Once installed, you can run a model like this:

```bash
ollama run qwen3:32b
```

This will pull the model to your local machine if it is not already present.

For a list of available models, see the [Ollama library](https://ollama.com/library).
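
To confirm the model finished downloading, you can list the models available locally:

```bash
# Shows locally available models - qwen3:32b should appear once the pull completes
ollama list
```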

### Configure Ollama in librechat.yaml {#configure-ollama-in-librechat-yaml}

Once the model has downloaded, configure it in `librechat.yaml`:

```text title="librechat.yaml"
custom:
  - name: "Ollama"
    apiKey: "ollama"
    baseURL: "http://host.docker.internal:11434/v1/"
    models:
      default:
        [
          "qwen3:32b"
        ]
      fetch: false
    titleConvo: true
    titleModel: "current_model"
    summarize: false
    summaryModel: "current_model"
    forcePrompt: false
    modelDisplayLabel: "Ollama"
```

## Start all services {#start-all-services}

From the root of the LibreChat project folder, run the following command to start the services:

```bash
docker compose up
```

Wait until all services are fully running.
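
If you want to check on the services, a couple of optional commands can help. Run them from
a second terminal in the same folder; the second check assumes the `8001:8000` port mapping
from the Compose override above:

```bash
# Lists the Compose services and their current status
docker compose ps
# The MCP server's SSE endpoint should accept the connection and keep streaming (Ctrl+C to stop)
curl -N http://localhost:8001/sse
```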

## Open LibreChat in your browser {#open-librechat-in-browser}

Once all services are up and running, open your browser and go to `http://localhost:3080/`.

Create a free LibreChat account if you don't yet have one, and sign in. You should
now see the LibreChat interface connected to the ClickHouse MCP server, and optionally,
your local LLM.

From the chat interface, select `clickhouse-playground` as your MCP server:

<Image img={LibreInterface} alt="Select your MCP server" size="md"/>

You can now prompt the LLM to explore the ClickHouse example datasets. Give it a go:

```text title="Prompt"
What datasets do you have access to?
```

</VerticalStepper>

docs/use-cases/AI_ML/index.md

Lines changed: 18 additions & 0 deletions
@@ -0,0 +1,18 @@
---
description: 'Landing page for Machine learning and GenAI use case guides'
pagination_prev: null
pagination_next: null
slug: /use-cases/AI
title: 'Machine learning and GenAI'
keywords: ['machine learning', 'genAI', 'AI']
---

# Machine learning and GenAI

ClickHouse is ideally suited as a real-time database to power Machine Learning workloads.
With ClickHouse, it's easier than ever to unleash GenAI on your analytics data.
In this section you'll find guides on how ClickHouse is used for
Machine Learning and GenAI.

<!--AUTOGENERATED_START-->
<!--AUTOGENERATED_END-->

scripts/aspell-dict-file.txt

Lines changed: 2 additions & 0 deletions
@@ -1059,3 +1059,5 @@ microsoft
 microsoft
 --docs/use-cases/observability/clickstack/migration/elastic/migrating-data.md--
 clickstack
+--docs/use-cases/AI_ML/MCP/librechat.md--
+librechat

0 commit comments
