Commit 364b242: Ollama/MCPHost example
1 parent 6aafa9e

1 file changed: docs/use-cases/AI_ML/MCP/ollama.md (+185, −0)

---
slug: /use-cases/AI/MCP/ollama
sidebar_label: 'Ollama and ClickHouse MCP'
title: 'Set Up ClickHouse MCP Server with Ollama'
pagination_prev: null
pagination_next: null
description: 'This guide explains how to set up Ollama with a ClickHouse MCP server.'
keywords: ['AI', 'Ollama', 'MCP']
show_related_blogs: true
---

import {CardHorizontal} from '@clickhouse/click-ui/bundled'
import Link from '@docusaurus/Link';
import Image from '@theme/IdealImage';

# Using ClickHouse MCP server with Ollama

> This guide explains how to use the ClickHouse MCP Server with Ollama.

<VerticalStepper headerLevel="h2" />

## Install Ollama {#install-ollama}

Ollama is a tool for running Large Language Models (LLMs) on your own machine.
It has a [wide range of models available](https://ollama.com/library) and is easy to use.

You can download Ollama for Mac, Windows, or Linux from the [download page](https://ollama.com/download).

Once you run Ollama, it starts a local server in the background that you can use to run models.
Alternatively, you can run the server manually with `ollama serve`.
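
If you want to confirm that the background server is running, you can query Ollama's local HTTP API, which listens on port 11434 by default:

```bash
# A running Ollama server responds with "Ollama is running".
curl http://localhost:11434

# List the models the server currently has available (returns JSON).
curl http://localhost:11434/api/tags
```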

Once installed, you can pull a model down to your machine like this:

```bash
ollama pull qwen3:8b
```

This will pull the model to your local machine if it is not present.
Once it's downloaded, you can run the model like this:

```bash
ollama run qwen3:8b
```

:::note
Only [models that have tool support](https://ollama.com/search?c=tools) will work with MCP Servers.
:::
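
You can also check whether a model you've already pulled supports tools by inspecting it locally. Recent versions of Ollama list a model's capabilities in the output of `ollama show`, so look for `tools` there:

```bash
# Show the model's metadata; recent Ollama releases include a
# capabilities section (it should list "tools" for tool-capable models).
ollama show qwen3:8b
```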

We can list the models that we have downloaded like this:

```bash
ollama ls
```

```text
NAME            ID              SIZE      MODIFIED
qwen3:latest    500a1f067a9f    5.2 GB    3 days ago
```

## Install MCPHost {#install-mcphost}

At the time of writing (July 2025), there is no native functionality for using Ollama with MCP Servers.
However, we can use [MCPHost](https://github.com/mark3labs/mcphost) to run Ollama models with MCP Servers.

MCPHost is a Go application, so you'll need to make sure that you have [Go installed](https://go.dev/doc/install) on your machine.
You can then install MCPHost by running the following command:

```bash
go install github.com/mark3labs/mcphost@latest
```

The binary will be installed under `~/go/bin`, so we need to make sure that directory is on our `PATH`, as shown below.
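
For example, you can add it to your `PATH` for the current shell session and check that the binary is found (append the `export` line to your shell profile to make it permanent):

```bash
# Go installs binaries under ~/go/bin by default; put that on the PATH.
export PATH="$PATH:$HOME/go/bin"

# Confirm that the mcphost binary is now resolvable.
command -v mcphost
```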

## Configuring ClickHouse MCP Server {#configure-clickhouse-mcp-server}

We can configure MCP Servers with MCPHost in YAML or JSON files.
MCPHost will look for config files in your home directory in the following order:

1. `.mcphost.yml` or `.mcphost.json` (preferred)
2. `.mcp.yml` or `.mcp.json` (backwards compatibility)

It uses a syntax that's similar to the one used in the standard MCP configuration file.
Here's an example of a ClickHouse MCP server configuration, which we'll save to the `~/.mcphost.json` file:

```json
{
  "mcpServers": {
    "mcp-ch": {
      "type": "local",
      "command": [
        "uv",
        "run",
        "--with",
        "mcp-clickhouse",
        "--python",
        "3.10",
        "mcp-clickhouse"
      ]
    }
  }
}
```
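
If you'd rather use the YAML variant, a rough sketch of the equivalent `~/.mcphost.yml` is below; this assumes the YAML keys mirror the JSON structure above:

```bash
# Write the equivalent YAML config (assumed to mirror the JSON keys above).
cat > ~/.mcphost.yml <<'EOF'
mcpServers:
  mcp-ch:
    type: local
    command:
      - uv
      - run
      - --with
      - mcp-clickhouse
      - --python
      - "3.10"
      - mcp-clickhouse
EOF
```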

The main difference from the standard MCP configuration file is that we need to specify a `type`.
The type indicates the transport used by the MCP Server:

* `local` → stdio transport
* `remote` → streamable transport
* `builtin` → inprocess transport

We'll also need to configure the following environment variables, which point the MCP Server at the ClickHouse SQL playground:

```bash
export CLICKHOUSE_HOST=sql-clickhouse.clickhouse.com
export CLICKHOUSE_USER=demo
export CLICKHOUSE_PASSWORD=""
```

:::note
In theory, you should be able to provide these variables under the `environment` key in the MCP configuration file, but we've found that this doesn't work.
:::

## Running MCPHost {#running-mcphost}

Once you've configured the ClickHouse MCP server, you can run MCPHost with the following command:

```bash
mcphost --model ollama:qwen3
```

Or, if you want it to use a specific config file:

```bash
mcphost --model ollama:qwen3 --config ~/.mcphost.json
```
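
If you launch this often, you might wrap the environment variables and the command in a small script; this is just a convenience sketch that reuses the values from above:

```bash
#!/usr/bin/env bash
# Convenience launcher: point the ClickHouse MCP server at the SQL playground
# and start MCPHost with the qwen3 model and our config file.
set -euo pipefail

export CLICKHOUSE_HOST=sql-clickhouse.clickhouse.com
export CLICKHOUSE_USER=demo
export CLICKHOUSE_PASSWORD=""

exec mcphost --model ollama:qwen3 --config ~/.mcphost.json
```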

:::warning
If you don't provide `--model`, MCPHost will look in the environment variables for `ANTHROPIC_API_KEY` and will use the `anthropic:claude-sonnet-4-20250514` model.
:::

We should see the following output:

```text
┃                                          ┃
┃  Model loaded: ollama (qwen3)            ┃
┃                  MCPHost System (09:52)  ┃
┃                                          ┃

┃                                          ┃
┃  Model loaded successfully on GPU        ┃
┃                  MCPHost System (09:52)  ┃
┃                                          ┃

┃                                          ┃
┃  Loaded 3 tools from MCP servers         ┃
┃                  MCPHost System (09:52)  ┃
┃                                          ┃

Enter your prompt (Type /help for commands, Ctrl+C to quit, ESC to cancel generation)
```

We can use the `/servers` command to list the MCP Servers:

```text
┃                                          ┃
┃  ## Configured MCP Servers               ┃
┃                                          ┃
┃  1. mcp-ch                               ┃
┃                  MCPHost System (10:00)  ┃
```

And `/tools` to list the tools available:

```text
┃  ## Available Tools                      ┃
┃                                          ┃
┃  1. mcp-ch__list_databases               ┃
┃  2. mcp-ch__list_tables                  ┃
┃  3. mcp-ch__run_select_query             ┃
```

We can then ask the model questions about the databases and tables available in the ClickHouse SQL playground.

In our experience, when using smaller models (the default qwen3 model has 8 billion parameters), you'll need to be more specific about what you'd like it to do.
For example, you'll need to explicitly ask it to list the databases and tables rather than straight away asking it to query a certain table.
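
A prompt progression like the following (purely illustrative; the database and table names are placeholders you'd replace with ones returned by the earlier steps) tends to work better than a single open-ended question:

```text
> list the databases that are available
> list the tables in the <database> database
> run a query that counts the number of rows in the <table> table
```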
