
Commit b911ec2

Add example AI agent integration guides
1 parent 23157f2 commit b911ec2

19 files changed

+1596 −1 lines changed

.gitignore

Lines changed: 1 addition & 1 deletion

```diff
@@ -64,9 +64,9 @@ docs/cloud/manage/jan2025_faq/index.md
 docs/chdb/guides/index.md
 docs/use-cases/AI_ML/index.md
 docs/use-cases/AI_ML/MCP/index.md
+docs/use-cases/AI_ML/MCP/ai_agent_libraries/index.md
 docs/integrations/data-ingestion/clickpipes/kafka/index.md
 
-
 .vscode
 .aspell.en.prepl
 *.md.bak
```
File renamed without changes.
File renamed without changes.
File renamed without changes.
File renamed without changes.
Lines changed: 10 additions & 0 deletions

@@ -0,0 +1,10 @@

```json
{
    "position": 5,
    "label": "Integrate AI agent libraries",
    "collapsible": true,
    "collapsed": true,
    "link": {
        "type": "doc",
        "id": "use-cases/AI_ML/MCP/ai_agent_libraries/index"
    }
}
```
Lines changed: 139 additions & 0 deletions
@@ -0,0 +1,139 @@
---
slug: /use-cases/AI/MCP/ai-agent-libraries/agno
sidebar_label: 'Integrate Agno'
title: 'How to build an AI Agent with Agno and the ClickHouse MCP Server'
pagination_prev: null
pagination_next: null
description: 'Learn how to build an AI Agent with Agno and the ClickHouse MCP Server'
keywords: ['ClickHouse', 'MCP', 'Agno']
show_related_blogs: true
---

# How to build an AI Agent with Agno and the ClickHouse MCP Server

In this guide you'll learn how to build an [Agno](https://github.com/agno-agi/agno) AI agent that can interact with
[ClickHouse's SQL playground](https://sql.clickhouse.com/) using [ClickHouse's MCP Server](https://github.com/ClickHouse/mcp-clickhouse).

:::note Example notebook
This example can be found as a notebook in the [examples repository](https://github.com/ClickHouse/examples/blob/main/ai/mcp/agno/agno.ipynb).
:::

## Prerequisites {#prerequisites}
- You'll need to have Python installed on your system.
- You'll need to have `pip` installed on your system.
- You'll need an Anthropic API key, or an API key from another LLM provider.

You can run the following steps either from your Python REPL or via a script.

<VerticalStepper headerLevel="h2">

## Install libraries {#install-libraries}

Install the Agno library by running the following commands:

```python
!pip install -q --upgrade pip
!pip install -q agno
!pip install -q ipywidgets
```

## Setup credentials {#setup-credentials}

Next, you'll need to provide your Anthropic API key:

```python
import os, getpass
os.environ["ANTHROPIC_API_KEY"] = getpass.getpass("Enter Anthropic API Key:")
```

```response title="Response"
Enter Anthropic API Key: ········
```

:::note Using another LLM provider
If you don't have an Anthropic API key and want to use another LLM provider,
you can find the instructions for setting up your credentials in the [DSPy docs](https://dspy.ai/#__tabbed_1_1)
:::
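
As an illustrative sketch only (not part of the original notebook), swapping providers typically means setting that provider's API key and passing a different model class to the agent later on. The example below assumes Agno's `OpenAIChat` model class and an OpenAI API key:

```python
import os, getpass

# Hypothetical alternative to Anthropic: use OpenAI via Agno's OpenAIChat model class.
from agno.models.openai import OpenAIChat

os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter OpenAI API Key:")

# Later, when constructing the agent, pass the alternative model instead of Claude:
# agent = Agent(model=OpenAIChat(id="gpt-4o"), markdown=True, tools=[mcp_tools])
```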

Next, define the credentials needed to connect to the ClickHouse SQL playground:

```python
env = {
    "CLICKHOUSE_HOST": "sql-clickhouse.clickhouse.com",
    "CLICKHOUSE_PORT": "8443",
    "CLICKHOUSE_USER": "demo",
    "CLICKHOUSE_PASSWORD": "",
    "CLICKHOUSE_SECURE": "true"
}
```

## Initialize MCP Server and Agno agent {#initialize-mcp-and-agent}

Now configure the ClickHouse MCP Server to point at the ClickHouse SQL playground,
initialize the Agno agent, and ask it a question:

```python
from agno.agent import Agent
from agno.tools.mcp import MCPTools
from agno.models.anthropic import Claude
```

```python
async with MCPTools(command="uv run --with mcp-clickhouse --python 3.13 mcp-clickhouse", env=env, timeout_seconds=60) as mcp_tools:
    agent = Agent(
        model=Claude(id="claude-3-5-sonnet-20240620"),
        markdown=True,
        tools=[mcp_tools]
    )
    await agent.aprint_response("What's the most starred project in 2025?", stream=True)
```

```response title="Response"
▰▱▱▱▱▱▱ Thinking...

━━ Message ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

What's the most starred project in 2025?

━━ Tool Calls ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

• list_tables(database=github, like=%)
• run_select_query(query=SELECT
      repo_name,
      SUM(count) AS stars_2025
    FROM github.repo_events_per_day
    WHERE event_type = 'WatchEvent'
      AND created_at >= '2025-01-01'
      AND created_at < '2026-01-01'
    GROUP BY repo_name
    ORDER BY stars_2025 DESC
    LIMIT 1)

━━ Response (34.9s) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

To answer your question about the most starred project in 2025, I'll need to query the ClickHouse database.
However, before I can do that, I need to gather some information and make sure we're looking at the right data.
Let me check the available databases and tables first.Thank you for providing the list of databases. I can see
that there's a "github" database, which is likely to contain the information we're looking for. Let's check the
tables in this database.Now that we have information about the tables in the github database, we can query the
relevant data to answer your question about the most starred project in 2025. We'll use the repo_events_per_day
table, which contains daily event counts for each repository, including star events (WatchEvents).

Let's create a query to find the most starred project in 2025:Based on the query results, I can answer your
question about the most starred project in 2025:

The most starred project in 2025 was deepseek-ai/DeepSeek-R1, which received 84,962 stars during that year.

This project, DeepSeek-R1, appears to be an AI-related repository from the DeepSeek AI organization. It gained
significant attention and popularity among the GitHub community in 2025, earning the highest number of stars
for any project during that year.

It's worth noting that this data is based on the GitHub events recorded in the database, and it represents the
stars (WatchEvents) accumulated specifically during the year 2025. The total number of stars for this project
might be higher if we consider its entire lifespan.
```
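
The steps above assume a notebook (or an async REPL) where top-level `await` works. If you'd rather run everything as a standalone script, a minimal sketch (using only the pieces shown above) wraps the same logic in `asyncio.run`:

```python
import asyncio

from agno.agent import Agent
from agno.models.anthropic import Claude
from agno.tools.mcp import MCPTools

# Same ClickHouse SQL playground credentials as above
env = {
    "CLICKHOUSE_HOST": "sql-clickhouse.clickhouse.com",
    "CLICKHOUSE_PORT": "8443",
    "CLICKHOUSE_USER": "demo",
    "CLICKHOUSE_PASSWORD": "",
    "CLICKHOUSE_SECURE": "true",
}

async def main():
    # Launch the ClickHouse MCP Server as a subprocess and expose its tools to the agent
    async with MCPTools(
        command="uv run --with mcp-clickhouse --python 3.13 mcp-clickhouse",
        env=env,
        timeout_seconds=60,
    ) as mcp_tools:
        agent = Agent(
            model=Claude(id="claude-3-5-sonnet-20240620"),
            markdown=True,
            tools=[mcp_tools],
        )
        await agent.aprint_response("What's the most starred project in 2025?", stream=True)

if __name__ == "__main__":
    asyncio.run(main())
```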
</VerticalStepper>
Lines changed: 73 additions & 0 deletions
@@ -0,0 +1,73 @@
---
slug: /use-cases/AI/MCP/ai-agent-libraries/chainlit
sidebar_label: 'Integrate Chainlit'
title: 'How to build an AI Agent with Chainlit and the ClickHouse MCP Server'
pagination_prev: null
pagination_next: null
description: 'Learn how to use Chainlit to build LLM-based chat apps together with the ClickHouse MCP Server'
keywords: ['ClickHouse', 'MCP', 'Chainlit']
show_related_blogs: true
---

# Chainlit and the ClickHouse MCP Server

This guide explores how to combine Chainlit's powerful chat interface framework
with the ClickHouse Model Context Protocol (MCP) Server to create interactive data
applications. Chainlit enables you to build conversational interfaces for AI
applications with minimal code, while the ClickHouse MCP Server provides seamless
integration with ClickHouse's high-performance columnar database.

## Prerequisites {#prerequisites}
- You'll need an Anthropic API key
- You'll need to have [`uv`](https://docs.astral.sh/uv/getting-started/installation/) installed

## Basic Chainlit app {#basic-chainlit-app}

You can see an example of a basic chat app by running the following:

```sh
uv run --with anthropic --with chainlit chainlit run chat_basic.py -w -h
```

Then navigate to `http://localhost:8000`.
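
The `chat_basic.py` script lives in the examples repository. As a rough sketch of what such a file contains (the model id and message handling below are illustrative, not a copy of the repository version), a minimal Chainlit + Anthropic chat app looks something like this:

```python
# chat_basic.py — illustrative sketch, not the exact file from the examples repository
import anthropic
import chainlit as cl

client = anthropic.AsyncAnthropic()  # reads ANTHROPIC_API_KEY from the environment


@cl.on_message
async def on_message(message: cl.Message):
    # Forward the user's message to Claude and relay the reply back to the chat UI
    response = await client.messages.create(
        model="claude-3-5-sonnet-20240620",
        max_tokens=1024,
        messages=[{"role": "user", "content": message.content}],
    )
    await cl.Message(content=response.content[0].text).send()
```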

## Adding ClickHouse MCP Server {#adding-clickhouse-mcp-server}

Things get more interesting if we add the ClickHouse MCP Server.
You'll need to update your `.chainlit/config.toml` file to allow the `uv` command
to be used:

```toml
[features.mcp.stdio]
enabled = true
# Only the executables in the allow list can be used for MCP stdio server.
# Only need the base name of the executable, e.g. "npx", not "/usr/bin/npx".
# Please don't comment this line for now, we need it to parse the executable name.
allowed_executables = [ "npx", "uvx", "uv" ]
```

:::note config.toml
Find the full `config.toml` file in the [examples repository](https://github.com/ClickHouse/examples/blob/main/ai/mcp/chainlit/.chainlit/config.toml)
:::

There's some glue code needed to get MCP Servers working with Chainlit, so we'll
run this command to launch Chainlit instead:

```sh
uv run --with anthropic --with chainlit chainlit run chat_mcp.py -w -h
```

To add the MCP Server, click on the plug icon in the chat interface, and then
add the following command to connect to the ClickHouse SQL playground:

```sh
CLICKHOUSE_HOST=sql-clickhouse.clickhouse.com CLICKHOUSE_USER=demo CLICKHOUSE_PASSWORD= CLICKHOUSE_SECURE=true uv run --with mcp-clickhouse --python 3.13 mcp-clickhouse
```

If you want to use your own ClickHouse instance, you can adjust the values of
the environment variables.
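
For example, a sketch of the same connection command pointed at a self-hosted server — the host, user, and password below are placeholders, not values from this guide:

```sh
# Placeholder values — substitute your own host and credentials
CLICKHOUSE_HOST=my-clickhouse.example.com CLICKHOUSE_PORT=8443 CLICKHOUSE_USER=my_user CLICKHOUSE_PASSWORD=my_password CLICKHOUSE_SECURE=true uv run --with mcp-clickhouse --python 3.13 mcp-clickhouse
```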

You can then ask it questions like this:

* Tell me about the tables that you have to query
* What's something interesting about New York taxis?
Lines changed: 94 additions & 0 deletions
@@ -0,0 +1,94 @@
---
slug: /use-cases/AI/MCP/ai-agent-libraries/copilotkit
sidebar_label: 'Integrate CopilotKit'
title: 'How to build an AI Agent with CopilotKit and the ClickHouse MCP Server'
pagination_prev: null
pagination_next: null
description: 'Learn how to build an agentic application using data stored in ClickHouse with ClickHouse MCP and CopilotKit'
keywords: ['ClickHouse', 'MCP', 'copilotkit']
show_related_blogs: true
---

# CopilotKit and the ClickHouse MCP Server

This is an example of how to build an agentic application using data stored in
ClickHouse. It uses the [ClickHouse MCP Server](https://github.com/ClickHouse/mcp-clickhouse)
to query data from ClickHouse and generate charts based on the data.

[CopilotKit](https://github.com/CopilotKit/CopilotKit) is used to build the UI
and provide a chat interface to the user.

:::note Example code
The code for this example can be found in the [examples repository](https://github.com/ClickHouse/examples/tree/main/ai/mcp/copilotkit).
:::

## Prerequisites {#prerequisites}

- `Node.js >= 20.14.0`
- `uv >= 0.1.0`

## Install dependencies {#install-dependencies}

Clone the project locally: `git clone https://github.com/ClickHouse/examples` and
navigate to the `ai/mcp/copilotkit` directory.

To install all the dependencies in one go, run the script `./install.sh` and skip
the next section. If you'd rather install the dependencies manually, follow the
instructions below.

## Install dependencies manually {#install-dependencies-manually}

1. Install dependencies:

Run `npm install` to install the Node.js dependencies.

2. Install mcp-clickhouse:

Create a new folder `external` and clone the mcp-clickhouse repository into it.

```sh
mkdir -p external
git clone https://github.com/ClickHouse/mcp-clickhouse external/mcp-clickhouse
```

Install the Python dependencies and add the fastmcp CLI tool.

```sh
cd external/mcp-clickhouse
uv sync
uv add fastmcp
```

## Configure the application {#configure-the-application}

Copy the `env.example` file to `.env` and edit it to provide your `ANTHROPIC_API_KEY`.
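
For example (the key value shown is a placeholder):

```sh
cp env.example .env
# then edit .env and set your key, e.g.
# ANTHROPIC_API_KEY=sk-ant-...
```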

## Use your own LLM {#use-your-own-llm}

If you'd rather use a different LLM provider than Anthropic, you can modify the
CopilotKit runtime to use a different LLM adapter.
[Here](https://docs.copilotkit.ai/guides/bring-your-own-llm) is a list of supported
providers.

## Use your own ClickHouse cluster {#use-your-own-clickhouse-cluster}

By default, the example is configured to connect to the
[ClickHouse demo cluster](https://sql.clickhouse.com/). You can also use your
own ClickHouse cluster by setting the following environment variables (see the
example `.env` sketch after this list):

- `CLICKHOUSE_HOST`
- `CLICKHOUSE_PORT`
- `CLICKHOUSE_USER`
- `CLICKHOUSE_PASSWORD`
- `CLICKHOUSE_SECURE`
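
For example, a sketch of the relevant `.env` entries for a self-hosted instance — all values below are placeholders:

```sh
# Placeholder values — replace with your own cluster's details
CLICKHOUSE_HOST=my-clickhouse.example.com
CLICKHOUSE_PORT=8443
CLICKHOUSE_USER=my_user
CLICKHOUSE_PASSWORD=my_password
CLICKHOUSE_SECURE=true
```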

## Run the application {#run-the-application}

Run `npm run dev` to start the development server.

You can test the agent with a prompt like:

> "Show me the price evolution in Manchester for the last 10 years."

Open [http://localhost:3000](http://localhost:3000) with your browser to see
the result.
