---
title: 'Build a Personalized AI Assistant with Postgres'
description: 'Learn how to build a Supabase-powered AI assistant that combines PostgreSQL with scheduling and external tools for long-term memory, structured data management, and autonomous actions.'
categories:
  - product
tags:
  - postgres
  - ai
  - personal-assistant
date: '2025-06-25'
toc_depth: 3
author: saxon_fletcher
image: 2025-06-10-natural-db/og.png
thumb: 2025-06-10-natural-db/thumb.png
---
Large Language Models are excellent at transforming unstructured text into structured data, but they struggle to accurately retrieve that data over extended conversations. In this post, we'll leverage this core strength and combine it with Postgres, along with several complementary tools, to build a personalized AI assistant capable of long-term memory retention.

At a high level, the system's flexibility comes from combining a few core building blocks: an LLM-owned database schema accessed through an `execute_sql` tool, scheduled tasks for autonomy, web search for real-time information, and MCP integrations for actions in external tools.

See it at work in the video below.
<video className="rounded-sm m-0" autoPlay loop muted>
  <source
    src="https://xguihxuzqibwxjnimxev.supabase.co/storage/v1/object/public/videos/marketing/blog/natural-db/natural-db-demo-combined.mp4"
    type="video/mp4"
  />
</video>
## Core Pieces

### Scoped Database Control
The assistant uses a dedicated Postgres schema called `memories` to store all of its structured data. To ensure security, the LLM operates under a specific role, `memories_role`, which is granted permissions only within this schema. A sketch of this setup follows the list below.

- **Scoped Schema**: The LLM can create tables, store data, and perform operations exclusively within the `memories` schema by calling an `execute_sql` tool.
- **System Table Protection**: All other schemas, including `public`, are inaccessible to the LLM.
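As a minimal sketch, this is roughly what that scoping could look like in SQL; the repository's migration is the authoritative version:

```sql
-- Illustrative only: confine a role to a single schema.
create schema if not exists memories;
create role memories_role nologin;

grant usage, create on schema memories to memories_role;

-- Tables later created in memories are accessible to the role;
-- no grants are issued on public or any other schema.
alter default privileges in schema memories
  grant all on tables to memories_role;
```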
### Messages Context

Three complementary memory types maintain conversation continuity:

- **Message History (Short-term Memory)**: Maintains a chronological list of recent messages for immediate context
- **Semantic Memory (Vector Search using pgvector)**: Stores conversation embeddings using pgvector for fuzzy concept retrieval ("that productivity thing we talked about last month"); sketched after this list
- **Structured Memory (SQL Data)**: Stores concrete facts in LLM-created tables for precise queries ("How much did I spend on coffee last quarter?")
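As a rough sketch of the semantic layer, assuming a hypothetical `memories.message_embeddings` table (in practice the LLM creates its own schema at runtime):

```sql
-- Hypothetical table backing the semantic memory layer.
create table memories.message_embeddings (
  id bigint generated always as identity primary key,
  content text not null,
  embedding vector(1536), -- pgvector, sized for text-embedding-3-small
  created_at timestamptz default now()
);

-- Fuzzy retrieval: the five stored messages nearest to a query embedding.
-- :query_embedding is a placeholder for a vector produced at request time.
select content
from memories.message_embeddings
order by embedding <=> :query_embedding
limit 5;
```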
### Scheduled Prompts

The system achieves autonomy through scheduled prompts, powered by `pg_cron` through a dedicated tool. Scheduled prompts call the same Edge Functions as a normal prompt via `pg_net` and can therefore use all the same tools. A sketch of this wiring follows the example below.

**Example**: "Every Sunday at 6 PM, analyze my portfolio performance and research market trends"

1. A cron job executes the prompt every Sunday at 6 PM.
2. The LLM retrieves data from relevant tables in your `memories` schema, like current portfolio holdings.
3. A web search is triggered to find relevant market news and competitor analysis based on that data.
4. The search results are transformed into structured data and stored in your database.
5. A personalized email report is sent using Zapier MCP.
6. Future queries like "How has my portfolio performed compared to market trends?" reference this data.
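As a sketch of that wiring, a scheduled prompt could be registered like this; the function URL and payload here are illustrative placeholders, since the dedicated tool generates the real job:

```sql
-- Illustrative: run a prompt every Sunday at 6 PM via pg_cron + pg_net.
select cron.schedule(
  'weekly-portfolio-review',
  '0 18 * * 0',
  $$
  select net.http_post(
    url := 'https://<PROJECT>.supabase.co/functions/v1/natural-db',
    body := jsonb_build_object(
      'prompt', 'Analyze my portfolio performance and research market trends'
    )
  );
  $$
);
```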
### Web Search

The system uses the built-in web search tools offered by LLM providers, such as OpenAI's web search tool, to access real-time information and current events. The results can then be stored as structured data:

```sql
-- Auto-generated from web search results
CREATE TABLE research_findings (
  topic TEXT,
  source_url TEXT,
  key_insights TEXT[],
  credibility_score INTEGER,
  search_date TIMESTAMPTZ DEFAULT NOW()
);
```
### Zapier MCP Integration

Through Zapier's MCP integration, your assistant can:

- Read and send emails (Gmail)
- Manage calendar events
- Update spreadsheets
- Send notifications (Slack, Discord, SMS)
- Create tasks (Trello, Asana, Notion)
- Control smart home devices
### Input/Output Integration

The system uses a Telegram bot as the default interface, which calls an Edge Function via webhook. You can swap in any interface you like, such as a web page or a voice interface.
### Self-Evolving System Prompt

The assistant maintains two behavioral layers:

- **Base Behavior**: Core functionality (database operations, scheduling, web search) remains consistent via a constant system prompt
- **Personalized Behavior**: Communication style and preferences that evolve based on user feedback; these can be changed via a dedicated tool and are stored in a `public.system_prompts` table

When you say "be more formal" or "address me by name," these preferences are stored with version history and persist across all conversations, creating a personalized experience.
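One plausible shape for that versioned store (hypothetical; the repository's migration defines the real table):

```sql
-- Hypothetical versioned prompt store.
create table public.system_prompts (
  id bigint generated always as identity primary key,
  prompt text not null,
  version integer not null,
  created_at timestamptz default now()
);

-- "Be more formal" appends a new version instead of overwriting history.
insert into public.system_prompts (prompt, version)
values ('Address the user by name and keep a formal tone.', 2);
```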
## Use Cases

### Run Tracking

![Run tracking dashboard showing activity history and statistics](/images/blog/2025-06-10-natural-db/runs.png)

**Prompt**: "Help me track my daily runs by sending me a reminder each morning with details on my previous day's run"

1. The LLM creates a `runs` table to store distance, duration, route, weather conditions, and personal notes for each run (a plausible table shape is sketched after this list).
2. The LLM also creates a cron job that fires daily.
3. Every morning a scheduled prompt triggers the LLM to query the `runs` table and send a run reminder with details via Telegram.
4. The user submits run details via Telegram, which are stored in the `runs` table.
5. Optionally, a monthly cron job summarizes running patterns, highlights achievements, and suggests training adjustments based on progress.
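A plausible shape for that table (hypothetical; the LLM decides the actual columns at runtime):

```sql
-- Hypothetical runs table created by the LLM in the memories schema.
create table memories.runs (
  id bigint generated always as identity primary key,
  run_date date not null,
  distance_km numeric(5, 2),
  duration interval,
  route text,
  weather text,
  notes text
);
```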
### Personal Recipe & Meal Planning

**Prompt**: "Help me track my meals and suggest recipes based on what I have in my kitchen"

1. The LLM creates `recipes`, `ingredients`, `meal_history`, and `meal_ratings` tables to store cooking experiences, dietary preferences, and meal satisfaction.
2. The LLM also creates a cron job that fires daily.
3. Every morning a scheduled prompt triggers the LLM to query the `meal_history` table and suggest recipes via Telegram based on available ingredients.
4. The user submits meal details and ratings via Telegram, which are stored in the `meal_history` and `meal_ratings` tables.
5. Optionally, a weekly cron job analyzes cooking patterns, suggests grocery lists, and recommends new recipes based on preferences.
### Company Feedback Analysis

**Prompt**: "Help me track customer feedback by analyzing support tickets daily and giving me weekly summaries"

1. The LLM creates a `feedback` table to store ticket analysis, themes, sentiment scores, and product areas.
2. The LLM also creates a cron job that fires daily.
3. Every morning a scheduled prompt triggers the LLM to fetch new tickets via MCP, analyze them, and store the findings in the `feedback` table.
4. The user receives daily feedback alerts via Telegram with key insights and ticket summaries.
5. Optionally, a weekly cron job generates comprehensive feedback reports, highlighting trends and actionable insights.
### Interest-Based Article Bookmarker

**Prompt**: "Help me track interesting articles about AI and climate change, reminding me of important ones I haven't read"

1. The LLM creates an `articles` table to store article metadata, read status, relevance scores, and user interests.
2. The LLM also creates a cron job that fires daily.
3. Every morning a scheduled prompt triggers the LLM to search for new articles via web search, analyze their relevance, and store them in the `articles` table.
4. The user receives daily article recommendations via Telegram with personalized reading suggestions.
5. Optionally, a weekly cron job summarizes reading patterns, highlights must-read articles, and suggests new topics based on interests.
## Implementation Guide

### Prerequisites

- Supabase account (free tier sufficient)
- OpenAI API key
- Telegram bot token
- Zapier account (optional)
### Optional: Using the CLI

If you prefer the command line, you can use the Supabase CLI to set up your database and Edge Functions. This replaces **Step 1** and **Step 2**.

1. **Clone the repository**.

   ```bash
   git clone https://github.com/supabase-community/natural-db.git
   cd natural-db
   ```

2. **Log in to the Supabase CLI and link your project**.

   Create a new project on the [Supabase Dashboard](https://supabase.com/dashboard), then run:

   ```bash
   supabase login
   supabase link --project-ref <YOUR-PROJECT-ID>
   ```

3. **Push the database schema**.

   ```bash
   supabase db push
   ```

4. **Deploy Edge Functions**.

   ```bash
   supabase functions deploy --no-verify-jwt
   ```

After completing these steps, you can proceed to **Step 3: Telegram Bot**.
### Step 1: Database Setup

Run the migration SQL in your Supabase SQL editor: [migration.sql](https://github.com/supabase-community/natural-db/blob/main/supabase/migrations/001_create_initial_schema.sql)

The migration does the following (an abridged sketch of the extension setup follows the list):

- Sets up required extensions like `pgvector` and `pg_cron`.
- Creates the `memories` schema for the assistant's data.
- Creates the `memories_role` with permissions scoped to the `memories` schema.
- Configures cron job scheduling.
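A minimal sketch of the extension setup (the linked migration is authoritative):

```sql
-- Enable the extensions the assistant relies on.
create extension if not exists vector;   -- pgvector, for semantic memory
create extension if not exists pg_cron;  -- scheduled prompts
create extension if not exists pg_net;   -- HTTP calls from cron jobs
```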
### Step 2: Edge Functions

Create three functions in the Supabase dashboard:

**natural-db**: Main AI brain handling all processing, database operations, scheduling, and tool integration

- [natural-db/index.ts](https://github.com/supabase-community/natural-db/blob/main/supabase/functions/natural-db/index.ts)
- [natural-db/db-utils.ts](https://github.com/supabase-community/natural-db/blob/main/supabase/functions/natural-db/db-utils.ts)
- [natural-db/tools.ts](https://github.com/supabase-community/natural-db/blob/main/supabase/functions/natural-db/tools.ts)

**telegram-input**: Webhook handler for incoming messages with user validation and timezone management

- [telegram-input/index.ts](https://github.com/supabase-community/natural-db/blob/main/supabase/functions/telegram-input/index.ts)

**telegram-outgoing**: Response formatter and delivery handler with error management

- [telegram-outgoing/index.ts](https://github.com/supabase-community/natural-db/blob/main/supabase/functions/telegram-outgoing/index.ts)
### Step 3: Telegram Bot

1. Create a bot via [@BotFather](https://t.me/botfather)
2. Set the webhook: `https://api.telegram.org/bot[TOKEN]/setWebhook?url=https://[PROJECT].supabase.co/functions/v1/telegram-input`
### Step 4: Environment Variables

Set the following environment variables in your Supabase project settings (Project Settings → Edge Functions):

##### Required Variables

- `OPENAI_API_KEY`: Your OpenAI API key
- `TELEGRAM_BOT_TOKEN`: Bot token from @BotFather
- `ALLOWED_USERNAMES`: Comma-separated list of allowed Telegram usernames
- `TELEGRAM_WEBHOOK_SECRET`: Secret token for webhook validation

##### Optional Variables

- `OPENAI_MODEL`: OpenAI model to use (defaults to "gpt-4.1-mini")
- `ZAPIER_MCP_URL`: MCP server URL for Zapier integrations
### Step 5: Test Integration

Try these commands with your bot:

- "Store my grocery budget as $400 monthly"
- "What's the weather today?" (web search)
- "Remind me to exercise every Monday at 7 AM"
- "Be more enthusiastic when I discuss hobbies" (personality)
## Cost Considerations

Based on 10 messages per day (300 messages/month):

- **Supabase**: Free tier (500MB database, 5GB bandwidth) - $0/month
- **OpenAI GPT-4.1-mini**: $0.40 per 1M input tokens, $1.60 per 1M output tokens
  - Average 1,200 input + 800 output tokens per message
  - Input: 300 messages × 1,200 tokens × $0.40/1M = $0.144/month
  - Output: 300 messages × 800 tokens × $1.60/1M = $0.384/month
  - Total OpenAI: ~$0.53/month
- **Telegram**: Free API usage
- **Zapier**: Free tier (300 tasks/month) - $0/month
- **Vector Embeddings**: $0.02 per 1M tokens (text-embedding-3-small)
  - 300 messages × 1,200 tokens × $0.02/1M = $0.0072/month

**Total monthly cost: ~$0.54**
## Make it your own

This project showcases how combining modular components, with LLMs as just one piece, can create systems that are greater than the sum of their parts. I hope this inspires you to build and deploy your own personalized AI assistant while maintaining full control over your code and data. For additional inspiration, check out [this excellent post by Geoffrey Litt](https://www.geoffreylitt.com/2025/04/12/how-i-made-a-useful-ai-assistant-with-one-sqlite-table-and-a-handful-of-cron-jobs).

Ready to build your own AI assistant? Check out the [GitHub repository](https://github.com/supabase-community/natural-db) to get started, contribute improvements, or share your own use cases.