This project is forked from coleam00/mcp-mem0
A template implementation of the Model Context Protocol (MCP) server integrated with Memobase for providing AI agents with persistent memory capabilities.
Use this as a reference point to build your own MCP servers, or give it to an AI coding assistant as an example to follow for structure and code correctness!
To run this MCP server, you need your own Memobase backend:

- You can deploy a local one
- Or use the free credits of Memobase Cloud

You should have:

- A project URL (local: `http://localhost:8019`, cloud: `https://api.memobase.dev`)
- A project token (local: `secret`, cloud: `sk-proj-xxxxxx`)
This project demonstrates how to build an MCP server that enables AI agents to store, retrieve, and search memories using semantic search. It serves as a practical template for creating your own MCP servers on top of Memobase.
The implementation follows the best practices laid out by Anthropic for building MCP servers, allowing seamless integration with any MCP-compatible client.
The server provides three essential memory management tools:

- `save_memory`: Store any information in long-term memory with semantic indexing
- `get_user_profiles`: Retrieve complete user profiles
- `search_memories`: Find relevant context for a given query
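The intended behavior of these tools can be sketched with a toy in-memory store. Everything below is illustrative: the real server delegates storage, semantic indexing, and profile extraction to Memobase, while here a plain list and naive keyword overlap stand in.

```python
# Toy sketch of the three tools' semantics. NOT the real implementation:
# the actual server hands these operations off to the Memobase backend.
memories: list[str] = []

def save_memory(text: str) -> str:
    """Store a piece of information in 'long-term memory'."""
    memories.append(text)
    return f"Saved memory: {text}"

def get_user_profiles() -> list[str]:
    """Return everything known about the user (stand-in for real profiles)."""
    return list(memories)

def search_memories(query: str, limit: int = 3) -> list[str]:
    """Rank stored memories by naive keyword overlap with the query
    (a crude stand-in for Memobase's semantic search)."""
    q = set(query.lower().split())
    scored = sorted(memories, key=lambda m: -len(q & set(m.lower().split())))
    return scored[:limit]

save_memory("The user prefers dark mode")
save_memory("The user's favorite language is Python")
print(search_memories("favorite programming language")[0])
# → The user's favorite language is Python
```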
Prerequisites:

- Python 3.11+

Installation:

- Install uv if you don't have it:

  ```bash
  pip install uv
  ```

- Clone the repository:

  ```bash
  git clone https://github.com/memodb-io/memobase
  ```

- Navigate to the project directory:

  ```bash
  cd memobase/src/mcp
  ```

- Install dependencies:

  ```bash
  uv pip install -e .
  ```

- Create a `.env` file based on `.env.example`:

  ```bash
  cp .env.example .env
  ```

- Configure your environment variables in the `.env` file (see the Configuration section)

To use Docker instead:

- Build the Docker image:

  ```bash
  docker build -t memobase-mcp --build-arg PORT=8050 .
  ```

- Create a `.env` file based on `.env.example` and configure your environment variables
The following environment variables can be configured in your `.env` file:

| Variable | Description | Example |
|---|---|---|
| `TRANSPORT` | Transport protocol (`sse` or `stdio`) | `sse` |
| `HOST` | Host to bind to when using SSE transport | `0.0.0.0` |
| `PORT` | Port to listen on when using SSE transport | `8050` |
| `MEMOBASE_API_KEY` | Memobase API key | `secret` |
| `MEMOBASE_BASE_URL` | Memobase base URL | `http://localhost:8019` |
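Putting the table together, a `.env` for a local Memobase deployment might look like this (values are the examples from the table above, not required settings):

```
TRANSPORT=sse
HOST=0.0.0.0
PORT=8050
MEMOBASE_API_KEY=secret
MEMOBASE_BASE_URL=http://localhost:8019
```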
To run with SSE transport:

```bash
# Set TRANSPORT=sse in .env, then:
uv run src/main.py
```

The MCP server will run as an API endpoint that you can connect to with the configuration shown below.

To run with Docker:

```bash
docker run --env-file .env -p 8050:8050 memobase-mcp
```

The MCP server will run as an API endpoint within the container that you can connect to with the configuration shown below.
Once you have the server running with SSE transport, you can connect to it using the configuration below (for Cursor, edit `.cursor/mcp.json`):
```json
{
  "mcpServers": {
    "memobase": {
      "transport": "sse",
      "url": "http://localhost:8050/sse"
    }
  }
}
```

Note for Windsurf users: Use `serverUrl` instead of `url` in your configuration:

```json
{
  "mcpServers": {
    "memobase": {
      "transport": "sse",
      "serverUrl": "http://localhost:8050/sse"
    }
  }
}
```
Note for n8n users: Use `host.docker.internal` instead of `localhost`, since n8n has to reach outside of its own container to the host machine.
So the full URL in the MCP node would be: http://host.docker.internal:8050/sse
Make sure to update the port if you are using a value other than the default 8050.
Add this server to your MCP configuration for Claude Desktop, Windsurf, or any other MCP client:
```json
{
  "mcpServers": {
    "memobase": {
      "command": "your/path/to/mcp/.venv/Scripts/python.exe",
      "args": ["your/path/to/mcp/src/main.py"],
      "env": {
        "TRANSPORT": "stdio",
        "MEMOBASE_API_KEY": "YOUR-API-KEY",
        "MEMOBASE_BASE_URL": "YOUR-MEMOBASE-URL"
      }
    }
  }
}
```

To run the server over stdio inside Docker instead:

```json
{
  "mcpServers": {
    "memobase": {
      "command": "docker",
      "args": ["run", "--rm", "-i",
               "-e", "TRANSPORT",
               "-e", "MEMOBASE_API_KEY",
               "-e", "MEMOBASE_BASE_URL",
               "memobase-mcp"],
      "env": {
        "TRANSPORT": "stdio",
        "MEMOBASE_API_KEY": "YOUR-API-KEY",
        "MEMOBASE_BASE_URL": "https://api.memobase.io"
      }
    }
  }
}
```

This template provides a foundation for building more complex MCP servers. To build your own:
- Add your own tools by creating methods with the `@mcp.tool()` decorator
- Create your own lifespan function to add your own dependencies (clients, database connections, etc.)
- Modify the `utils.py` file for any helper functions you need for your MCP server
- Feel free to add prompts and resources as well with `@mcp.resource()` and `@mcp.prompt()`
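As a sketch of the first bullet, the registration pattern behind `@mcp.tool()` can be illustrated with a hypothetical stand-in decorator. The `TOOLS` registry, the `tool()` factory, and the `delete_memories` function are all invented for illustration; in a real server the decorator comes from the MCP SDK's `FastMCP` instance and handles schema generation and discovery for you.

```python
from typing import Callable

# Hypothetical stand-in for the SDK's tool registry: the real @mcp.tool()
# records the decorated function (name, signature, docstring) so that MCP
# clients can discover and call it by name.
TOOLS: dict[str, Callable] = {}

def tool() -> Callable:
    """Toy decorator factory mimicking the shape of @mcp.tool()."""
    def register(fn: Callable) -> Callable:
        TOOLS[fn.__name__] = fn  # expose the tool under its function name
        return fn
    return register

@tool()
def delete_memories(user_id: str) -> str:
    """Invented example tool: wipe a user's stored memories."""
    return f"Deleted memories for user {user_id}"

print(sorted(TOOLS))  # → ['delete_memories']
```

The key idea the pattern shows: decorating a function registers it under its name, so adding a new tool to the server is just defining one more decorated function.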