First, clone or download this repository. Once you have it locally, you can install the dependencies and run the servers.

This project uses `uv` for dependency management.
- Install uv:

  ```bash
  pip install uv
  ```

- Install the package and required dependencies:

  ```bash
  uv sync
  ```

- Set up environment variables (see the Configuration section)
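As a starting point, the environment variables can live in a `.env` file or your shell profile. This is only a sketch: the variable names below (`REDIS_URL`, `OPENAI_API_KEY`, `DISABLE_AUTH`) are the ones that appear in the MCP configuration example later in this document — consult the Configuration section for the authoritative list.

```shell
# Sketch of a local development environment; see the Configuration
# section for the full, authoritative variable list.
export REDIS_URL="redis://localhost:6379"   # Redis connection string
export OPENAI_API_KEY="<your-openai-key>"   # required for the default OpenAI models
export DISABLE_AUTH="true"                  # development only -- remove in production
```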
The easiest way to start the worker, REST API server, and MCP server is to use Docker Compose. See the Docker Compose section below for more details.
You can also run each component directly via CLI commands. Here's how to run the REST API server:
```bash
# Development mode (no separate worker needed, asyncio backend)
uv run agent-memory api --task-backend asyncio

# Production mode (default Docket backend; requires a separate worker process)
uv run agent-memory api
```

Or the MCP server:
```bash
# Stdio mode (recommended for Claude Desktop)
uv run agent-memory mcp

# SSE mode for development
uv run agent-memory mcp --mode sse

# Streamable HTTP mode for network deployments
uv run agent-memory mcp --mode streamable-http --port 9000

# SSE mode for production (use the Docket backend)
uv run agent-memory mcp --mode sse --task-backend docket
```

| Command | Typical Use | Backend Behavior |
|---|---|---|
| `uv run agent-memory api --task-backend=asyncio` | Local development (single process) | Uses asyncio inline tasks; no separate worker |
| `uv run agent-memory api` | Production API server | Defaults to docket; run `uv run agent-memory task-worker` |
| `uv run agent-memory mcp` | Claude Desktop / local stdio MCP | Defaults to asyncio; no worker required |
| `uv run agent-memory mcp --mode sse --port 9000 --task-backend docket` | Network MCP with shared workers | Uses docket; run `uv run agent-memory task-worker` |
| `uv run agent-memory task-worker --concurrency 10` | Background processing | Processes queued Docket tasks |
When configuring MCP-enabled apps (e.g., Claude Desktop), prefer `uvx` so the app can run the server without a local checkout:
```json
{
  "mcpServers": {
    "memory": {
      "command": "uvx",
      "args": ["--from", "agent-memory-server", "agent-memory", "mcp"],
      "env": {
        "DISABLE_AUTH": "true",
        "REDIS_URL": "redis://localhost:6379",
        "OPENAI_API_KEY": "<your-openai-key>"
      }
    }
  }
}
```

Notes:
- API keys: Default models use OpenAI, so set `OPENAI_API_KEY`. To use Anthropic instead, set `ANTHROPIC_API_KEY` and also set `GENERATION_MODEL` to an Anthropic model (e.g., `claude-3-5-haiku-20241022`). If you have access to GPT-5 models, you can instead set `GENERATION_MODEL` to `gpt-5.2-chat-latest`, `gpt-5.1-chat-latest`, `gpt-5-mini`, or `gpt-5-nano`. See LLM Providers for all supported providers.
- Make sure your MCP host can find `uvx` (on its PATH or via an absolute command path). On macOS: `brew install uv`. If `uvx` is not on PATH, set `"command"` to an absolute path (e.g., `/opt/homebrew/bin/uvx` on Apple Silicon, `/usr/local/bin/uvx` on Intel macOS).
- For production, remove `DISABLE_AUTH` and configure authentication.
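Putting those notes together, an entry configured for Anthropic models might look like the sketch below. The model name and env vars mirror the ones listed above, and the absolute `uvx` path is shown for hosts that don't inherit your shell's PATH — adjust both for your setup:

```json
{
  "mcpServers": {
    "memory": {
      "command": "/opt/homebrew/bin/uvx",
      "args": ["--from", "agent-memory-server", "agent-memory", "mcp"],
      "env": {
        "DISABLE_AUTH": "true",
        "REDIS_URL": "redis://localhost:6379",
        "ANTHROPIC_API_KEY": "<your-anthropic-key>",
        "GENERATION_MODEL": "claude-3-5-haiku-20241022"
      }
    }
  }
}
```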
For production deployments, you'll need to run a separate worker process:

```bash
uv run agent-memory task-worker
```

For development, the default `--task-backend=asyncio` on the `mcp` command runs tasks inline without needing a separate worker process. For the `api` command, pass `--task-backend=asyncio` explicitly when you want single-process behavior.
NOTE: With uv, prefix the command with `uv`, e.g.: `uv run agent-memory mcp --mode sse`. If you installed from source, you'll probably need to add `--directory` to tell uv where to find the code: `uv --directory <path/to/checkout> run agent-memory mcp --mode stdio`.
To start the API using Docker Compose, follow these steps:

1. Ensure that Docker and Docker Compose are installed on your system.

2. Open a terminal in the project root directory (where the `docker-compose.yml` file is located).

3. (Optional) Set up your environment variables (such as `OPENAI_API_KEY` and `ANTHROPIC_API_KEY`) either in a `.env` file or by modifying `docker-compose.yml` as needed.

4. Build and start the containers by running:

   ```bash
   docker compose up --build
   ```

5. Once the containers are up, the REST API will be available at http://localhost:8000, with interactive API documentation at http://localhost:8000/docs. The MCP server will be available at http://localhost:9050/sse.

   Note: In Docker Compose, the MCP port is mapped as `9050:9000`, so you connect to port `9050` on the host. If you run the MCP server directly via the CLI (without Compose), the default port is `9000`.

6. To stop the containers, press Ctrl+C in the terminal and then run:

   ```bash
   docker compose down
   ```