Getting Started

Installation

First, clone or download this repository. Then you can install the dependencies and run the servers.

This project uses uv for dependency management.

  1. Install uv:
pip install uv
  2. Install the package and required dependencies:
uv sync
  3. Set up environment variables (see the Configuration section)
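As a minimal sketch of step 3, you might export the variables used later in this guide (the names are taken from the MCP configuration example below; the Redis URL assumes a local Redis instance on the default port):

```shell
# Minimal environment sketch for local development.
# REDIS_URL assumes a Redis server on localhost's default port;
# replace the OPENAI_API_KEY placeholder with your real key.
export REDIS_URL="redis://localhost:6379"
export OPENAI_API_KEY="your-openai-key"
```

For anything beyond local experiments, prefer a `.env` file or your deployment platform's secret management over shell exports.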

Running

The easiest way to start the worker, REST API server, and MCP server is to use Docker Compose. See the Docker Compose section below for more details.

You can also run each component individually via the CLI. Here's how to run the REST API server:

# Development mode (no separate worker needed, asyncio backend)
uv run agent-memory api --task-backend asyncio

# Production mode (default Docket backend; requires separate worker process)
uv run agent-memory api

Or the MCP server:

# Stdio mode (recommended for Claude Desktop)
uv run agent-memory mcp

# SSE mode for development
uv run agent-memory mcp --mode sse

# Streamable HTTP mode for network deployments
uv run agent-memory mcp --mode streamable-http --port 9000

# SSE mode for production (use Docket backend)
uv run agent-memory mcp --mode sse --task-backend docket

Core CLI Commands

| Command | Typical Use | Backend Behavior |
| --- | --- | --- |
| `uv run agent-memory api --task-backend=asyncio` | Local development (single process) | Uses asyncio inline tasks; no separate worker |
| `uv run agent-memory api` | Production API server | Defaults to docket; run `uv run agent-memory task-worker` |
| `uv run agent-memory mcp` | Claude Desktop / local stdio MCP | Defaults to asyncio; no worker required |
| `uv run agent-memory mcp --mode sse --port 9000 --task-backend docket` | Network MCP with shared workers | Uses docket; run `uv run agent-memory task-worker` |
| `uv run agent-memory task-worker --concurrency 10` | Background processing | Processes queued Docket tasks |

Using uvx in MCP clients

When configuring MCP-enabled apps (e.g., Claude Desktop), prefer uvx so the app can run the server without a local checkout:

{
  "mcpServers": {
    "memory": {
      "command": "uvx",
      "args": ["--from", "agent-memory-server", "agent-memory", "mcp"],
      "env": {
        "DISABLE_AUTH": "true",
        "REDIS_URL": "redis://localhost:6379",
        "OPENAI_API_KEY": "<your-openai-key>"
      }
    }
  }
}

Notes:

  • API keys: Default models use OpenAI. Set OPENAI_API_KEY. To use Anthropic instead, set ANTHROPIC_API_KEY and also GENERATION_MODEL to an Anthropic model (e.g., claude-3-5-haiku-20241022). If you have access to GPT-5 models, you can instead set GENERATION_MODEL to gpt-5.2-chat-latest, gpt-5.1-chat-latest, gpt-5-mini, or gpt-5-nano. See LLM Providers for all supported providers.
  • Make sure your MCP host can find uvx (on its PATH or by using an absolute command path). macOS: brew install uv. If not on PATH, set "command" to an absolute path (e.g., /opt/homebrew/bin/uvx on Apple Silicon, /usr/local/bin/uvx on Intel macOS).
  • For production, remove DISABLE_AUTH and configure auth.
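For example, an Anthropic-backed variant of the configuration above might look like the following sketch. Every key is one named elsewhere in this section; the model value is one of the Anthropic models listed in the notes:

{
  "mcpServers": {
    "memory": {
      "command": "uvx",
      "args": ["--from", "agent-memory-server", "agent-memory", "mcp"],
      "env": {
        "DISABLE_AUTH": "true",
        "REDIS_URL": "redis://localhost:6379",
        "ANTHROPIC_API_KEY": "<your-anthropic-key>",
        "GENERATION_MODEL": "claude-3-5-haiku-20241022"
      }
    }
  }
}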

For production deployments, you'll need to run a separate worker process:

uv run agent-memory task-worker

For development, the default --task-backend=asyncio on the mcp command runs tasks inline without needing a separate worker process. For the api command, use --task-backend=asyncio explicitly when you want single-process behavior.

NOTE: With uv, prefix the command with uv, e.g.: uv run agent-memory mcp --mode sse. If you installed from source, you'll probably need to add --directory to tell uv where to find the code: uv --directory <path/to/checkout> run agent-memory mcp --mode stdio.

Docker Compose

To start the API using Docker Compose, follow these steps:

  1. Ensure that Docker and Docker Compose are installed on your system.

  2. Open a terminal in the project root directory (where the docker-compose.yml file is located).

  3. (Optional) Set up your environment variables (such as OPENAI_API_KEY and ANTHROPIC_API_KEY) either in a .env file or by modifying docker-compose.yml as needed.

  4. Build and start the containers by running:

    docker compose up --build
  5. Once the containers are up, the REST API will be available at http://localhost:8000. You can also access the interactive API documentation at http://localhost:8000/docs. The MCP server will be available at http://localhost:9050/sse.

    Note: In Docker Compose, MCP is mapped as 9050:9000, so you connect to port 9050 on the host. If you run MCP directly via CLI (without Compose), the default port is 9000.

  6. To stop the containers, press Ctrl+C in the terminal and then run:

    docker compose down
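As a sketch of step 3 above, a minimal .env file placed next to docker-compose.yml could look like this (the variable names are the ones mentioned in that step; the values are placeholders you must replace):

# .env — read by Docker Compose from the project root
OPENAI_API_KEY=<your-openai-key>
ANTHROPIC_API_KEY=<your-anthropic-key>

Keep this file out of version control, since it holds secrets.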