This app is a Bun + Express document agent that uses Redis for:
- Session storage
- Short-term chat history
- Long-term, episodic, and semantic memory
- JSON document storage
- Vector search over document chunks
- Redis Streams-backed logging
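As an illustration of how these pieces might map onto Redis keys, here is a hypothetical naming scheme; the names below are assumptions for illustration, not the app's actual keys:

```typescript
// Hypothetical Redis key layout for the features listed above.
// These names are illustrative, not the app's real key scheme.
const keys = {
  session: (id: string) => `session:${id}`,            // session storage
  chatHistory: (id: string) => `chat:${id}:history`,   // short-term chat history
  memory: (userId: string, kind: "episodic" | "semantic") =>
    `memory:${userId}:${kind}`,                        // long-term memory
  doc: (docId: string) => `doc:${docId}`,              // JSON document storage
  chunk: (docId: string, i: number) => `doc:${docId}:chunk:${i}`, // vector chunks
};

console.log(keys.chatHistory("abc")); // → "chat:abc:history"
```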
By default, the app runs in a deterministic local demo mode that loads bundled markdown docs from data/documents into Redis. You can optionally switch to live crawl mode with Tavily.
- Bun
- Docker for the bundled Redis service
- One LLM provider key:
  - Recommended: OpenAI
  - Optional: Google Vertex AI or Anthropic
- Create a local env file:

```sh
cp .env.example .env
```

- Install dependencies:

```sh
bun install
```

- Start Redis in Docker:

```sh
bun run docker:redis
```

- Start the app:

```sh
bun run dev
```

Open http://localhost:8080.
This is the default mode.
```sh
CRAWL_SOURCE=local
```

- The app loads bundled docs from data/documents
- No Tavily key is required
- Best fit for the tutorial flow and local testing
Switch to the Tavily-backed crawl mode when you want live document ingestion.
```sh
CRAWL_SOURCE=tavily
TAVILY_API_KEY=...
```

In live crawl mode, the app uses the URL and crawl instructions extracted from the project prompt.
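A minimal sketch of how the CRAWL_SOURCE switch could be read; the function name and error message here are assumptions, not the app's actual code:

```typescript
// Hypothetical helper: decide the crawl mode from the environment.
// Defaults to the local demo mode when CRAWL_SOURCE is unset.
type CrawlSource = "local" | "tavily";

function resolveCrawlSource(env: Record<string, string | undefined>): CrawlSource {
  const source = env.CRAWL_SOURCE ?? "local";
  if (source === "tavily") {
    if (!env.TAVILY_API_KEY) {
      // Live crawl mode needs a Tavily key, per the config above.
      throw new Error("CRAWL_SOURCE=tavily requires TAVILY_API_KEY");
    }
    return "tavily";
  }
  return "local";
}

console.log(resolveCrawlSource({})); // → "local"
```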
```sh
bun run dev           # app + CSS watcher
bun run test          # test suite
bun run ts            # TypeScript checks
bun run build         # production build
bun run format        # Prettier
bun run docker        # Redis + app in Docker
bun run docker:redis  # Redis only in Docker
```

The bundled Docker setup uses:

- The redis:alpine image
- Port 6300 on the host, redis://redis:6379 inside Compose
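The corresponding Compose service might look roughly like this; this is a sketch based on the image and ports above, and the bundled compose file may differ in detail:

```yaml
# Illustrative sketch, not the bundled compose file.
services:
  redis:
    image: redis:alpine
    ports:
      - "6300:6379" # host 6300 → container 6379
```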
For local development outside Docker, the app reads REDIS_URL from .env.
You can also point the app at Redis Cloud by setting REDIS_URL to your Redis Cloud connection string.
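For instance, a fallback to the Docker host port shown above could be sketched as follows; the helper name and the default value are assumptions, not necessarily what the app does:

```typescript
// Hypothetical helper: resolve the Redis connection string from the
// environment, falling back to the host port published by the bundled
// Docker setup (an assumption for this sketch).
function resolveRedisUrl(env: Record<string, string | undefined>): string {
  return env.REDIS_URL ?? "redis://localhost:6300";
}

console.log(resolveRedisUrl({})); // → "redis://localhost:6300"
```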
Example:
```sh
REDIS_URL="redis://default:<password>@redis-xxxxx.region.provider.redns.redis-cloud.com:12345"
```

- Create a project and provide a title plus a working brief.
- Load source docs into Redis.
- Chunk and embed those docs for vector search.
- Ask questions over the indexed docs.
- Edit markdown and reuse editing preferences through Redis-backed memory.
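The chunking step in the list above could be sketched as a simple fixed-size splitter with overlap; real chunkers often split on headings or sentence boundaries, and the size and overlap parameters here are illustrative assumptions:

```typescript
// Illustrative fixed-size chunker for markdown text, with overlap so
// context is not lost at chunk boundaries. Parameters are assumptions,
// not the app's actual settings.
function chunkText(text: string, size = 200, overlap = 40): string[] {
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += size - overlap) {
    chunks.push(text.slice(start, start + size));
    if (start + size >= text.length) break; // last chunk reached the end
  }
  return chunks;
}

console.log(chunkText("a".repeat(500)).length); // → 3
```

Each chunk would then be embedded and stored alongside its source document for vector search.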