Search engine for your files!
Use it yourself or plug it into your agents. Comes with a CLI, TUI, and macOS app.
Choose which directories to index, and Cosma will process all files in those directories into a search-optimized index. It'll also watch for changes to keep the index updated.
After files are indexed, you can search for them with natural language! Cosma uses vector-powered search to find files quickly and easily.
Cosma can run 100% locally or in the cloud.
Cosma has been tested on macOS (ARM) and Windows. Linux support is in progress.
Cosma can be downloaded from PyPI. The installation commands below use uvx.sh to install Cosma via uv. If you prefer using uv directly, use `uv tool install cosma`.
macOS:

```shell
curl -LsSf uvx.sh/cosma/install.sh | sh
```

Windows:

```powershell
powershell -ExecutionPolicy ByPass -c "irm https://uvx.sh/cosma/install.ps1 | iex"
```

To upgrade to the latest version:

```shell
uv tool upgrade cosma --no-cache
```

Make sure you have Ollama installed.
Cosma's backend serves search queries, so it must be started before anything else. It needs to stay running to watch for file changes and process files in the background. For persistent use, consider running it as a background process or configuring it as a system service.
```shell
cosma serve
```

> [!IMPORTANT]
> The backend must be running for the following commands to work.
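On macOS, one way to keep the backend running persistently is a per-user launchd agent. This is a sketch, not an official service definition: the label is arbitrary, and the binary path (where `uv tool install` typically places executables) is an assumption you should adjust to your install location. Save it as `~/Library/LaunchAgents/com.example.cosma.plist` and load it with `launchctl load`:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Hypothetical label; any reverse-DNS name works -->
    <key>Label</key>
    <string>com.example.cosma</string>
    <key>ProgramArguments</key>
    <array>
        <!-- launchd does not expand ~; use the absolute path to your cosma binary -->
        <string>/Users/you/.local/bin/cosma</string>
        <string>serve</string>
    </array>
    <!-- Start at login and restart if the process exits -->
    <key>RunAtLoad</key>
    <true/>
    <key>KeepAlive</key>
    <true/>
</dict>
</plist>
```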
Cosma includes a CLI with three output modes:
- `default` - Rich formatted output with colors and tables for interactive use
- `--plain` - Simple text output without colors, for piping and scripts
- `--json` - Structured JSON output for agents and programmatic access
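For agents, the `--json` mode can be consumed programmatically. The sketch below shows one way to do that; note that the result shape (a `results` list of objects with a `path` field) is hypothetical and may differ from cosma's actual schema:

```python
import json

def top_paths(raw: str, k: int = 3) -> list[str]:
    """Return up to k file paths from a JSON search response.

    Assumes a hypothetical schema: {"results": [{"path": ...}, ...]}.
    """
    data = json.loads(raw)
    return [r["path"] for r in data.get("results", [])][:k]

# In practice, `raw` would come from running the documented command, e.g.:
#   subprocess.run(["cosma", "search", "my query", "--json"],
#                  capture_output=True, text=True).stdout
sample = '{"results": [{"path": "/docs/a.md"}, {"path": "/docs/b.md"}]}'
print(top_paths(sample))  # ['/docs/a.md', '/docs/b.md']
```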
```shell
cosma search "my query"       # Search indexed files
cosma index /path/to/dir      # Index a directory
cosma status                  # Check backend status
cosma watch add /path/to/dir  # Watch directory for changes
cosma files stats             # Get file statistics
```

Run `cosma --help` for all commands.
Cosma also comes with a terminal UI (TUI).
```shell
cosma tui /path/to/directory/to/search
```

> [!WARNING]
> This will begin processing all files in the directory specified, which will take some time if running locally.
Cosma is open source, and we'd love to have you contribute! Please feel free to open an issue or pull request with code changes, or join our Discord. We'll have documentation for how best to contribute soon!
