askimo.chat · Local-first AI agent platform: chat, search, run, automate.
📥 Download • 📖 Documentation • 💬 Discussions • ⭐ Star on GitHub
Askimo is a local-first AI desktop app and CLI. It connects AI models to your local files, tools, and workflows without routing data through a cloud service.
It supports multiple providers (OpenAI, Claude, Gemini, Ollama, and others), persistent chat sessions backed by SQLite, document and code search via hybrid RAG (BM25 + vector), MCP tool integration, and a script runner that executes Python, Bash, and JavaScript inline. All state lives on disk.
- Multi-provider - Configure and switch between cloud and local AI providers per session. Supported: OpenAI, Anthropic, Google Gemini, xAI Grok, Ollama, LM Studio, LocalAI, Docker AI, and any OpenAI-compatible endpoint
- Persistent sessions - Conversations are stored in a local SQLite database and restored on restart
- RAG - Index local folders, files, and web URLs. Uses hybrid BM25 + vector retrieval with an AI-based classifier that skips retrieval when the query does not need it
- Script runner - Execute Python, Bash, and JavaScript directly from chat. Python scripts run in an auto-managed virtualenv with automatic dependency installation
- Vision - Attach images to conversations; works with any multimodal model
- MCP tool integration - Connect MCP-compatible servers via stdio or HTTP, scoped globally or per project
- Recipe automation (CLI) - Define prompt templates in YAML with variables, file I/O, and post-actions. Run with askimo -r <recipe> [args]
- Local telemetry - Tracks token usage, estimated cost, and RAG performance per provider. Nothing is uploaded
- i18n - UI available in English, Chinese (Simplified and Traditional), Japanese, Korean, French, Spanish, German, Portuguese, and Vietnamese
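The hybrid RAG bullet above blends a lexical score (BM25) with a semantic one (vector similarity). The fusion step can be sketched as follows; this is a toy illustration — the scoring details, the `alpha` weight, and the tiny corpus are assumptions, not Askimo's actual implementation:

```python
import math

def bm25_score(query_terms, doc_terms, corpus, k1=1.5, b=0.75):
    """Toy BM25: score one document's terms against a query over a small corpus."""
    n = len(corpus)
    avg_len = sum(len(d) for d in corpus) / n
    score = 0.0
    for term in query_terms:
        df = sum(1 for d in corpus if term in d)  # document frequency
        if df == 0:
            continue
        idf = math.log(1 + (n - df + 0.5) / (df + 0.5))
        tf = doc_terms.count(term)
        score += idf * (tf * (k1 + 1)) / (tf + k1 * (1 - b + b * len(doc_terms) / avg_len))
    return score

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def hybrid_rank(query_terms, query_vec, docs, alpha=0.5):
    """Blend lexical and semantic scores; alpha is an assumed mixing weight."""
    corpus = [d["terms"] for d in docs]
    scored = []
    for d in docs:
        lex = bm25_score(query_terms, d["terms"], corpus)
        sem = cosine(query_vec, d["vec"])
        scored.append((alpha * lex + (1 - alpha) * sem, d["id"]))
    return sorted(scored, reverse=True)

docs = [
    {"id": "a", "terms": ["install", "askimo", "cli"], "vec": [1.0, 0.0]},
    {"id": "b", "terms": ["recipe", "yaml", "template"], "vec": [0.0, 1.0]},
]
print(hybrid_rank(["install", "cli"], [0.9, 0.1], docs)[0][1])  # "a" ranks first
```

A real pipeline would also include the AI-based classifier mentioned above, which decides whether to run retrieval at all for a given query.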
Cloud: OpenAI · Anthropic Claude · Google Gemini · xAI Grok
Local: Ollama · LM Studio · LocalAI · Docker AI
Works with any OpenAI-compatible endpoint via custom base URL.
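"OpenAI-compatible" means the server accepts the same chat-completions request shape at the same path, so only the base URL changes. A minimal sketch of composing such a request — the base URL and model name below are placeholders, not Askimo defaults:

```python
import json

def build_chat_request(base_url, model, prompt):
    """Compose the URL and JSON body for an OpenAI-compatible chat completion."""
    url = base_url.rstrip("/") + "/v1/chat/completions"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, json.dumps(body)

# e.g. a local server (Ollama, LM Studio) exposing the OpenAI-compatible API
url, payload = build_chat_request("http://localhost:11434", "llama3", "Hello")
print(url)  # http://localhost:11434/v1/chat/completions
```

Because every listed provider speaks this shape (natively or via its compatibility layer), switching providers is a matter of swapping the base URL, model name, and credentials.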
Download for macOS, Windows, or Linux →
After installation, open Askimo, configure a provider (API key for cloud models, or point it at a running Ollama instance), and start a session. Setup guide →
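For the local-provider path, the only prerequisite is a reachable Ollama instance. One way to check that before configuring it — the port 11434 and the `/api/tags` model-list endpoint are Ollama's, and this helper is illustrative, not part of Askimo:

```python
from urllib import request, error

def ollama_reachable(base_url="http://localhost:11434", timeout=2.0):
    """Return True if an Ollama server answers its model-list endpoint."""
    try:
        with request.urlopen(base_url.rstrip("/") + "/api/tags", timeout=timeout) as resp:
            return resp.status == 200
    except (error.URLError, OSError):
        return False

print(ollama_reachable())  # False unless a local Ollama is running
```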
- Memory: 50-300 MB (AI models require additional memory depending on the provider)
- OS: macOS 11+, Windows 10+, Linux (Ubuntu 20.04+, Debian 11+, Fedora 35+)
- Disk: 250 MB for the application
RAG:
Script runner:
MCP tools:
Askimo also ships as a native CLI binary built with GraalVM. Useful for scripting, automation, and headless environments.
# macOS/Linux
curl -sSL https://raw.githubusercontent.com/haiphucnguyen/askimo/main/tools/installation/install.sh | bash
# Windows (PowerShell)
iwr -useb https://raw.githubusercontent.com/haiphucnguyen/askimo/main/tools/installation/install.ps1 | iex
- JDK 21+
- Git
git clone https://github.com/haiphucnguyen/askimo.git
cd askimo
# Run the desktop app
./gradlew :desktop:run
# Build native installers
./gradlew :desktop:package
# Build CLI native binary (requires GraalVM)
./gradlew :cli:nativeImage
- desktop/ - Compose Multiplatform desktop application
- desktop-shared/ - Shared UI components usable across products
- cli/ - JLine3 REPL and GraalVM native image
- shared/ - Core logic: providers, RAG, MCP, memory, tools, database
See CONTRIBUTING.md for development guidelines, code style, and DCO requirements.
For full developer documentation, see the Development Getting Started Guide.
UI is available in: English, Chinese (Simplified and Traditional), Japanese, Korean, French, Spanish, German, Portuguese, Vietnamese.
Want to add a language? Open a discussion.
Bug reports, feature requests, and pull requests are welcome. See CONTRIBUTING.md for details.
AGPLv3. See LICENSE.