A local-first AI chat client built with Tauri 2.0 and Next.js
chatless is a desktop AI chat client that supports multiple cloud-based AI services (OpenAI, Anthropic, DeepSeek, etc.) and local models (Ollama, LM Studio). All data is stored locally with support for document parsing and vector retrieval (RAG).
Key Features:
- Multi-AI Provider Support - Cloud services and local models
- Document Parsing - PDF, Word, Markdown formats
- Image Analysis - Vision model support for image understanding
- Local RAG Knowledge Base - Vector retrieval for improved accuracy
- MCP Protocol Integration - Extend with third-party tools
- Prompt Management - Quick access to commonly used prompts
- WebDAV Sync (Prompts) - Multi-device prompt sync (JSON-Chunk)
- Local Data Storage - Privacy protection
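The RAG feature above rests on vector similarity: document chunks are embedded, and the chunks closest to the query embedding are supplied to the model as context. A minimal sketch of the retrieval step (cosine similarity over tiny hand-made vectors for illustration; this is not chatless's actual index code):

```typescript
// Cosine similarity: the core comparison behind vector retrieval (RAG).
// Vectors here are tiny and hand-made for illustration; a real index
// would hold embedding-model output with hundreds of dimensions.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank chunks by similarity to the query embedding and keep the top k.
function topK(
  query: number[],
  chunks: { text: string; embedding: number[] }[],
  k: number
): string[] {
  return chunks
    .map((c) => ({ text: c.text, score: cosineSimilarity(query, c.embedding) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k)
    .map((c) => c.text);
}
```

Retrieving only the top-k most similar chunks keeps the prompt small while grounding the answer in local documents.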
For more feature demonstrations, visit the complete documentation
Download the installation package for your system from GitHub Releases:
- Windows - `.exe` or `.msi` installer
- macOS - choose the `aarch64` (Apple Silicon) or `x64` (Intel) version
- Linux - `.deb`, `.rpm`, or `.AppImage` format
For detailed installation steps, refer to the Installation Guide
- Open the app and go to Settings
- Add API keys for AI providers
- (Optional) Configure Ollama for local models
- Start chatting
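Cloud and local providers alike commonly expose OpenAI-compatible chat endpoints, which is what makes a single client able to talk to OpenAI, DeepSeek, and Ollama. A hedged sketch of the request body such a client sends (the `buildChatRequest` helper and its shape are illustrative assumptions, not chatless's internals):

```typescript
// Sketch of an OpenAI-compatible chat completion request body.
// buildChatRequest and ChatMessage are illustrative names only.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(model: string, messages: ChatMessage[]) {
  return {
    model,        // e.g. a cloud model id, or a local Ollama model name
    messages,     // full conversation history, oldest first
    stream: true, // chat UIs typically stream tokens as they arrive
  };
}

const body = buildChatRequest("llama3", [
  { role: "user", content: "Hello!" },
]);
console.log(JSON.stringify(body));
```

The same payload shape works whether the base URL points at a cloud API or a local Ollama/LM Studio server.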
Configure WebDAV in Settings → Sync:
- Remote structure: `{basePath}/prompts/data/{id}.json` and `.../metadata.json`
- Deletion: remote files are not deleted; `deleted_at` tombstones are synced
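The tombstone approach can be sketched as follows. This is a minimal illustration of the idea, assuming each prompt record carries `updated_at`/`deleted_at` timestamps; the field names and merge helper are assumptions, not the actual sync code:

```typescript
// Minimal sketch of tombstone-based sync: a deleted prompt is never
// removed from the remote store; instead its record keeps a deleted_at
// timestamp, and the more recently updated side of each pair wins.
// (Illustrative only — schema assumed, not chatless's real one.)
interface PromptRecord {
  id: string;
  content: string;
  updated_at: number;        // Unix millis of the last change
  deleted_at: number | null; // tombstone: set instead of deleting the file
}

function mergePrompts(
  local: PromptRecord[],
  remote: PromptRecord[]
): PromptRecord[] {
  const merged = new Map<string, PromptRecord>();
  for (const rec of [...local, ...remote]) {
    const prev = merged.get(rec.id);
    // Last-writer-wins: the copy touched most recently is kept, so a
    // fresh tombstone overrides an older edit (and vice versa).
    if (!prev || rec.updated_at > prev.updated_at) {
      merged.set(rec.id, rec);
    }
  }
  return [...merged.values()];
}
```

Keeping tombstones instead of deleting files means a deletion on one device propagates to the others rather than being silently resurrected on the next sync.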
For complete usage instructions, visit the User Documentation
```bash
# Clone repository
git clone https://github.com/kamjin3086/chatless.git
cd chatless

# Install dependencies
pnpm install

# Start development server
pnpm tauri dev

# Build application
pnpm tauri build
```

Tech Stack: Tauri 2.0, Next.js 15, TypeScript, Rust, SQLite
For more development information, see Contributing Guide
| Documentation | Link |
|---|---|
| User Guide | View Docs |
| Quick Start | Quick Start |
| Features | Feature Details |
| FAQ | FAQ |
| Roadmap | Roadmap |
Q: The app crashes immediately on Windows startup.
A: Some Windows environments are missing the Microsoft Visual C++ runtime. Download and install the Microsoft Visual C++ Redistributable, then launch the app again.
For more solutions, see FAQ Documentation
- Bug Reports - GitHub Issues
- Feature Discussions - GitHub Discussions
- In-app Feedback - Settings → Feedback
Issues and Pull Requests are welcome. Please read the Contributing Guide before contributing.
MIT License © 2025 chatless
Thanks to the following open source projects:
- Tauri - Cross-platform desktop application framework
- Next.js - React full-stack framework
- Ollama - Local large language model runtime
- ort - ONNX Runtime Rust binding
Thanks to community contributors @ukhack, @duokebei for testing and feedback.
If this project helps you, please give it a Star ⭐
