First off, thank you for considering contributing to Jan. It's people like you that make Jan such an amazing project.
Jan is an AI assistant that can run 100% offline on your device. Think ChatGPT, but private, local, and under your complete control. If you're thinking about contributing, you're already awesome - let's make AI accessible to everyone, one commit at a time.
- Web App - React UI and logic
- Core SDK - TypeScript SDK and extension system
- Extensions - Supporting modules for the frontend
- Tauri Backend - Rust native integration
- Tauri Plugins - Hardware and system plugins
Jan is a desktop app that runs local AI models. Here's how the components actually connect:
```
┌──────────────────────────────────────────────────────────┐
│                    Web App (Frontend)                    │
│                        (web-app/)                        │
│ • React UI                                               │
│ • Chat Interface                                         │
│ • Settings Pages                                         │
│ • Model Hub                                              │
└────────────┬─────────────────────────────┬───────────────┘
             │                             │
             │ imports                     │ imports
             ▼                             ▼
┌──────────────────────┐      ┌──────────────────────┐
│       Core SDK       │      │      Extensions      │
│       (core/)        │      │    (extensions/)     │
│                      │      │                      │
│ • TypeScript APIs    │◄─────│ • Assistant Mgmt     │
│ • Extension System   │ uses │ • Conversations      │
│ • Event Bus          │      │ • Downloads          │
│ • Type Definitions   │      │ • LlamaCPP           │
└──────────┬───────────┘      └───────────┬──────────┘
           │                              │
           │   ┌──────────────────────┐   │
           │   │       Web App        │   │
           │   └──────────┬───────────┘   │
           │              │               │
           └──────────────┼───────────────┘
                          │
                          ▼
                      Tauri IPC
                  (invoke commands)
                          │
                          ▼
┌───────────────────────────────────────────────────────────┐
│                   Tauri Backend (Rust)                    │
│                       (src-tauri/)                        │
│                                                           │
│ • Window Management      • File System Access             │
│ • Process Control        • System Integration             │
│ • IPC Command Handler    • Security & Permissions         │
└─────────────────────────┬─────────────────────────────────┘
                          │
                          ▼
┌───────────────────────────────────────────────────────────┐
│                   Tauri Plugins (Rust)                    │
│                  (src-tauri/plugins/)                     │
│                                                           │
│  ┌──────────────────┐        ┌──────────────────┐         │
│  │ Hardware Plugin  │        │ LlamaCPP Plugin  │         │
│  │                  │        │                  │         │
│  │ • CPU/GPU Info   │        │ • Process Mgmt   │         │
│  │ • Memory Stats   │        │ • Model Loading  │         │
│  │ • System Info    │        │ • Inference      │         │
│  └──────────────────┘        └──────────────────┘         │
└───────────────────────────────────────────────────────────┘
```
JavaScript Layer Relationships:
- Web App imports Core SDK and Extensions as JavaScript modules
- Extensions use Core SDK for shared functionality
- All run in the browser/webview context
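The layering above can be sketched in a few lines. This is illustrative only: `BaseExtension` and the class names here are hypothetical, not the actual Core SDK API.

```typescript
// Hypothetical sketch of the layering described above; names are
// illustrative, not the actual Core SDK API.
abstract class BaseExtension {
  // Called by the Web App when the extension is activated.
  abstract onLoad(): void;
}

// An extension builds on the shared base class from the Core SDK...
class ConversationalExtension extends BaseExtension {
  onLoad(): void {
    console.log('conversational extension loaded');
  }
}

// ...and the Web App imports and activates it as a plain JS module,
// all inside the same browser/webview context.
const extensions: BaseExtension[] = [new ConversationalExtension()];
for (const ext of extensions) ext.onLoad();
```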
All Three → Backend: Through Tauri IPC
- Web App → Backend: `await invoke('app_command', data)`
- Core SDK → Backend: `await invoke('core_command', data)`
- Extensions → Backend: `await invoke('ext_command', data)`
- Each component can independently call backend commands
Backend → Plugins: Native Rust integration
- Backend loads plugins as Rust libraries
- Direct function calls, no IPC overhead
Response Flow:
- Plugin → Backend → IPC → Requester (Web App/Core/Extension) → UI updates
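A minimal sketch of that round trip. The command name and payload shape are illustrative, not Jan's actual API, and `invoke` is stubbed so the snippet is self-contained (in the app it comes from `@tauri-apps/api`).

```typescript
// Sketch of the IPC round trip described above. The command name and
// payload shape are illustrative, not Jan's actual API.
type InvokeFn = (cmd: string, args?: Record<string, unknown>) => Promise<unknown>;

// Stub standing in for Tauri's `invoke`; the real one crosses the IPC
// boundary into the Rust backend.
const invoke: InvokeFn = async (cmd, args) => ({ cmd, ok: true, args });

// Frontend → Backend → Frontend: one awaited IPC round trip.
async function loadModel(modelId: string): Promise<unknown> {
  return invoke('load_model', { modelId });
}

loadModel('llama-3-8b').then((result) => console.log(result));
```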
Here's what actually happens when you click "Download Llama 3":
1. Web App (`web-app/`) - User clicks download button
2. Extension (`extensions/download-extension`) - Handles the download logic
3. Tauri Backend (`src-tauri/`) - Actually downloads the file to disk
4. Extension (`extensions/llamacpp-extension`) - Prepares model for loading
5. Tauri Plugin (`src-tauri/plugins/llamacpp`) - Starts llama.cpp process
6. Hardware Plugin (`src-tauri/plugins/hardware`) - Detects GPU, optimizes settings
7. Model ready! - User can start chatting
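Those hand-offs are easier to see as events. A minimal typed event-bus sketch of the pattern, with event names and payload shapes that are illustrative only, not Jan's actual event API:

```typescript
// Minimal typed event-bus sketch of the hand-offs above.
// Event names and payload shapes are illustrative, not Jan's actual API.
type Events = {
  downloadProgress: { modelId: string; percent: number };
  modelReady: { modelId: string };
};

class EventBus {
  private handlers = new Map<keyof Events, Array<(payload: any) => void>>();

  on<K extends keyof Events>(event: K, handler: (payload: Events[K]) => void): void {
    const list = this.handlers.get(event) ?? [];
    list.push(handler as (payload: any) => void);
    this.handlers.set(event, list);
  }

  emit<K extends keyof Events>(event: K, payload: Events[K]): void {
    for (const handler of this.handlers.get(event) ?? []) handler(payload);
  }
}

const bus = new EventBus();

// Web App: update the UI as the extension reports progress.
bus.on('downloadProgress', ({ modelId, percent }) =>
  console.log(`${modelId}: ${percent}%`),
);
bus.on('modelReady', ({ modelId }) => console.log(`${modelId} is ready`));

// Extension: report progress while the backend writes the file to disk.
bus.emit('downloadProgress', { modelId: 'llama-3', percent: 50 });
bus.emit('downloadProgress', { modelId: 'llama-3', percent: 100 });
bus.emit('modelReady', { modelId: 'llama-3' });
```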
```
jan/
├── web-app/                  # React frontend (what users see)
├── src-tauri/                # Rust backend (system integration)
│   ├── src/core/             # Core Tauri commands
│   └── plugins/              # Tauri plugins (hardware, llamacpp)
├── core/                     # TypeScript SDK (API layer)
├── extensions/               # JavaScript extensions
│   ├── assistant-extension/
│   ├── conversational-extension/
│   ├── download-extension/
│   └── llamacpp-extension/
├── docs/                     # Documentation website
├── website/                  # Marketing website
├── autoqa/                   # Automated testing
├── scripts/                  # Build utilities
│
├── package.json              # Root workspace configuration
├── Makefile                  # Build automation commands
├── LICENSE                   # Apache 2.0 license
└── README.md                 # Project overview
```
Prerequisites:
- Node.js ≥ 20.0.0
- Yarn ≥ 1.22.0
- Rust (for Tauri)
- Make ≥ 3.81
Option 1: The Easy Way (Make)
```bash
git clone https://github.com/janhq/jan
cd jan
make dev
```
- Ensure the bug was not already reported by searching on GitHub under Issues
- If you're unable to find an open issue addressing the problem, open a new one
- Include your system specs and error logs - it helps a ton
- Open a new issue with a clear title and description
- Explain why this enhancement would be useful
- Include mockups or examples if you can
Choose Your Adventure:
- Frontend UI and logic → `web-app/`
- Shared API declarations → `core/`
- Backend system integration → `src-tauri/`
- Business logic features → `extensions/`
- Dedicated backend handler → `src-tauri/plugins/`
The Process:
- Fork the repo
- Create a new branch (`git checkout -b feature-name`)
- Make your changes (and write tests!)
- Commit your changes (`git commit -am 'Add some feature'`)
- Push to the branch (`git push origin feature-name`)
- Open a new Pull Request against the `dev` branch
```bash
yarn test                     # All tests
cd src-tauri && cargo test    # Rust tests
cd autoqa && python main.py   # End-to-end tests
```
TypeScript:
- TypeScript required (we're not animals)
- ESLint + Prettier
- Functional React components
- Proper typing (no `any` - seriously!)

Rust:
- `cargo fmt` + `cargo clippy`
- `Result<T, E>` for error handling
- Document public APIs

Branches:
- `main` - stable releases
- `dev` - development (target this for PRs)
- `feature/*` - new features
- `fix/*` - bug fixes
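As a small illustration of the "no `any`" rule, here is a hedged sketch; the message shape is hypothetical, not Jan's actual data model:

```typescript
// Illustrative only: prefer precise types over `any`.
// `ChatMessage` is a hypothetical shape, not Jan's actual data model.
interface ChatMessage {
  id: string;
  role: 'user' | 'assistant';
  content: string;
}

// Typed in and out: the compiler catches bad roles and missing fields.
function lastAssistantReply(messages: ChatMessage[]): ChatMessage | undefined {
  return [...messages].reverse().find((m) => m.role === 'assistant');
}

const history: ChatMessage[] = [
  { id: '1', role: 'user', content: 'Hi' },
  { id: '2', role: 'assistant', content: 'Hello!' },
];
console.log(lastAssistantReply(history)?.content); // "Hello!"
```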
- Use the present tense ("Add feature" not "Added feature")
- Be descriptive but concise
- Reference issues when applicable
Examples:
feat: add support for Qwen models
fix: resolve memory leak in model loading
docs: update installation instructions
If things go sideways:
- Check our troubleshooting docs
- Clear everything and start fresh: `make clean` then `make dev`
- Copy your error logs and system specs
- Ask for help in our Discord `#🆘|jan-help` channel
Common issues:
- Build failures: Check Node.js and Rust versions
- Extension not loading: Verify it's properly registered
- Model not working: Check hardware requirements and GPU drivers
- Documentation - The manual you should read
- Discord Community - Where the community lives
- GitHub Issues - Report bugs here
- GitHub Discussions - Ask questions
Apache 2.0 - Because sharing is caring. See LICENSE for the legal stuff.
We're building something pretty cool here - an AI assistant that respects your privacy and runs entirely on your machine. Every contribution, no matter how small, helps make AI more accessible to everyone.
Thanks for being part of the journey. Let's build the future of local AI together! 🚀