⚠️ This project is an early prototype. All APIs are subject to change.
An extensible, AI-powered, command-line coding assistant.
Matt Code is built on oclif, which lets it use oclif's plugin system to add functionality to the application.
Matt Code extends the concept of an oclif plugin by allowing plugins to define providers. Plugins implement these providers to add functionality to Matt Code.
For example, the ToolProvider allows a plugin to define tools that the AI may call. One such example is the just-bash plugin, which exposes Vercel's just-bash to give the Matt Code agent access to the filesystem.
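As a rough sketch of the idea (the actual provider interfaces live in this repo and may look different), a plugin-side tool provider could be shaped roughly like this. The `ToolProvider` and `Tool` shapes and the `read_file` tool below are assumptions made for illustration, not Matt Code's real API:

```ts
// Illustrative sketch only: ToolProvider and Tool are assumed shapes,
// not Matt Code's actual API. The idea is that a plugin exports tools
// the AI can call, each with a name, a parameter schema, and an executor.
interface Tool {
  name: string;
  description: string;
  parameters: Record<string, unknown>; // JSON-Schema-style parameter description
  execute(args: Record<string, unknown>): Promise<string>;
}

interface ToolProvider {
  tools(): Tool[];
}

// A hypothetical provider exposing a single read-only filesystem tool.
export const provider: ToolProvider = {
  tools() {
    return [
      {
        name: 'read_file',
        description: 'Read a UTF-8 text file from the workspace',
        parameters: {
          type: 'object',
          properties: { path: { type: 'string' } },
          required: ['path'],
        },
        async execute(args) {
          const { readFile } = await import('node:fs/promises');
          return readFile(String(args.path), 'utf8');
        },
      },
    ];
  },
};
```

The host application would then surface these tool definitions to the model and dispatch the model's tool calls to each tool's `execute` function.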
- Start the local Ollama server so the CLI can reach it at http://localhost:11434 (a quick reachability check is sketched just after this list):

  ```sh
  ollama serve
  ```

- Pull Qwen3-Coder:

  ```sh
  ollama pull qwen3-coder:30b
  ```

- Run the model:

  ```sh
  ollama run qwen3-coder:30b
  ```
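If you want to confirm the server is actually reachable before wiring up the CLI, a quick standalone check is sketched below; it is not part of Matt Code and only assumes Node 18+ for the built-in `fetch`:

```ts
// Standalone sanity check, not part of Matt Code: confirm the local Ollama
// server answers on its default port before configuring the CLI against it.
async function checkOllama(): Promise<void> {
  const res = await fetch('http://localhost:11434');
  // A running Ollama server responds to a plain GET on the root path.
  console.log(res.ok ? await res.text() : `Unexpected status: ${res.status}`);
}

checkOllama().catch(() =>
  console.error('Ollama is not reachable at http://localhost:11434'),
);
```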
Build the project:

```sh
npm run build
```

The OpenAI client is built into the CLI. This can be used to connect to Ollama.
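Under the hood this is just Ollama's OpenAI-compatible endpoint under `/v1`. As a point of reference (outside Matt Code), a minimal standalone call with the `openai` npm package against that endpoint looks roughly like this; the actual wiring inside Matt Code happens through the `clients create` command shown below:

```ts
import OpenAI from 'openai';

// Standalone reference sketch: talk to Ollama's OpenAI-compatible endpoint
// directly. Matt Code configures its built-in client via `clients create`.
const client = new OpenAI({
  apiKey: 'ollama', // Ollama ignores the key, but the client requires one
  baseURL: 'http://localhost:11434/v1',
});

async function main(): Promise<void> {
  const completion = await client.chat.completions.create({
    model: 'qwen3-coder:30b',
    messages: [{ role: 'user', content: 'Say hello in one sentence.' }],
  });
  console.log(completion.choices[0]?.message.content);
}

main().catch(console.error);
```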
Other clients can be added via plugins. For example, Gemini support can be added by installing the Google GenAI plugin:
```sh
packages/cli/bin/run.js plugins link $PWD/packages/plugins/google-genai
```
Create a client that uses the built-in OpenAI client to talk to the local Ollama server:

```sh
packages/cli/bin/run.js clients create ollama-openai \
--type='openai:/v1/chat/completions' \
--options='{"apiKey": "ollama", "baseURL": "http://localhost:11434/v1"}'
```

Then run the CLI:

```sh
packages/cli/bin/run.js
```

The Gemini CLI has been used both as a reference while writing Matt Code and as a tool to help write Matt Code. Many thanks to Google for making this project open source and for providing a free tier.
If you find yourself inspired by the ideas in this project, an attribution would be much appreciated.
