An agentic coding assistant for Ollama - like Claude Code, but running locally!
OllamaCoder transforms your local Ollama models into a powerful autonomous coding agent with tool use, multi-step task execution, and project-aware context.
- Agentic Architecture: Autonomous multi-step task execution with planning, execution, and verification
- Safety Hooks: Blocks dangerous bash commands (`rm -rf /`, `sudo`, fork bombs, etc.)
- Full Tool System: bash, file operations, git, code search, web search, and more
- Think Tool: Structured reasoning for complex problems (like extended thinking)
- Multi-Edit: Batch file edits in a single operation
- Autonomous Mode: Let the AI work through complex tasks independently
- Streaming Responses: Real-time token streaming for better UX
- Vision/Image Support: Analyze images with multimodal models
- Context Management: Automatic conversation summarization to stay within context limits
- Hierarchical Config: User- and project-level settings with OLLAMA.md context files
- Remote Ollama: Connect to remote Ollama servers via `OLLAMA_HOST`
- Session Persistence: SQLite-backed session storage with full-text search
- Custom Commands: Create your own slash commands in markdown
- Subagents: Spawn specialized AI agents for focused tasks
- Skills: Progressive expertise loading based on keywords
- Rich Output: Beautiful terminal output with syntax highlighting (optional)
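The custom-commands feature maps slash commands to markdown files you write yourself. The exact file location and layout are not documented in this section, so everything below (the `.ollamacode/commands/` path and the file body) is a hypothetical sketch of the idea, not the actual schema:

```markdown
<!-- Hypothetical path: .ollamacode/commands/review.md -->
Review the currently staged changes for bugs, style issues, and missing
tests, then summarize the findings as a bulleted list ordered by severity.
```

A file like this would then be invoked as `/review` from within a session.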
Install via Homebrew:

```bash
brew tap lalomorales22/ollama-coder
brew install ollama-coder
```

Or via pip:

```bash
pip install ollama-coder
```

For enhanced terminal output with streaming (recommended):

```bash
pip install ollama-coder[rich]
# or
pip install ollama-coder rich
```

Or from source:

```bash
git clone https://github.com/lalomorales22/ollama-coder.git
cd ollama-coder
pip install -e .
```

Basic usage:

```bash
ollama-coder                              # interactive session
ollama-coder --auto                       # start in autonomous mode
ollama-coder -p "fix the bug in app.py"   # one-shot prompt
ollama-coder --model gpt-oss:20b          # pick a specific model
ollama-coder --dir /path/to/project      # work in another directory
```
```bash
ollama-coder --headless -p "fix lint errors"
ollama-coder --headless --output json -p "analyze code"
ollama-coder --headless --no-write --max-tools 10 --timeout 120 -p "review"
ollama-coder --bash-timeout 600 -p "create a react app"
```

Exit codes:

- `0` - Success
- `1` - Failure/Error
- `2` - Needs human intervention
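In CI, these exit codes let a wrapper script decide what to do next. A minimal sketch of that control flow; the stubbed `run_task` (which just exits 2) stands in for a real `ollama-coder --headless` invocation so the snippet runs anywhere, even without the tool installed:

```shell
#!/bin/sh
# Stand-in for a real headless run such as:
#   ollama-coder --headless -p "fix lint errors"
# Stubbed with a fixed exit code so the control flow is runnable anywhere.
run_task() {
  sh -c 'exit 2'
}

if run_task; then
  echo "task succeeded"
else
  status=$?
  case $status in
    1) echo "task failed" ;;
    2) echo "task needs human intervention" ;;
    *) echo "unexpected exit code: $status" ;;
  esac
fi
# prints: task needs human intervention
```

Checking the exit status inside `if`/`case` rather than bare keeps the script safe under `set -e`, which is common in CI shells.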
| Tool | Description |
|---|---|
| `think` | Structured reasoning for complex problems |
| `bash` | Execute shell commands |
| `read_file` | Read file contents with optional line ranges |
| `write_file` | Create or overwrite files |
| `edit_file` | Make surgical edits using find/replace |
| `multi_edit` | Batch multiple edits in one operation |
| `list_directory` | Explore project structure |
| `search_code` | Search for patterns using grep/ripgrep |
| `glob` | Find files matching glob patterns *(new)* |
| `grep` | Regex search with context lines *(new)* |
| `fetch_url` | Fetch and parse web content *(new)* |
| `screenshot` | Browser screenshots (requires playwright) *(new)* |
| `git` | Version control operations |
| `web_search` | Search the web (when configured) |
| Command | Description |
|---|---|
| `/auto` | Toggle autonomous mode |
| `/sessions` | List recent sessions |
| `/resume` | Resume session: `/resume [id]` |
| `/search` | Search sessions: `/search <query>` |
| `/session` | Session info/actions: `/session title` |
| `/branch` | Branch current session |
| `/new` | Start new session |
| `/commands` | List custom commands |
| `/subagents` | List available subagents |
| `/skills` | List and manage skills |
| `/model` | Show or set the active model |
| `/models` | List installed Ollama models |
| `/streaming` | Toggle streaming responses |
| `/image` | Attach image: `/image <path> <message>` |
| `/context` | Show context usage stats |
| `/config` | Show current configuration |
| `/clear` | Clear conversation history |
| `/help` | Show available commands |
| `/quit` | Exit OllamaCoder |
User config: `~/.ollamacode/settings.json`
```json
{
  "model": "gpt-oss:20b-cloud",
  "max_iterations": 25,
  "max_tool_rounds": 8,
  "temperature": 0.7,
  "streaming": true,
  "bash": {
    "timeout_sec": 300,
    "long_running_timeout_sec": 600
  },
  "vision": {
    "enabled": true,
    "max_image_size": 4194304
  },
  "context_management": {
    "enabled": true,
    "summarize_threshold": 0.75,
    "keep_recent_messages": 10
  },
  "ollama": {
    "host": "http://127.0.0.1:11434",
    "timeout_sec": 300
  },
  "web_search": {
    "enabled": false,
    "provider": "custom",
    "endpoint": "",
    "api_key": ""
  }
}
```
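Because the config is hierarchical, a project can override just the keys it needs and inherit the rest from the user config. A sketch of a minimal project-level override, assuming the project file lives at `.ollamacode/settings.json` (mirroring the `.ollamacode/` directory used for project context); the model name is just an example:

```json
{
  "model": "qwen2.5-coder:7b",
  "temperature": 0.2,
  "bash": { "timeout_sec": 120 }
}
```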
Create `OLLAMA.md` files to provide project-specific context:
- `~/.ollamacode/OLLAMA.md` - User-level context (applies to all projects)
- `.ollamacode/OLLAMA.md` - Project-level context (in your project root)
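An OLLAMA.md file is free-form markdown injected as context, so anything goes; the project details below are invented placeholders, shown only to illustrate the kind of conventions worth recording:

```markdown
# Project context

- Python 3.11 FastAPI service; entry point is `app/main.py`
- Run `pytest -q` before considering a task done
- Format with Black, line length 100
- Never commit directly to `main`
```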
Connect to remote Ollama servers:
```bash
# Option 1: set OLLAMA_HOST before launching
export OLLAMA_HOST=http://your-server:11434
ollama-coder
```

```bash
# Option 2: switch hosts from inside a session
ollama-coder
/host http://your-server:11434
```
- Python 3.9+
- Ollama server running locally or accessible remotely
- Optional: `rich` package for enhanced terminal output
| Feature | OllamaCoder | Claude Code |
|---|---|---|
| Local/Private | ✅ | ❌ |
| Free | ✅ | ❌ |
| Tool Use | ✅ | ✅ |
| Autonomous Mode | ✅ | ✅ |
| Thinking Tool | ✅ | ✅ |
| Multi-Edit | ✅ | ✅ |
| Streaming | ✅ | ✅ |
| Image Analysis | ✅ | ✅ |
| Context Management | ✅ | ✅ |
| Web Search | ✅ | ✅ |
| Project Context | ✅ | ✅ |
| Session Persistence | ✅ | ✅ |
| Safety Hooks | ✅ | ✅ |
| Headless/CI Mode | ✅ | ✅ |
| Subagents | ✅ | ✅ |
| Custom Commands | ✅ | ✅ |
| Skills System | ✅ | ✅ |
| MCP Support | Planned | ✅ |
MIT
- GitHub: https://github.com/lalomorales22/ollama-coder
- PyPI: https://pypi.org/project/ollama-coder/
- Ollama: https://ollama.ai/
Version bumping and publishing is automated via GitHub Actions. Just:
- Bump the version in `pyproject.toml` and `ollama_coder/__init__.py`
- Commit and push
- Create and push a tag:

```bash
git tag v0.2.7
git push origin v0.2.7
```
Current Version: 0.2.7