@@ -4,7 +4,7 @@ Command-line interface for AI-powered coding tasks.
 
 ## Features
 
-- 🤖 **AI-Powered**: Leverages Anthropic's Claude and OpenAI models for intelligent coding assistance
+- 🤖 **AI-Powered**: Leverages Anthropic's Claude, OpenAI models, and Ollama for intelligent coding assistance
 - 🛠️ **Extensible Tool System**: Modular architecture with various tool categories
 - 🔄 **Parallel Execution**: Ability to spawn sub-agents for concurrent task processing
 - 📝 **Self-Modification**: Can modify its own code; it was built and tested by writing itself
@@ -82,24 +82,23 @@ mycoder config set modelName gpt-4o-2024-05-13
 
 ### Model Selection
 
-MyCoder supports both Anthropic and OpenAI models. You can configure which model to use with the following commands:
+MyCoder supports Anthropic, OpenAI, and Ollama models. You can configure which model provider and model name to use with the following commands:
 
 ```bash
-# Use OpenAI's GPT-4o model
+# Use OpenAI models
 mycoder config set modelProvider openai
-mycoder config set modelName gpt-4o-2024-05-13
-
-# Use OpenAI's o3-mini model
-mycoder config set modelProvider openai
-mycoder config set modelName o3-mini-2024-07-18
+mycoder config set modelName gpt-4o-2024-05-13 # or any other OpenAI model
 
-# Use Anthropic's Claude 3.7 Sonnet model
+# Use Anthropic models
 mycoder config set modelProvider anthropic
-mycoder config set modelName claude-3-7-sonnet-20250219
+mycoder config set modelName claude-3-7-sonnet-20250219 # or any other Anthropic model
 
-# Use Anthropic's Claude 3 Opus model
-mycoder config set modelProvider anthropic
-mycoder config set modelName claude-3-opus-20240229
+# Use Ollama models (local)
+mycoder config set modelProvider ollama
+mycoder config set modelName llama3-groq-tool-use # or any other model available in your Ollama instance
+
+# Configure a custom Ollama server URL (default is http://localhost:11434/api)
+mycoder config set ollamaBaseUrl http://your-ollama-server:11434/api
 ```
 
 You can also specify the model provider and name directly when running a command:
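Since Ollama serves its API over plain HTTP, you can sanity-check an `ollamaBaseUrl` value before pointing MyCoder at it: Ollama's `GET /api/tags` endpoint lists the models available on that instance. A minimal sketch, assuming the default base URL above (the `OLLAMA_BASE_URL` shell variable here is illustrative, not a MyCoder setting):

```bash
#!/bin/sh
# Illustrative variable mirroring mycoder's ollamaBaseUrl default.
OLLAMA_BASE_URL="${OLLAMA_BASE_URL:-http://localhost:11434/api}"

# Ollama lists locally available models at <base>/tags.
TAGS_URL="${OLLAMA_BASE_URL}/tags"
echo "$TAGS_URL"   # http://localhost:11434/api/tags

# With a running Ollama server, this returns JSON describing each model:
#   curl -s "$TAGS_URL"
```

If the `curl` call fails, check that the Ollama server is running and that the host and port in `ollamaBaseUrl` match it.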
@@ -114,6 +113,7 @@ mycoder --modelProvider openai --modelName gpt-4o-2024-05-13 "Your prompt here"
 - `headless`: Run browser in headless mode with no UI showing (default: `true`)
 - `userSession`: Use user's existing browser session instead of sandboxed session (default: `false`)
 - `pageFilter`: Method to process webpage content: 'simple', 'none', or 'readability' (default: `none`)
+- `ollamaBaseUrl`: Base URL for the Ollama API (default: `http://localhost:11434/api`)
 
 Example:
 
@@ -126,13 +126,18 @@ mycoder config set userSession true
 
 # Use readability for webpage processing
 mycoder config set pageFilter readability
+
+# Set a custom Ollama server URL
+mycoder config set ollamaBaseUrl http://your-ollama-server:11434/api
 ```
 
 ## Environment Variables
 
 - `ANTHROPIC_API_KEY`: Your Anthropic API key (required when using Anthropic models)
 - `OPENAI_API_KEY`: Your OpenAI API key (required when using OpenAI models)
 
+Note: Ollama models do not require an API key, as they run locally or on a server you specify.
+
 ## Development
 
 ```bash