1 parent bf21e2b commit 756dd9f
docs/advanced-usage/local-models.md
@@ -78,7 +78,7 @@ Roo Code currently supports two main local model providers:
 * Select "ollama" as the API Provider.
 * Enter the Model name from the previous step (e.g., `your_model_name`).
 * (Optional) You can configure the base URL if you're running Ollama on a different machine. The default is `http://localhost:11434`.
-* (Optional) Configure Model context size in Advance settings, so Roo Code knows how to manage its sliding window.
+* (Optional) Configure Model context size in Advanced settings, so Roo Code knows how to manage its sliding window.

 ## Setting Up LM Studio
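The touched bullets point users at the default Ollama base URL and at entering a model name from an earlier step. As a quick sanity check (not part of this commit), here is a minimal Python sketch that queries Ollama's standard `/api/tags` endpoint to confirm the server is reachable at that base URL and to list the model names you could enter in Roo Code. The `OLLAMA_BASE_URL` constant and the `list_ollama_models` helper are illustrative names, not anything from Roo Code or this diff.

```python
# Sketch: verify a local Ollama server is reachable and list its model names.
# Assumes the default base URL from the docs above; change OLLAMA_BASE_URL
# if Ollama runs on another machine.
import json
import urllib.request

OLLAMA_BASE_URL = "http://localhost:11434"  # default mentioned in the docs

def list_ollama_models(base_url: str = OLLAMA_BASE_URL) -> list[str]:
    """Return model names known to the Ollama server via GET /api/tags."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        payload = json.load(resp)
    return [model["name"] for model in payload.get("models", [])]

if __name__ == "__main__":
    for name in list_ollama_models():
        # e.g. "llama3:latest" -- usable as the Model name in Roo Code
        print(name)
```

If this request fails, Roo Code's Ollama provider would fail for the same reason, so it is a reasonable first debugging step before adjusting the Advanced settings the fixed bullet refers to.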