https://docs.continue.dev/customize/deep-dives/autocomplete Also, if I already have the 30B model installed and running, can I just use that instead of the 4B model?
Replies: 1 comment
I just checked the Ollama desktop app and confirmed the model is available there. My suggestion is to double-check the Ollama docs, but even a simple query to an LLM will give you the answer below.
Yes, you can absolutely use the 30B model instead of the 4B! Here's the tradeoff:
The 30B model will likely give better autocomplete suggestions, while the 4B model will respond faster, and low latency is usually what you want for autocomplete.
```yaml
models:
```
The …
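The reply above is truncated, so for reference, here is a minimal sketch of what such an entry might look like in Continue's `config.yaml`. The model name and tag are assumptions; use whatever `ollama list` shows on your machine:

```yaml
# Sketch of a Continue config.yaml entry pointing autocomplete
# at a locally running 30B Ollama model.
# "qwen3-coder:30b" is an assumed tag; substitute your own.
models:
  - name: Qwen3 Coder 30B
    provider: ollama
    model: qwen3-coder:30b
    roles:
      - autocomplete
```

With a block like this in place, Continue should route autocomplete requests to the 30B model instead of the 4B one, at the cost of higher per-completion latency.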