How do I know it is? I'm using Ollama right now.

I just checked the Ollama desktop app and confirmed that it is. My suggestion is to double-check the Ollama docs, but a simple query to an LLM will give you the answer below.

Also, what if I already have the 30b model installed and running, can I just use that instead of the 4b model?

Yes, you can absolutely use the 30B model instead of the 4B! Here's how:
```yaml
models:
  - name: Qwen3 30B without Thinking for Autocomplete
    provider: ollama
    model: qwen3:30b  # or whatever your exact model name is
    roles:
      - autocomplete
    requestOptions:
      extraBodyProperties:
        think: false  # turn off thinking for speed
```
The …
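To see what that config does on the wire, here is a minimal sketch. It assumes (this is my reading of the config, not confirmed from Continue's source) that Continue merges `extraBodyProperties` into the JSON body it POSTs to Ollama's chat endpoint, so the `think: false` flag reaches Ollama and suppresses the thinking step:

```python
import json

# Base request body Continue would build for an autocomplete call
# (field names follow Ollama's /api/chat request shape).
base_body = {
    "model": "qwen3:30b",
    "messages": [{"role": "user", "content": "def fibonacci(n):"}],
    "stream": False,
}

# Taken from requestOptions.extraBodyProperties in the YAML above.
extra_body_properties = {"think": False}

# Assumed merge: extra properties are laid over the base body.
request_body = {**base_body, **extra_body_properties}

print(json.dumps(request_body, indent=2))
```

The merged body carries `"think": false` at the top level, which is where Ollama looks for the thinking toggle on models that support it.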

Answer selected by bdougie
Category: Help
Labels: model:qwen3 (Relates to the Qwen3 model)