Hi, sorry for the totally newb question. I'm running ollama in a docker container. In OWUI, I'd have to set advanced parameters like temp, max_tokens, top_k, and num_ctx to get any decent output from any of the local models. My question is: how do I set those parameters when Perplexica connects directly to ollama? How do I know what the default parameters are when Perplexica connects to a model on ollama?
Model parameters like temperature, max_tokens, top_k, and num_ctx are hardcoded in Perplexica's Ollama provider at `src/lib/models/providers/ollama.ts:93`. The temperature is fixed at 0.7, and other Ollama-specific parameters (`num_ctx`, `top_k`) aren't exposed. To customize these, you must modify the `loadChatModel` method to pass additional options to the `ChatOllama` constructor, such as `{ temperature: 0.7, model: key, baseUrl: this.config.baseURL, numCtx: 4096, topK: 40 }`.
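As a rough sketch of what that edit could look like, here's a hypothetical helper that builds the options object to hand to the `ChatOllama` constructor. The helper name `buildOllamaOptions` and its signature are my own invention (not part of Perplexica); the option names follow LangChain's `ChatOllama` convention, where `numCtx` maps to Ollama's `num_ctx` and `topK` to `top_k`:

```typescript
// Hypothetical helper (not in Perplexica's codebase): assembles the
// options you'd spread into the ChatOllama constructor inside
// loadChatModel. Adjust the values to taste.
interface OllamaChatOptions {
  model: string;
  baseUrl: string;
  temperature: number;
  numCtx: number; // Ollama's num_ctx (context window size)
  topK: number;   // Ollama's top_k (sampling cutoff)
}

function buildOllamaOptions(key: string, baseURL: string): OllamaChatOptions {
  return {
    model: key,
    baseUrl: baseURL,
    temperature: 0.7, // Perplexica's current hardcoded default
    numCtx: 4096,
    topK: 40,
  };
}
```

Inside `loadChatModel` you'd then do something like `new ChatOllama(buildOllamaOptions(key, this.config.baseURL))`. Note these values are baked in at build time, so you'll need to rebuild the container after changing them.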