
Model parameters like temperature, max_tokens, top_k, and num_ctx are hardcoded in Perplexica's Ollama provider at src/lib/models/providers/ollama.ts:93. The temperature is fixed at 0.7, and other Ollama-specific parameters (num_ctx, top_k) aren't exposed. To customize these, you must modify the loadChatModel method to pass additional options to the ChatOllama constructor, such as { temperature: 0.7, model: key, baseUrl: this.config.baseURL, numCtx: 4096, topK: 40 }.
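To illustrate the change, here is a hedged sketch of how those constructor options could be assembled before being passed to `ChatOllama`. The option names (`model`, `baseUrl`, `temperature`, `numCtx`, `topK`) follow the `@langchain/ollama` `ChatOllama` options shape; the `buildOllamaOptions` helper and the `OllamaChatOptions` interface are hypothetical names introduced for this example, not code that exists in Perplexica:

```typescript
// Hypothetical helper sketching the options object that loadChatModel in
// src/lib/models/providers/ollama.ts could pass to `new ChatOllama(...)`.
interface OllamaChatOptions {
  model: string;       // model key, e.g. "llama3"
  baseUrl: string;     // Ollama server URL from the provider config
  temperature: number; // currently hardcoded to 0.7 in the provider
  numCtx?: number;     // Ollama-specific: context window size
  topK?: number;       // Ollama-specific: top-k sampling
}

function buildOllamaOptions(key: string, baseURL: string): OllamaChatOptions {
  return {
    model: key,
    baseUrl: baseURL,
    temperature: 0.7,
    numCtx: 4096,
    topK: 40,
  };
}

// Inside loadChatModel, the call would then become something like:
//   return new ChatOllama(buildOllamaOptions(key, this.config.baseURL));
```

A further step, if you want these values configurable rather than hardcoded, would be to read them from Perplexica's settings or environment instead of the literals above.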

Answer selected by stuckwi
Category: Q&A