ChatOllama doesn't accept num_batch parameter #26661
ekinsenler
announced in
Ideas
Replies: 0 comments
Feature request

`ChatOllama()` doesn't parse the `num_batch` parameter, which is supported by ollama.

Motivation
To be able to utilize all available memory.
Proposal (If applicable)
No response
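As a possible interim workaround (a sketch, not an official API), `num_batch` is a valid Ollama runtime option, so it could be passed through the `options` dict that the `ollama` Python client accepts. The `build_ollama_options` helper below is hypothetical and purely illustrative:

```python
# Hypothetical helper: assemble the runtime options Ollama accepts at
# inference time. `num_batch` and `num_ctx` are real Ollama option names;
# the helper itself is illustrative, not part of langchain or ollama.
def build_ollama_options(num_batch: int = 512, num_ctx: int = 2048) -> dict:
    """Build the per-request options payload for the Ollama API."""
    return {"num_batch": num_batch, "num_ctx": num_ctx}


options = build_ollama_options(num_batch=1024)
print(options)

# With a running Ollama server, the options could then be forwarded
# via the ollama Python client, e.g.:
#   import ollama
#   ollama.chat(model="llama3", messages=[...], options=options)
```

This bypasses `ChatOllama` entirely; the actual feature request is for `ChatOllama()` itself to accept and forward `num_batch` the same way.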