Using Ollama with LLMChain reports an error in error_wrappers #23401
H9990HH969 started this conversation in General
I want to use an Ollama LLM deployed on another server. The code below works fine when calling Ollama directly, but an error occurs when the model is used through LLMChain.
code:

```python
from langchain.chains import LLMChain
from langchain_community.llms import Ollama

model = Ollama(
    base_url='http://XXXXXXX:11434',
    model="qwen:32b",
)

print(model.invoke("tell a joke"))  # pass

chain = LLMChain(llm=model)
print(chain("tell a joke"))  # error
error:

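For reference, `error_wrappers` is the pydantic v1 module that raises `ValidationError`, and `LLMChain` declares both `llm` and `prompt` as required fields, so `LLMChain(llm=model)` fails validation before the model is ever called. Below is a minimal sketch of the same chain with an explicit `PromptTemplate`; the `{query}` template and variable name are illustrative, not from the original post.

```python
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate
from langchain_community.llms import Ollama

# Same remote Ollama deployment as above (host redacted).
model = Ollama(
    base_url='http://XXXXXXX:11434',
    model="qwen:32b",
)

# LLMChain is a pydantic model with both `llm` and `prompt` as required
# fields; omitting `prompt` raises a ValidationError from pydantic's
# error_wrappers module at construction time.
prompt = PromptTemplate.from_template("{query}")
chain = LLMChain(llm=model, prompt=prompt)

# invoke() returns a dict containing the inputs plus the "text" output.
print(chain.invoke({"query": "tell a joke"}))
```

Recent LangChain releases also deprecate `LLMChain` in favor of composing the prompt and model directly (`prompt | model`), which avoids this constructor and its validation error entirely.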