Langchain is not working for gguf files #1776
Closed
silicology1 started this conversation in General
Replies: 1 comment
-
Seems there was some path error; it's working fine with the following example:

```python
from langchain.llms import GPT4All
from langchain.chains import LLMChain
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain.prompts import PromptTemplate
from pathlib import Path

template = """
Let's think step by step about the question: {question}
"""
prompt = PromptTemplate(template=template, input_variables=["question"])
callbacks = [StreamingStdOutCallbackHandler()]

# local_path = (Path.home()/'.local'/'share'/'nomic.ai'/'GPT4All'/'orca-mini-3b-gguf2-q4_0.gguf')
# print(local_path)

llm = GPT4All(
    streaming=True,
    model="model/orca-mini-3b-gguf2-q4_0.gguf",
    verbose=True,
    callbacks=callbacks,
)
llm_chain = LLMChain(prompt=prompt, llm=llm, verbose=True)

question = "What NFL team won the Super Bowl in the year Justin Bieber was born?"
llm_chain.run(question)
```
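For what it's worth, the "path error" here is likely that `model` was passed as a `pathlib.Path` instead of a plain string: in pydantic v1, a `Path` fails validation for a `str` field, so `"model"` never makes it into `values`, and the `@root_validator` that runs afterwards hits the bare `KeyError: 'model'` seen in the traceback. A minimal sketch of that mechanism (assumption — `FakeGPT4All` is a hypothetical stand-in for LangChain's `GPT4All` wrapper, not the real class):

```python
from pathlib import Path

try:
    from pydantic.v1 import BaseModel, root_validator  # pydantic v2's v1 shim
except ImportError:
    from pydantic import BaseModel, root_validator     # plain pydantic v1

class FakeGPT4All(BaseModel):
    """Hypothetical stand-in for the GPT4All wrapper (illustration only)."""
    model: str  # expects a plain string; pydantic v1 does not coerce Path -> str

    @root_validator()
    def validate_environment(cls, values):
        # Post root validators still run after a field validation failure,
        # but the failed field is absent from `values`, so this raises KeyError.
        full_path = values["model"]
        return values

# A Path object fails str validation, so "model" never reaches `values`:
try:
    FakeGPT4All(model=Path.home() / "orca-mini-3b-gguf2-q4_0.gguf")
except KeyError as e:
    print("KeyError:", e)  # -> KeyError: 'model'

# Passing the path as a string works:
llm = FakeGPT4All(model=str(Path.home() / "orca-mini-3b-gguf2-q4_0.gguf"))
```

So `GPT4All(model=str(local_path), ...)` should also work with the commented-out `local_path` above.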
-
Here are the docs for GPT4All in LangChain.
Are there any updated docs for using new models in LangChain?
It gives this error:
```
/home/amiya/.local/share/nomic.ai/GPT4All/orca-mini-3b-gguf2-q4_0.gguf
Traceback (most recent call last):
  File "/home/amiya/Documents/workspace/python/pdf-chatbot/src/pdf_chatbot/langchain_gpt4.py", line 21, in <module>
    llm = GPT4All(model=local_path, callbacks=callbacks, verbose=True)
  File "/home/amiya/Documents/workspace/python/pdf-chatbot/.venv/lib/python3.10/site-packages/langchain_core/load/serializable.py", line 107, in __init__
    super().__init__(**kwargs)
  File "/home/amiya/Documents/workspace/python/pdf-chatbot/.venv/lib/python3.10/site-packages/pydantic/v1/main.py", line 339, in __init__
    values, fields_set, validation_error = validate_model(__pydantic_self__.__class__, data)
  File "/home/amiya/Documents/workspace/python/pdf-chatbot/.venv/lib/python3.10/site-packages/pydantic/v1/main.py", line 1102, in validate_model
    values = validator(cls_, values)
  File "/home/amiya/Documents/workspace/python/pdf-chatbot/.venv/lib/python3.10/site-packages/langchain_community/llms/gpt4all.py", line 139, in validate_environment
    full_path = values["model"]
KeyError: 'model'
```