
Validation error for LLMChain llm value is not a valid dict / Could not infer framework #17

Description

@kripper

When doing this:

from hf_hub_ctranslate2 import GeneratorCT2fromHfHub
from langchain.chains import RetrievalQA

# model_dir, tokenizer, and retriever are defined earlier in the script
model = GeneratorCT2fromHfHub(
    model_name_or_path=model_dir,
    device="cpu",
    compute_type="int8",
    tokenizer=tokenizer,
)
qa = RetrievalQA.from_chain_type(llm=model, chain_type="stuff", retriever=retriever)

I get this error:

Traceback (most recent call last):
  File "ask-doc.py", line 76, in <module>
    qa = RetrievalQA.from_chain_type(llm = model, chain_type = 'stuff', retriever = retriever)
  File "/home/ia/.local/lib/python3.8/site-packages/langchain/chains/retrieval_qa/base.py", line 100, in from_chain_type
    combine_documents_chain = load_qa_chain(
  File "/home/ia/.local/lib/python3.8/site-packages/langchain/chains/question_answering/__init__.py", line 249, in load_qa_chain
    return loader_mapping[chain_type](
  File "/home/ia/.local/lib/python3.8/site-packages/langchain/chains/question_answering/__init__.py", line 73, in _load_stuff_chain
    llm_chain = LLMChain(
  File "/home/ia/.local/lib/python3.8/site-packages/langchain/load/serializable.py", line 74, in __init__
    super().__init__(**kwargs)
  File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for LLMChain
llm
  value is not a valid dict (type=type_error.dict)
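For context, LLMChain's pydantic validation only accepts a LangChain LLM object, so the CT2 generator has to be wrapped rather than passed directly. A minimal sketch of such a wrapper, assuming (as in this repo's README) that GeneratorCT2fromHfHub.generate(text=[...], max_length=...) returns a list of strings; the class name CTranslate2LLM is made up here:

from typing import Any, List, Optional

from langchain.llms.base import LLM

class CTranslate2LLM(LLM):
    """Hypothetical LangChain wrapper around a GeneratorCT2fromHfHub instance."""

    ct2_model: Any  # the GeneratorCT2fromHfHub object created above

    @property
    def _llm_type(self) -> str:
        return "ctranslate2"

    def _call(self, prompt: str, stop: Optional[List[str]] = None) -> str:
        # Assumption: generate() takes a batch of prompts and returns a
        # list of generated strings; max_length=256 is an arbitrary choice.
        return self.ct2_model.generate(text=[prompt], max_length=256)[0]

llm = CTranslate2LLM(ct2_model=model)
qa = RetrievalQA.from_chain_type(llm=llm, chain_type="stuff", retriever=retriever)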

When trying with a transformers pipeline instead:

from transformers import pipeline
from langchain.llms import HuggingFacePipeline

pipe = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    max_length=256,
    temperature=0.7,
    top_p=0.95,
    repetition_penalty=1.15,
)
local_llm = HuggingFacePipeline(pipeline=pipe)

# Create QA chain
qa = RetrievalQA.from_chain_type(llm=local_llm, chain_type="stuff", retriever=retriever)

I get this:

Traceback (most recent call last):
  File "ask-doc.py", line 77, in <module>
    pipe = pipeline("text-generation", model=model, tokenizer=tokenizer, max_length=256, temperature=0.7, top_p=0.95, repetition_penalty=1.15)
  File "/home/ia/.local/lib/python3.8/site-packages/transformers/pipelines/__init__.py", line 788, in pipeline
    framework, model = infer_framework_load_model(
  File "/home/ia/.local/lib/python3.8/site-packages/transformers/pipelines/base.py", line 281, in infer_framework_load_model
    framework = infer_framework(model.__class__)
  File "/home/ia/.local/lib/python3.8/site-packages/transformers/utils/generic.py", line 583, in infer_framework
    raise TypeError(f"Could not infer framework from class {model_class}.")
TypeError: Could not infer framework from class <class 'hf_hub_ctranslate2.translate.GeneratorCT2fromHfHub'>.
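This second error happens because transformers' infer_framework walks the class's MRO looking for a PyTorch, TensorFlow, or Flax base class, and GeneratorCT2fromHfHub wraps a CTranslate2 generator that inherits from none of them. A small diagnostic sketch to confirm this:

import inspect
from hf_hub_ctranslate2.translate import GeneratorCT2fromHfHub

# infer_framework checks each base class's module/name for torch,
# tensorflow/keras, or flax/jax markers; nothing here matches, so it
# raises the TypeError seen above.
for base in inspect.getmro(GeneratorCT2fromHfHub):
    print(f"{base.__module__}.{base.__name__}")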
