Issue
Running inference_transformers() exactly as shown in the LLMSQL documentation raises a TypeError when the model_args and generate_kwargs parameters are passed.
from llmsql import inference_transformers

# Run generation directly with transformers
results = inference_transformers(
    model_or_model_name_or_path="Qwen/Qwen2.5-1.5B-Instruct",
    output_file="path_to_your_outputs.jsonl",
    num_fewshots=5,
    batch_size=8,
    max_new_tokens=256,
    do_sample=False,
    model_args={
        "torch_dtype": "bfloat16",
    },
    generate_kwargs={
        "do_sample": False,
    },
)
Error
TypeError: inference_transformers() got an unexpected keyword argument 'model_args'
The error disappears when both model_args and generate_kwargs are removed, and inference then runs successfully.
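For anyone hitting the same mismatch, the behavior can be reproduced with any function whose signature lacks the documented keyword parameters, and inspect.signature is a quick way to check which keyword arguments a callable actually accepts. The sketch below uses a stand-in function, not the real LLMSQL signature:

```python
import inspect

# Stand-in for inference_transformers() -- NOT the real LLMSQL signature,
# just a function that lacks the documented model_args/generate_kwargs
# parameters, mirroring the reported behavior.
def inference_transformers(model_or_model_name_or_path, output_file,
                           num_fewshots=5, batch_size=8,
                           max_new_tokens=256, do_sample=False):
    return "ok"

# Passing an undeclared keyword raises the same TypeError as in the report.
try:
    inference_transformers("model", "out.jsonl", model_args={})
except TypeError as exc:
    print(exc)  # ... got an unexpected keyword argument 'model_args'

# inspect.signature lists the parameters the callable actually accepts,
# which helps reconcile the installed version against the documentation.
print(sorted(inspect.signature(inference_transformers).parameters))
```

Running the same inspect.signature check against the installed llmsql package would confirm which arguments the released version supports.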
Expected behavior
inference_transformers() should accept model_args and generate_kwargs as shown in the documentation, or the documentation should be updated to reflect the actual supported arguments.