Description
The code works properly with standard Qwen3 Q4 MLX models, but switching to the Q4 DWQ variants produces errors. I've tested several models (qwen3-4b-dwq-053125, qwen3-8b-dwq-053125, and qwen3-14b-dwq-053125) and all of them fail with:
Traceback (most recent call last):
File "/Users/Larkin/workspace/gitee/ai/lmstudio-json.py", line 17, in <module>
result = model.respond("/no_think\nTell me about The Hobbit", response_format=BookSchema)
lmstudio.LMStudioServerError: Chat response error: Error rendering prompt with jinja template: "Parser Error: Expected closing statement token. OpenSquareBracket !== CloseStatement.".
This is usually an issue with the model's prompt template. If you are using a popular model, you can try to search the model under lmstudio-community, which will have fixed prompt templates. If you cannot find one, you are welcome to post this issue to our discord or issue tracker on GitHub. Alternatively, if you know how to write jinja templates, you can override the prompt template in My Models > model settings > Prompt Template.
The script:
"""Example script demonstrating an interactive LLM chatbot."""
import json
import lmstudio as lms
class BookSchema(lms.BaseModel):
"""Structured information about a published book."""
title: str
author: str
year: int
# model = lms.llm("qwen3-8b-mlx") # runs well
model = lms.llm("qwen3-8b-dwq-053125") # runs with error
# # add /no_think to the prompt to disable qwen3 thinking
result = model.respond("/no_think\nTell me about The Hobbit", response_format=BookSchema)
book = result.parsed
print(json.dumps(book, indent=2))
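For reference, when the working (non-DWQ) model runs, `result.parsed` comes back as a plain dict matching the schema fields. Below is a minimal stdlib-only sketch of the shape I expect; `validate_book` and the sample values are illustrative only, not part of the lmstudio API:

```python
def validate_book(data: dict) -> bool:
    """Return True if data has exactly the BookSchema fields with the right types."""
    expected = {"title": str, "author": str, "year": int}
    return set(data) == set(expected) and all(
        isinstance(data[k], t) for k, t in expected.items()
    )

# Illustrative sample of what result.parsed looks like on a successful run.
sample = {"title": "The Hobbit", "author": "J. R. R. Tolkien", "year": 1937}
print(validate_book(sample))  # True
```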
Environment: M4, 16 GB RAM.