[Bug] Qwen3 model cannot use function calling #3344

@wawaRou

Description

🐛 Bug

Following the sample in the official documentation (https://llm.mlc.ai/docs/deploy/rest.html), I tested function calling, but none of the Qwen3-series models I deployed locally ('Qwen3-14b-Q4f16_1-MLC', 'Qwen3-8B-Q4f16_1-MLC') recognized the tool.

To Reproduce

The code I used is as follows:

import requests
import json

# Tool definition following the OpenAI function-calling schema.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        },
    }
]

payload = {
    "model": "./Qwen3-8B-Q4f16_1-MLC",
    "messages": [
        {
            "role": "user",
            "content": "What is the current weather in Pittsburgh, PA in fahrenheit?",
        }
    ],
    "stream": False,
    "tools": tools,
}

# Send the request to the locally served model and print the returned tool call.
r = requests.post("http://127.0.0.1:8000/v1/chat/completions", json=payload)
print(f"{r.json()['choices'][0]['message']['tool_calls'][0]['function']}\n")

The output is as follows:

Traceback (most recent call last):
  File "/home/otcaix/dev-proj/tmp/test2.py", line 38, in <module>
    print(f"{r.json()['choices'][0]['message']['tool_calls'][0]['function']}\n")
TypeError: 'NoneType' object is not subscriptable
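
The error indicates that 'tool_calls' is null in the response, so the model's answer most likely came back as plain text in 'content' rather than as a structured tool call. Below is a minimal diagnostic sketch (assuming the same local endpoint and the 'payload' defined above) that prints the whole assistant message; Qwen models typically emit Hermes-style <tool_call>...</tool_call> tags, so if such a tag appears in 'content', the server-side parser is not extracting it for Qwen3.

import requests

# Diagnostic sketch: dump the full assistant message instead of indexing into
# tool_calls, to see whether the reply is plain text (e.g. an unparsed
# <tool_call> block) or tool_calls is simply missing.
r = requests.post("http://127.0.0.1:8000/v1/chat/completions", json=payload)
message = r.json()["choices"][0]["message"]
print("content:", message.get("content"))
print("tool_calls:", message.get("tool_calls"))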

Expected behavior

Output: {'name': 'get_current_weather', 'arguments': {'location': 'Pittsburgh, PA', 'unit': 'fahrenheit'}}
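
For completeness, a sketch of how the response would be consumed once 'tool_calls' is populated (assuming the OpenAI-style schema, in which 'arguments' may come back as a JSON-encoded string rather than a dict):

import json

function = r.json()["choices"][0]["message"]["tool_calls"][0]["function"]
arguments = function["arguments"]
# Some OpenAI-compatible servers encode 'arguments' as a JSON string.
if isinstance(arguments, str):
    arguments = json.loads(arguments)
print(function["name"], arguments)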

Environment

  • Platform (e.g. WebGPU/Vulkan/iOS/Android/CUDA): CUDA
  • Operating system (e.g. Ubuntu/Windows/MacOS/...): Ubuntu 22.04
  • Device (e.g. iPhone 12 Pro, PC+RTX 3090, ...): Nvidia Jetson Orin 64GB
  • How you installed MLC-LLM (conda, source): MLC-LLM docker image (provided by https://github.com/dusty-nv/jetson-containers/tree/master/packages/llm/mlc)
  • Any other relevant information: MLC docker image version is 0.20.0
