Open
Labels
bug (Something isn't working)
Description
Your current environment
Environment:
- vLLM Version: 0.11.0 (from Docker image vllm/vllm-openai:latest)
- Models: THUDM/glm-4-9b-chat and Qwen/Qwen2.5-14B-Instruct
- Server: Docker container with default settings
- API: OpenAI-compatible /v1/chat/completions
Configuration:

```shell
docker run -d --gpus all -p 8000:8000 \
  vllm/vllm-openai:latest \
  --model THUDM/glm-4-9b-chat \
  --max-model-len 8192
```
🐛 Describe the bug
When sending a valid JSON request with 17 tools in the tools array, vLLM returns:
```
400 Bad Request
{"error": {"message": "1 validation error for list[function-wrap[log_extra_fields()]]\n Invalid JSON: EOF while parsing a string at line 57", "type": "BadRequestError"}}
```
The reported error location varies across requests (lines 23, 57, 84, 171, 261, 271, 346, 1024), but it is always an "EOF while parsing" error.
The input_value shown in the error contains corrupted/truncated data (just newlines and whitespace).
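For reference, a minimal reproduction sketch. The tool names, message content, and server URL are placeholders I am assuming, not taken from the original report; the payload follows the standard OpenAI chat-completions tools schema. The client-side assertion demonstrates that the request body is complete, valid JSON before it is sent, so the server-side "EOF while parsing" error is not a client serialization problem:

```python
import json
import urllib.request

# Build 17 placeholder tool definitions in OpenAI function-calling format
# (names and schemas are hypothetical stand-ins for the real tools).
tools = [
    {
        "type": "function",
        "function": {
            "name": f"tool_{i}",
            "description": f"Placeholder tool {i}",
            "parameters": {
                "type": "object",
                "properties": {"arg": {"type": "string"}},
                "required": ["arg"],
            },
        },
    }
    for i in range(17)
]

payload = {
    "model": "THUDM/glm-4-9b-chat",
    "messages": [{"role": "user", "content": "Which tools can you call?"}],
    "tools": tools,
}

body = json.dumps(payload).encode("utf-8")
# Sanity check: the encoded body round-trips as valid, complete JSON,
# so any EOF error must occur on the server side.
assert json.loads(body)["tools"][16]["function"]["name"] == "tool_16"

# POST to the vLLM OpenAI-compatible endpoint (local server assumed).
req = urllib.request.Request(
    "http://localhost:8000/v1/chat/completions",
    data=body,
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req)  # expected 200 OK; the bug yields 400 Bad Request
```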
Before submitting a new issue...
- Make sure you have already searched for relevant issues and asked the chatbot at the bottom-right corner of the documentation page, which can answer many frequently asked questions.
 