Misc. bug: llama-server didn't display thought process since b5576 #13981

@BVEsun

Description

Name and Version

llama-b5576-bin-win-cuda-12.4-x64
version: 5576 (c9bbc77)
built with clang version 18.1.8 for x86_64-pc-windows-msvc

Operating systems

Windows

Which llama.cpp modules do you know to be affected?

llama-server

Command line

llama-server.exe -m "..\models\DeepSeek-R1-0528-Qwen3-8B-Q4_K_M.gguf"

Problem description & steps to reproduce

On release b5575 the thought process could be viewed, but after switching to b5576 the thought process is missing.

Is anyone else seeing this issue?

First Bad Commit

b5576
#13933

Relevant log output
