Name and Version
llama-b5576-bin-win-cuda-12.4-x64
version: 5576 (c9bbc77)
built with clang version 18.1.8 for x86_64-pc-windows-msvc
Operating systems
Windows
Which llama.cpp modules do you know to be affected?
llama-server
Command line
llama-server.exe -m "..\models\DeepSeek-R1-0528-Qwen3-8B-Q4_K_M.gguf"
Problem description & steps to reproduce
With release b5575 the thought process could still be viewed, but after switching to b5576 the thought process is missing.
Is anyone else seeing this issue?
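A quick way to narrow this down is to check whether the reasoning is dropped by the server itself or only not rendered by the web UI, by querying the OpenAI-compatible API directly. This is only a sketch: it assumes llama-server is running on its default port 8080 and that the model emits its reasoning in `<think>...</think>` tags; adjust the port and prompt for your setup.

```shell
# Ask the OpenAI-compatible chat endpoint a simple question and inspect the
# raw JSON response: look for the reasoning either inline in
# message.content (as <think>...</think>) or in a separate field such as
# reasoning_content, depending on the server's reasoning handling.
curl -s http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "messages": [{"role": "user", "content": "What is 2+2?"}]
      }'
```

If the reasoning still appears in the raw API response but not in the browser, the regression is likely in how the web UI (or the server's reasoning parsing/streaming, changed around #13933) presents it, rather than in generation itself.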
First Bad Commit
b5576
#13933
Relevant log output