Bug: llama-cli does not show the results of the performance test when SIGINT #9558

@ownia

Description

What happened?

When running llama-cli in conversation mode and pressing Ctrl+C to interject, it does not print the performance info.

git bisect shows the first bad commit is 6262d13.

e6deac3 (the performance summary is printed):
llama_perf_sampler_print:    sampling time =      17.31 ms /   124 runs   (    0.14 ms per token,  7164.32 tokens per second)
llama_perf_context_print:        load time =    2548.25 ms
llama_perf_context_print: prompt eval time =    4104.68 ms /    25 tokens (  164.19 ms per token,     6.09 tokens per second)
llama_perf_context_print:        eval time =    6035.09 ms /   109 runs   (   55.37 ms per token,    18.06 tokens per second)
llama_perf_context_print:       total time =   36065.20 ms /   134 tokens
localhost:~/code/kleidiai/llama.cpp #

6262d13: (empty output after Ctrl+C)

localhost:~/code/kleidiai/llama.cpp #

Name and Version

localhost:~/code/kleidiai/llama.cpp # ./llama-cli --version
version: 3787 (6026da5)
built with cc (SUSE Linux) 14.2.0 for aarch64-suse-linux

What operating system are you seeing the problem on?

Linux

Relevant log output

No response

Metadata

Labels

    bug-unconfirmed, medium severity (used to report medium severity bugs in llama.cpp, e.g. malfunctioning features that are still usable)