
Conversation

@LoicDagnas

Following up on issue #995

After this PR on the llama.cpp repo, the log levels defined on the llama.cpp and LlamaSharp sides were no longer aligned. This misalignment made model weight loading fail when a custom logger is used.

Help

I am just wondering whether it is correct to treat the CONT log level as equivalent to NONE.

@martindevans
Member

It looks like the idea of GGML_LOG_LEVEL_CONT is simply to re-use the level of the previous message. Mapping it to None would presumably cut some messages in half, because the CONT part would be hidden (treated as None).

The best way I can think of to handle this is a [ThreadStatic] field which stores the last log level used. I'll add a comment inline to show what I mean.

Continue = 5,
}

internal static class LLamaLogLevelExtensions
Member


Suggested change

internal static class LLamaLogLevelExtensions
{
    [ThreadStatic] private static LogLevel _previous;

    public static LogLevel ToLogLevel(this LLamaLogLevel llama)
    {
        // Remember the level of each message, so a Continue message
        // can re-use the level of the message that preceded it.
        _previous = llama switch
        {
            LLamaLogLevel.None => LogLevel.None,
            LLamaLogLevel.Debug => LogLevel.Debug,
            LLamaLogLevel.Info => LogLevel.Information,
            LLamaLogLevel.Warning => LogLevel.Warning,
            LLamaLogLevel.Error => LogLevel.Error,
            LLamaLogLevel.Continue => _previous,
            _ => throw new ArgumentOutOfRangeException(nameof(llama), llama, null)
        };
        return _previous;
    }
}
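As a standalone sketch of the thread-static idea above (the enum and type names below are simplified stand-ins for illustration, not the actual LLamaSharp or Microsoft.Extensions.Logging types), a Continue entry resolves to whatever level the previous message on the same thread mapped to:

```csharp
using System;

// Simplified stand-ins for the real LLamaLogLevel / LogLevel enums.
enum LlamaLevel { Debug = 1, Error = 4, Continue = 5 }
enum MappedLevel { Debug, Error }

static class LevelMapper
{
    // Each thread remembers the last concrete level it mapped,
    // so Continue can re-use the previous message's level.
    [ThreadStatic] private static MappedLevel _previous;

    public static MappedLevel ToMapped(this LlamaLevel level)
    {
        _previous = level switch
        {
            LlamaLevel.Debug => MappedLevel.Debug,
            LlamaLevel.Error => MappedLevel.Error,
            LlamaLevel.Continue => _previous,
            _ => throw new ArgumentOutOfRangeException(nameof(level))
        };
        return _previous;
    }
}

class Program
{
    static void Main()
    {
        Console.WriteLine(LlamaLevel.Error.ToMapped());    // Error
        Console.WriteLine(LlamaLevel.Continue.ToMapped()); // Error (re-used)
    }
}
```

Because the field is [ThreadStatic], interleaved logging from multiple threads cannot corrupt each other's "previous level" state.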

Member

(Note that I haven't tested this, but hopefully it illustrates what I mean)

Author

Got it, good idea

@martindevans
Member

Looks good to me, thanks for fixing this! Just waiting for the tests to run and then I'll merge it :)

@LoicDagnas
Author

Hmm, I don't get the CI error when running locally:

dotnet test LLamaSharp.sln -c release -l "console;verbosity=detailed" --diag:logs/log.txt --filter Category!=NoCI

@martindevans
Member

I'll re-run it; unfortunately our CI is flaky sometimes.

@martindevans
Member

Looks like it worked this time.

@martindevans martindevans merged commit a431792 into SciSharp:master Nov 26, 2024
6 checks passed
