feat: hide llama.cpp logs #106

@ExposedCat

Feature Description

Hide all the logs that llama.cpp produces about model loading, parameters, etc.

The Solution

Either suppress the output entirely, or capture it internally and expose it to the user as a callback when running the model.
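The callback approach could look roughly like the sketch below. This is a hypothetical API, not the actual node-llama-cpp interface: `ModelRunner`, `LogCallback`, and `emitNativeLog` are stand-ins for the native binding routing every llama.cpp log line through an optional user-supplied handler instead of writing to stderr.

```typescript
// Hypothetical sketch of the requested behavior, assuming a logger-callback
// option on the model wrapper. Names here are illustrative only.

type LogLevel = "info" | "warn" | "error";
type LogCallback = (level: LogLevel, message: string) => void;

class ModelRunner {
    // Default callback swallows everything, so logs are hidden unless
    // the user explicitly opts back in.
    constructor(private readonly onLog: LogCallback = () => {}) {}

    // Stand-in for the native layer emitting a log line.
    emitNativeLog(level: LogLevel, message: string): void {
        this.onLog(level, message);
    }
}

// Usage: collect only warnings and errors, discard informational noise.
const collected: string[] = [];
const runner = new ModelRunner((level, message) => {
    if (level !== "info") collected.push(`[${level}] ${message}`);
});

runner.emitNativeLog("info", "llama_model_load: loading model");
runner.emitNativeLog("warn", "llama_model_load: unusual parameter");
// collected now holds only the warning entry.
```

On the native side, llama.cpp itself exposes a log-callback hook (`llama_log_set`), so a binding can redirect or silence its output without patching the library.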

Considered Alternatives

In general, non-optional logging in (any) library is bad practice.

Additional Context

Thanks for your work ❤️

Related Features to This Feature Request

  • Metal support
  • CUDA support
  • Grammar

Are you willing to resolve this issue by submitting a Pull Request?

Yes, I have the time, and I know how to start.

Metadata

Assignees

Labels

  • new feature — New feature or request
  • released — released on @beta
  • roadmap — Part of the roadmap for node-llama-cpp (https://github.com/orgs/withcatai/projects/1)

Type

No type

Projects

Status

Done

Relationships

None yet

Development

No branches or pull requests