Prerequisites
Please answer the following questions for yourself before submitting an issue.
- I am running the latest code. Development is very rapid so there are no tagged versions as of now.
- I carefully followed the README.md.
- I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- I reviewed the Discussions, and have a new bug or useful enhancement to share.
Expected Behavior
The dictionary returned by `_LlamaModel.metadata()` should include `tokenizer.ggml.tokens` as a key, so the vocabulary of the model can be accessed from the high-level API.
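For context, this is roughly how the vocabulary would be read if the key were exposed. It is a minimal sketch, not the library's documented behavior; the model path is a placeholder for any local GGUF file.

```python
from llama_cpp import Llama

# Placeholder path: substitute any local GGUF model file.
llm = Llama(model_path="./models/example.gguf", vocab_only=True)

# Expected: the metadata dictionary exposes the token list under this key,
# so the vocabulary is reachable without dropping to the low-level API.
vocab = llm.metadata["tokenizer.ggml.tokens"]
```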
Current Behavior
The dictionary returned by `_LlamaModel.metadata()` does not include `tokenizer.ggml.tokens` as a key, so the vocabulary of the model cannot be accessed from the high-level API.
Environment and Context
Running the latest llama-cpp-python built from source; package version 0.2.76 at the time of writing.
Steps to Reproduce
- Construct an instance of `Llama`
- View `Llama.metadata`
- Look for a key called `tokenizer.ggml.tokens`
- Do not find it (see the reproduction sketch below)
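A minimal reproduction under the assumptions above (placeholder model path, llama-cpp-python 0.2.76 built from source):

```python
from llama_cpp import Llama

# Placeholder path: substitute any local GGUF model file.
llm = Llama(model_path="./models/example.gguf", vocab_only=True)

# List the metadata keys actually returned for the loaded model.
print(sorted(llm.metadata.keys()))

# Observed: "tokenizer.ggml.tokens" is absent, so the vocabulary cannot be
# retrieved through the high-level API.
print("tokenizer.ggml.tokens" in llm.metadata)  # currently False
```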