
Commit db214fa

Missing tokenizer.model error during gguf conversion (ggml-org#6443)
Co-authored-by: Jared Van Bortel <[email protected]>
Parent: 1ff4d9f

File tree

1 file changed (+1, -2 lines)

1 file changed

+1
-2
lines changed

convert-hf-to-gguf.py

Lines changed: 1 addition & 2 deletions
@@ -323,8 +323,7 @@ def _set_vocab_sentencepiece(self):
         toktypes: list[int] = []
 
         if not tokenizer_path.is_file():
-            print(f'Error: Missing {tokenizer_path}', file=sys.stderr)
-            sys.exit(1)
+            raise FileNotFoundError(f"File not found: {tokenizer_path}")
 
         tokenizer = SentencePieceProcessor(str(tokenizer_path))
         vocab_size = self.hparams.get('vocab_size', tokenizer.vocab_size())
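
The change swaps a printed error plus sys.exit(1) for a raised FileNotFoundError, so a missing tokenizer.model produces a normal Python exception that callers can catch (or that surfaces as a traceback) instead of terminating the process outright. Below is a minimal sketch of the pattern in isolation; the helper function, directory layout, and wrapper script are illustrative assumptions, not part of convert-hf-to-gguf.py itself.

```python
# Minimal sketch of the new behavior: raise FileNotFoundError for a missing
# tokenizer.model instead of printing to stderr and calling sys.exit(1).
# The helper below is hypothetical, used only to illustrate the pattern.
from pathlib import Path


def check_sentencepiece_model(model_dir: str) -> Path:
    tokenizer_path = Path(model_dir) / "tokenizer.model"
    if not tokenizer_path.is_file():
        # New behavior: raise, so callers decide how to report or recover.
        raise FileNotFoundError(f"File not found: {tokenizer_path}")
    # ...loading would continue here, e.g. with
    # SentencePieceProcessor(str(tokenizer_path))...
    return tokenizer_path


if __name__ == "__main__":
    try:
        check_sentencepiece_model("models/my-model")  # hypothetical path
    except FileNotFoundError as err:
        print(f"Conversion aborted: {err}")
```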
