
Inference error (推理报错) #12

@huangqiongyan123

Description

(zhihai1) huangqy@worker-7:~/wisdomInterrogatory-main$ python infer.py
Traceback (most recent call last):
File "/data/huangqy/wisdomInterrogatory-main/infer.py", line 100, in <module>
pred_crimes = get_all_response(facts)
File "/data/huangqy/wisdomInterrogatory-main/infer.py", line 51, in get_all_response
resp = generate_response(prompt)
File "/data/huangqy/wisdomInterrogatory-main/infer.py", line 20, in generate_response
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
File "/data/huangqy/miniconda3/envs/zhihai1/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 755, in from_pretrained
return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
File "/data/huangqy/miniconda3/envs/zhihai1/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2024, in from_pretrained
return cls._from_pretrained(
File "/data/huangqy/miniconda3/envs/zhihai1/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2256, in _from_pretrained
tokenizer = cls(*init_inputs, **init_kwargs)
File "/data/huangqy/.cache/huggingface/modules/transformers_modules/zju_model_0813_100k/tokenization_baichuan.py", line 74, in __init__
super().__init__(
File "/data/huangqy/miniconda3/envs/zhihai1/lib/python3.10/site-packages/transformers/tokenization_utils.py", line 367, in __init__
self._add_tokens(
File "/data/huangqy/miniconda3/envs/zhihai1/lib/python3.10/site-packages/transformers/tokenization_utils.py", line 467, in _add_tokens
current_vocab = self.get_vocab().copy()
File "/data/huangqy/.cache/huggingface/modules/transformers_modules/zju_model_0813_100k/tokenization_baichuan.py", line 109, in get_vocab
vocab = {self.convert_ids_to_tokens(i): i for i in range(self.vocab_size)}
File "/data/huangqy/.cache/huggingface/modules/transformers_modules/zju_model_0813_100k/tokenization_baichuan.py", line 105, in vocab_size
return self.sp_model.get_piece_size()
AttributeError: 'BaiChuanTokenizer' object has no attribute 'sp_model'
Hello author, why do I keep getting this error during inference?
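For context, the traceback matches a common initialization-order incompatibility: the custom tokenizer assigns `self.sp_model` only *after* calling `super().__init__()`, while newer `transformers` releases (reportedly 4.34 and later) call `get_vocab()` from inside the base `__init__`, so the attribute is read before it exists. Below is a minimal sketch of that pattern — the class names are illustrative, not the actual Baichuan or `transformers` source:

```python
class Base:
    """Stand-in for the newer transformers base tokenizer."""
    def __init__(self):
        # Newer transformers versions build the vocab eagerly during __init__
        self.get_vocab()

class BrokenTokenizer(Base):
    """Mirrors the suspected bug: sp_model assigned after super().__init__()."""
    def __init__(self):
        super().__init__()        # get_vocab() runs here ...
        self.sp_model = object()  # ... but sp_model is only set here

    def get_vocab(self):
        return self.sp_model      # AttributeError: no 'sp_model' yet

class FixedTokenizer(Base):
    """Assigning sp_model before the base __init__ avoids the error."""
    def __init__(self):
        self.sp_model = object()
        super().__init__()

    def get_vocab(self):
        return self.sp_model

try:
    BrokenTokenizer()
except AttributeError as e:
    print("broken:", e)

FixedTokenizer()
print("fixed: ok")
```

If this is indeed the cause, the usual workarounds are to pin an older `transformers` release (4.33.x reportedly predates the eager `get_vocab()` call) or to move the `self.sp_model` assignment above the `super().__init__()` call in the cached `tokenization_baichuan.py`.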
