Replies: 2 comments 2 replies
-
I'm seeing it too, just using the constitution sample file.
-
This is related to transformers and how the model creators are implementing them. Check out this. With the update this morning, you can now use quantized models; that will hopefully resolve this.
-
Hello,
ingest.py works very nicely, but run_localGPT.py throws errors :(
I am using a single PDF file with nothing but pure text, 17 pages.
I upped the recursion limit from the default 1000 to 1500, but no joy. Would anyone know what the reason is?
```
2023-06-11 16:45:39,762 - INFO - run_localGPT.py:26 - Loading Model: TheBloke/vicuna-7B-1.1-HF, on : cuda
2023-06-11 16:45:39,762 - INFO - run_localGPT.py:27 - This action can take a few minutes!
Traceback (most recent call last):
  File "/home/my_pc/localGPT/run_localGPT.py", line 158, in <module>
    main()
  File "/home/my_pc/anaconda3/lib/python3.10/site-packages/click/core.py", line 1128, in __call__
    return self.main(*args, **kwargs)
  File "/home/my_pc/anaconda3/lib/python3.10/site-packages/click/core.py", line 1053, in main
    rv = self.invoke(ctx)
  File "/home/my_pc/anaconda3/lib/python3.10/site-packages/click/core.py", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/my_pc/anaconda3/lib/python3.10/site-packages/click/core.py", line 754, in invoke
    return __callback(*args, **kwargs)
  File "/home/my_pc/localGPT/run_localGPT.py", line 122, in main
    llm = load_model(device_type)
  File "/home/my_pc/localGPT/run_localGPT.py", line 30, in load_model
    tokenizer = AutoTokenizer.from_pretrained(model_id)
  File "/home/my_pc/anaconda3/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 691, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/home/my_pc/anaconda3/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1825, in from_pretrained
    return cls._from_pretrained(
  File "/home/my_pc/anaconda3/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1988, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/home/my_pc/anaconda3/lib/python3.10/site-packages/transformers/models/llama/tokenization_llama_fast.py", line 111, in __init__
    self.update_post_processor()
  File "/home/my_pc/anaconda3/lib/python3.10/site-packages/transformers/models/llama/tokenization_llama_fast.py", line 121, in update_post_processor
    bos_token_id = self.bos_token_id
  File "/home/my_pc/anaconda3/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1136, in bos_token_id
    return self.convert_tokens_to_ids(self.bos_token)
  File "/home/my_pc/anaconda3/lib/python3.10/site-packages/transformers/tokenization_utils_fast.py", line 250, in convert_tokens_to_ids
    return self._convert_token_to_id_with_added_voc(tokens)
  File "/home/my_pc/anaconda3/lib/python3.10/site-packages/transformers/tokenization_utils_fast.py", line 257, in _convert_token_to_id_with_added_voc
    return self.unk_token_id
  File "/home/my_pc/anaconda3/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1155, in unk_token_id
```

The last four lines are repeated many times over until we arrive at this final output:

```
RecursionError: maximum recursion depth exceeded while calling a Python object
2023-06-11 16:48:02,429 - INFO - duckdb.py:414 - Persisting DB to disk, putting it in the save folder: /home/my_pc/localGPT/DB
```
Thanks for your help...cheers rajfal
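For reference, the loop visible in the traceback is `unk_token_id` and `convert_tokens_to_ids` calling each other: when the special token being looked up is missing from the tokenizer's vocabulary, the lookup falls back to the unknown-token id, which triggers another lookup, and so on. A minimal sketch of that failure mode (`ToyTokenizer` is a hypothetical stand-in for illustration, not the transformers class):

```python
class ToyTokenizer:
    """Toy model of the mutual recursion seen in the traceback (hypothetical)."""

    def __init__(self, vocab, unk_token="<unk>"):
        self.vocab = vocab          # token -> id mapping
        self.unk_token = unk_token

    def convert_tokens_to_ids(self, token):
        if token in self.vocab:
            return self.vocab[token]
        # Unknown token: fall back to the unknown-token id.
        return self.unk_token_id

    @property
    def unk_token_id(self):
        # If unk_token itself is absent from the vocab, this re-enters
        # convert_tokens_to_ids forever, exactly like the repeated frames above.
        return self.convert_tokens_to_ids(self.unk_token)


tok = ToyTokenizer(vocab={"hello": 0})   # note: "<unk>" is NOT in the vocab
try:
    tok.convert_tokens_to_ids("<s>")     # bos lookup, as in update_post_processor
except RecursionError as exc:
    print("RecursionError:", exc)
```

Because the recursion is unbounded, raising `sys.setrecursionlimit` can never fix this; it only delays the error. If the model's tokenizer files are really missing the special tokens, workarounds reported for this kind of transformers issue include passing `use_fast=False` to `AutoTokenizer.from_pretrained` (so the slow SentencePiece tokenizer is used instead of `LlamaTokenizerFast`) or re-downloading corrected tokenizer files; whether either applies to this particular model is an assumption.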