Hello, I am using stablelm-zephyr-3b through transformers on a Windows 11 machine. My package versions are:

```
transformers     4.36.2
torch            2.0.1+cu118
huggingface-hub  0.19.4
Python           3.9.16
```
My code is:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_name_or_path = "stabilityai/stablelm-zephyr-3b"
model = AutoModelForCausalLM.from_pretrained(
    model_name_or_path,
    device_map="auto",
    load_in_8bit=False,
    trust_remote_code=True,
    revision="main",
)
tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True)
pipe = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    max_new_tokens=40,
    do_sample=True,
    temperature=0.1,
    top_p=0.95,
    top_k=40,
)
```
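For context, the prompt I pass to the pipeline follows the Zephyr-style chat format. The helper below is only an illustration of that format (the function name is mine, and the authoritative template lives in the model repo's tokenizer config, so the special tokens may differ slightly):

```python
# Hypothetical helper: build a Zephyr-style chat prompt by hand.
# The real template is defined in the model's tokenizer_config.json;
# this is my reading of it and may not match the repo exactly.
def build_zephyr_prompt(user_message: str) -> str:
    return f"<|user|>\n{user_message}<|endoftext|>\n<|assistant|>\n"

prompt = build_zephyr_prompt("What is a mental model?")
# result = pipe(prompt)  # this is the call that crashes
```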
When `pipe` is called, it produces this error:

```
Setting `pad_token_id` to `eos_token_id`:0 for open-end generation.
Windows fatal exception: access violation

Thread 0x000075d8 (most recent call first):
  File "C:\Users\laura\anaconda3\envs\transformers\lib\threading.py", line 316 in wait
  File "C:\Users\laura\anaconda3\envs\transformers\lib\threading.py", line 581 in wait
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\tqdm\_monitor.py", line 60 in run
  File "C:\Users\laura\anaconda3\envs\transformers\lib\threading.py", line 980 in _bootstrap_inner
  File "C:\Users\laura\anaconda3\envs\transformers\lib\threading.py", line 937 in _bootstrap

Main thread:
Current thread 0x0000423c (most recent call first):
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\torch\storage.py", line 234 in __getitem__
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\accelerate\utils\offload.py", line 169 in __getitem__
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\accelerate\utils\offload.py", line 118 in __getitem__
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\accelerate\hooks.py", line 294 in pre_forward
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\accelerate\hooks.py", line 160 in new_forward
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\torch\nn\modules\module.py", line 1501 in _call_impl
  File "C:\Users\laura\.cache\modules\transformers_modules\stabilityai\stablelm-zephyr-3b\9974c58a0ec4be4cd6f55e814a2a93b9cf163823\modeling_stablelm_epoch.py", line 291 in forward
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\accelerate\hooks.py", line 165 in new_forward
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\torch\nn\modules\module.py", line 1501 in _call_impl
  File "C:\Users\laura\.cache\modules\transformers_modules\stabilityai\stablelm-zephyr-3b\9974c58a0ec4be4cd6f55e814a2a93b9cf163823\modeling_stablelm_epoch.py", line 501 in forward
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\torch\nn\modules\module.py", line 1501 in _call_impl
  File "C:\Users\laura\.cache\modules\transformers_modules\stabilityai\stablelm-zephyr-3b\9974c58a0ec4be4cd6f55e814a2a93b9cf163823\modeling_stablelm_epoch.py", line 597 in forward
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\accelerate\hooks.py", line 165 in new_forward
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\torch\nn\modules\module.py", line 1501 in _call_impl
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\transformers\generation\utils.py", line 2861 in sample
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\transformers\generation\utils.py", line 1764 in generate
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\torch\utils\_contextlib.py", line 115 in decorate_context
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\transformers\pipelines\text_generation.py", line 271 in _forward
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\transformers\pipelines\base.py", line 1046 in forward
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\transformers\pipelines\base.py", line 1147 in run_single
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\transformers\pipelines\base.py", line 1140 in __call__
  File "C:\Users\laura\anaconda3\envs\transformers\lib\site-packages\transformers\pipelines\text_generation.py", line 208 in __call__
  File "c:\users\laura\documents\code\mentalmodel-toy.py", line 155 in ask
  File "C:\Users\laura\AppData\Local\Temp\ipykernel_26296\1451652156.py", line 1 in <module>

Restarting kernel...
```
If I set `trust_remote_code` to `False`, I get this error instead:

```
Traceback (most recent call last):

  File ~\anaconda3\envs\transformers\lib\site-packages\spyder_kernels\py3compat.py:356 in compat_exec
    exec(code, globals, locals)

  File c:\users\laura\documents\code\mentalmodel-toy.py:18
    model = AutoModelForCausalLM.from_pretrained(model_name_or_path,

  File ~\anaconda3\envs\transformers\lib\site-packages\transformers\models\auto\auto_factory.py:526 in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(

  File ~\anaconda3\envs\transformers\lib\site-packages\transformers\models\auto\configuration_auto.py:1085 in from_pretrained
    trust_remote_code = resolve_trust_remote_code(

  File ~\anaconda3\envs\transformers\lib\site-packages\transformers\dynamic_module_utils.py:621 in resolve_trust_remote_code
    raise ValueError(

ValueError: Loading stabilityai/stablelm-zephyr-3b requires you to execute the configuration file in that repo on your local machine. Make sure you have read the code there to avoid malicious use, then set the option `trust_remote_code=True` to remove this error.
```
I assumed the access violation was related to allowing remote code execution, but a Google search suggests the cause could be something else entirely. I have never encountered this problem with transformers before, so I'm guessing the conflict is specific to this model. Any ideas?
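One thing I notice: the crash trace runs through accelerate's offload hooks (`accelerate/utils/offload.py`) before hitting `torch/storage.py`, which makes me wonder whether `device_map="auto"` placed some weights on CPU or disk and reading them back is what faults on Windows. The sketch below is how I'd check that; `model.hf_device_map` is a real attribute that transformers sets when a device map is used, but the helper function is mine, not an accelerate API, and the example map is illustrative:

```python
# List modules that accelerate would offload, given a device map like the
# one exposed as model.hf_device_map after from_pretrained(device_map="auto").
# Helper name is mine, not part of accelerate.
def offloaded_modules(device_map):
    return [name for name, device in device_map.items()
            if device in ("cpu", "disk")]

# Illustrative device map, not the actual one from my machine:
example_map = {
    "model.embed_tokens": 0,
    "model.layers.0": 0,
    "model.layers.31": "cpu",
    "lm_head": "disk",
}
print(offloaded_modules(example_map))  # ['model.layers.31', 'lm_head']
```

If anything shows up in that list on my machine, pinning the whole model to one device (e.g. `device_map={"": 0}`, or dropping `device_map` and calling `model.to("cuda")`) might sidestep the offload code path entirely.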