Replies: 3 comments
-
Looks like this might be related to the torch library - microsoft/DirectML#400
-
Still not fixed for DirectML? I've also encountered the same error.
-
Whenever I enable the Magic Prompt option, I get the following error message. Launching Web UI with arguments: --use-cpu interrogate --listen --medvram --disable-nan-check --opt-sdp-attention --sub-quad-q-chunk-size 256 --sub-quad-kv-chunk-size 256 --sub-quad-chunk-threshold 85 To create a public link, set
-
"I'm Feeling Lucky" works, but Magic Prompt doesn't; it won't add anything to my prompt, and after the models finish downloading it apparently can't use them.
Error running process: H:\ruanjian\stable-diffusion-webui-directml-02-19bei0413fugai\extensions\sd-dynamic-prompts-main\scripts\dynamic_prompting.py
Traceback (most recent call last):
File "H:\ruanjian\stable-diffusion-webui-directml-02-19bei0413fugai\modules\scripts.py", line 417, in process
script.process(p, *script_args)
File "H:\ruanjian\stable-diffusion-webui-directml-02-19bei0413fugai\extensions\sd-dynamic-prompts-main\sd_dynamic_prompts\dynamic_prompting.py", line 452, in process
all_prompts, all_negative_prompts = generate_prompts(
File "H:\ruanjian\stable-diffusion-webui-directml-02-19bei0413fugai\extensions\sd-dynamic-prompts-main\sd_dynamic_prompts\dynamic_prompting.py", line 81, in generate_prompts
all_prompts = prompt_generator.generate(prompt, num_prompts) or [""]
File "H:\ruanjian\stable-diffusion-webui-directml-02-19bei0413fugai\python\lib\site-packages\dynamicprompts\generators\magicprompt.py", line 148, in generate
magic_prompts = self._generate_magic_prompts(prompts)
File "H:\ruanjian\stable-diffusion-webui-directml-02-19bei0413fugai\python\lib\site-packages\dynamicprompts\generators\magicprompt.py", line 194, in _generate_magic_prompts
prompts = self._generator(
File "H:\ruanjian\stable-diffusion-webui-directml-02-19bei0413fugai\python\lib\site-packages\transformers\pipelines\text_generation.py", line 202, in __call__
return super().__call__(text_inputs, **kwargs)
File "H:\ruanjian\stable-diffusion-webui-directml-02-19bei0413fugai\python\lib\site-packages\transformers\pipelines\base.py", line 1063, in call
outputs = [output for output in final_iterator]
File "H:\ruanjian\stable-diffusion-webui-directml-02-19bei0413fugai\python\lib\site-packages\transformers\pipelines\base.py", line 1063, in <listcomp>
outputs = [output for output in final_iterator]
File "H:\ruanjian\stable-diffusion-webui-directml-02-19bei0413fugai\python\lib\site-packages\transformers\pipelines\pt_utils.py", line 124, in __next__
item = next(self.iterator)
File "H:\ruanjian\stable-diffusion-webui-directml-02-19bei0413fugai\python\lib\site-packages\transformers\pipelines\pt_utils.py", line 125, in __next__
processed = self.infer(item, **self.params)
File "H:\ruanjian\stable-diffusion-webui-directml-02-19bei0413fugai\python\lib\site-packages\transformers\pipelines\base.py", line 990, in forward
model_outputs = self._forward(model_inputs, **forward_params)
File "H:\ruanjian\stable-diffusion-webui-directml-02-19bei0413fugai\python\lib\site-packages\transformers\pipelines\text_generation.py", line 244, in forward
generated_sequence = self.model.generate(input_ids=input_ids, attention_mask=attention_mask, **generate_kwargs)
File "H:\ruanjian\stable-diffusion-webui-directml-02-19bei0413fugai\python\lib\site-packages\torch\autograd\grad_mode.py", line 27, in decorate_context
return func(*args, **kwargs)
File "H:\ruanjian\stable-diffusion-webui-directml-02-19bei0413fugai\python\lib\site-packages\transformers\generation\utils.py", line 1571, in generate
return self.sample(
File "H:\ruanjian\stable-diffusion-webui-directml-02-19bei0413fugai\python\lib\site-packages\transformers\generation\utils.py", line 2515, in sample
unfinished_sequences = input_ids.new(input_ids.shape[0]).fill_(1)
RuntimeError: new(): expected key in DispatchKeySet(CPU, CUDA, HIP, XLA, MPS, IPU, XPU, HPU, Lazy, Meta) but got: PrivateUse1
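For context: the failing line uses the legacy `Tensor.new()` constructor, which only dispatches to the backends listed in the error message. DirectML registers its device as `PrivateUse1`, which is not in that set, so the call raises. A device-agnostic factory call avoids that code path. A minimal sketch (variable names mirror the traceback but the snippet is illustrative, not the actual transformers source):

```python
import torch

# Stand-in for the token batch inside transformers' sample() loop.
input_ids = torch.tensor([[101, 2054], [101, 2129]])

# Legacy pattern from the traceback; Tensor.new() only dispatches to the
# backends named in the RuntimeError and fails on DirectML (PrivateUse1):
#   unfinished_sequences = input_ids.new(input_ids.shape[0]).fill_(1)

# Device-agnostic equivalent: torch.ones goes through the normal
# factory-function path and inherits the input tensor's device.
unfinished_sequences = torch.ones(
    input_ids.shape[0], dtype=torch.long, device=input_ids.device
)
print(unfinished_sequences.tolist())  # [1, 1]
```

Upgrading transformers (later releases appear to have replaced this constructor with an explicit `torch.ones` call) or forcing the Magic Prompt model onto CPU should sidestep the error without patching the file by hand.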