This repository was archived by the owner on Oct 8, 2024. It is now read-only.
ONNX Runtime Error? #72
Unanswered
CobaltAkaJames
asked this question in Q&A
Replies: 0 comments
Every time I try to run the script `txt2img_onnx.py` (via `python txt2img_onnx.py`) as directed in the guide, it seems to fail: as best I can tell, no output images are generated at all, not even a blank image. I also tried `python txt2img_onnx.py --cpu-only`, thinking it might be a GPU/VRAM problem, but got the same result. Here's the full output:

```
Traceback (most recent call last):
  File "F:\Stable-Diff\OnnxDiffusersUI-main\txt2img_onnx.py", line 64, in <module>
    pipe = OnnxStableDiffusionPipeline.from_pretrained(
  File "F:\Stable-Diff\OnnxDiffusersUI-main\virtualenv\lib\site-packages\diffusers\pipeline_utils.py", line 708, in from_pretrained
    loaded_sub_model = load_method(os.path.join(cached_folder, name), **loading_kwargs)
  File "F:\Stable-Diff\OnnxDiffusersUI-main\virtualenv\lib\site-packages\diffusers\onnx_utils.py", line 206, in from_pretrained
    return cls._from_pretrained(
  File "F:\Stable-Diff\OnnxDiffusersUI-main\virtualenv\lib\site-packages\diffusers\onnx_utils.py", line 173, in _from_pretrained
    model = OnnxRuntimeModel.load_model(
  File "F:\Stable-Diff\OnnxDiffusersUI-main\virtualenv\lib\site-packages\diffusers\onnx_utils.py", line 78, in load_model
    return ort.InferenceSession(path, providers=[provider], sess_options=sess_options)
  File "F:\Stable-Diff\OnnxDiffusersUI-main\virtualenv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 347, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "F:\Stable-Diff\OnnxDiffusersUI-main\virtualenv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 395, in _create_inference_session
    sess.initialize_session(providers, provider_options, disabled_optimizers)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Deserialize tensor onnx::Add_9330 failed. tensorprotoutils.cc:625 onnxruntime::utils::GetExtDataFromTensorProto External initializer: onnx::Add_9330 offset: 3343164160 size to read: 2560 given file_length: 2625634304 are out of bounds or can not be read in full.
```
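For what it's worth, plugging the numbers from the error message into a quick sketch shows why ONNX Runtime refuses the read: the initializer's byte offset plus the bytes to read ends well past the reported file length, which suggests (as the message itself says) that the external-data file on disk is shorter than the model expects, i.e. likely truncated or incompletely downloaded:

```python
# Numbers copied directly from the ONNXRuntimeError above.
offset = 3_343_164_160       # byte offset of external initializer onnx::Add_9330
size_to_read = 2_560         # bytes ONNX Runtime needs to read from that offset
file_length = 2_625_634_304  # actual size of the external-data file on disk

# The requested read ends past the end of the file, so the load fails.
print(offset + size_to_read > file_length)   # True: read is out of bounds
print(offset + size_to_read - file_length)   # 717532416 bytes (~717 MB) missing
```

This is only an illustration of the arithmetic in the error, not a diagnosis of the root cause.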
What am I doing wrong?