
[Build] safetensors-to-ONNX conversion fails because the BitShift operator does not support tensor(int32). Is the exported model invalid, and what caused this? #25697

@YouLei0106

Description


Describe the issue

```
Traceback (most recent call last):
  File "/opt/conda/bin/optimum-cli", line 7, in <module>
    sys.exit(main())
  File "/opt/conda/lib/python3.9/site-packages/optimum/commands/optimum_cli.py", line 208, in main
    service.run()
  File "/opt/conda/lib/python3.9/site-packages/optimum/commands/export/onnx.py", line 276, in run
    main_export(
  File "/opt/conda/lib/python3.9/site-packages/optimum/exporters/onnx/main.py", line 418, in main_export
    onnx_export_from_model(
  File "/opt/conda/lib/python3.9/site-packages/optimum/exporters/onnx/convert.py", line 1186, in onnx_export_from_model
    _, onnx_outputs = export_models(
  File "/opt/conda/lib/python3.9/site-packages/optimum/exporters/onnx/convert.py", line 770, in export_models
    export(
  File "/opt/conda/lib/python3.9/site-packages/optimum/exporters/onnx/convert.py", line 903, in export
    config.fix_dynamic_axes(output, device=device, input_shapes=input_shapes, dtype=dtype)
  File "/opt/conda/lib/python3.9/site-packages/optimum/exporters/onnx/base.py", line 235, in fix_dynamic_axes
    session = InferenceSession(model_path.as_posix(), providers=providers, sess_options=session_options)
  File "/opt/conda/lib/python3.9/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 419, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/opt/conda/lib/python3.9/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 480, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidGraph: [ONNXRuntimeError] : 10 : INVALID_GRAPH : Load model from hunyuan_tflite_float32/model.onnx failed:This is an invalid model. Type Error: Type 'tensor(int32)' of input parameter (/model/layers.31/mlp/up_proj/Expand_output_0) of operator (BitShift) in node (/model/layers.31/mlp/up_proj/BitShift) is invalid.
```

Urgency

No response

Target platform

onnxruntime

Build script

onnxruntime

Error / output

Converting tencent/Hunyuan-1.8B-Instruct-GPTQ-Int4 to ONNX with optimum-cli fails with the traceback above.

Visual Studio Version

No response

GCC / Compiler Version

No response
