Exporting Gemma-3 models to ONNX is broken #45

@dvdplm

System Info

optimum v1.27.0, Python 3.13, macOS

Who can help?

@michaelbenayoun

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction (minimal, reproducible, runnable)

Running

uv run optimum-cli export onnx --model google/gemma-3-4b-it-qat-q4_0-unquantized --task text-generation gemma-3-4b/

results in the following error:

Traceback (most recent call last):
  File "/REDACTED_PATH_TO_PROJECT_DIR/.venv/bin/optimum-cli", line 10, in <module>
    sys.exit(main())
             ~~~~^^
  File "/REDACTED_PATH_TO_PROJECT_DIR/.venv/lib/python3.13/site-packages/optimum/commands/optimum_cli.py", line 208, in main
    service.run()
    ~~~~~~~~~~~^^
  File "/REDACTED_PATH_TO_PROJECT_DIR/.venv/lib/python3.13/site-packages/optimum/commands/export/onnx.py", line 276, in run
    main_export(
    ~~~~~~~~~~~^
        model_name_or_path=self.args.model,
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<23 lines>...
        **input_shapes,
        ^^^^^^^^^^^^^^^
    )
    ^
  File "/REDACTED_PATH_TO_PROJECT_DIR/.venv/lib/python3.13/site-packages/optimum/exporters/onnx/__main__.py", line 418, in main_export
    onnx_export_from_model(
    ~~~~~~~~~~~~~~~~~~~~~~^
        model=model,
        ^^^^^^^^^^^^
    ...<19 lines>...
        **kwargs_shapes,
        ^^^^^^^^^^^^^^^^
    )
    ^
  File "/REDACTED_PATH_TO_PROJECT_DIR/.venv/lib/python3.13/site-packages/optimum/exporters/onnx/convert.py", line 1044, in onnx_export_from_model
    raise ValueError(
        f"Trying to export a {model_type} model, that is a custom or unsupported architecture, but no custom onnx configuration was passed as `custom_onnx_configs`. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type {model_type} to be supported natively in the ONNX export."
    )
ValueError: Trying to export a gemma3 model, that is a custom or unsupported architecture, but no custom onnx configuration was passed as `custom_onnx_configs`. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type gemma3 to be supported natively in the ONNX export.

Given that #67 has been reported and fixed, I infer that the problem above is a genuine bug.
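
As a stopgap, the custom export suggested by the error message might look roughly like the sketch below. This is untested: it assumes the Gemma-3 text decoder is close enough to Gemma for GemmaOnnxConfig to apply, and the Gemma3OnnxConfig subclass and the handling of the multimodal 4B checkpoint's nested text_config are my own guesses, not anything optimum provides.

from optimum.exporters.onnx import main_export
from optimum.exporters.onnx.model_configs import GemmaOnnxConfig
from transformers import AutoConfig

model_id = "google/gemma-3-4b-it-qat-q4_0-unquantized"

# Hypothetical config class: reuse the Gemma ONNX config for the gemma3
# architecture. Gemma-3 may need different normalized-config fields or
# dummy input generators, so this is only a starting point.
class Gemma3OnnxConfig(GemmaOnnxConfig):
    pass

config = AutoConfig.from_pretrained(model_id)
# The 4B checkpoint is multimodal; the text decoder settings appear to live
# in `text_config` (assumption based on the Gemma-3 config layout).
text_config = getattr(config, "text_config", config)

onnx_config = Gemma3OnnxConfig(config=text_config, task="text-generation")

main_export(
    model_id,
    output="gemma-3-4b/",
    task="text-generation",
    # Key per the custom-export guide linked in the error message.
    custom_onnx_configs={"model": onnx_config},
)

Even if this works, native support would be preferable, since a subclass like this silently inherits Gemma-1 input/output specs that may not match Gemma-3.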

Expected behavior

optimum-cli can export Gemma-3 models to ONNX.
