
[one-cmds] Allow to specialize shape of input/output tensors during ONNX-Circle conversion #13638

Draft

mbencer wants to merge 8 commits into Samsung:master from mbencer:mbencer/OnnxDynShape

Conversation

mbencer (Contributor) commented Aug 9, 2024:

This commit adds the possibility to fix the shapes of dynamic inputs and outputs in one-import-onnx, so a user can specialize a model for specific input/output shapes. In addition, infer_shapes from the onnx library can be used to support shape inference. To check shapes during tests, the circle-operator tool was extended.

ONE-DCO-1.0-Signed-off-by: Mateusz Bencer m.bencer@partner.samsung.com

Issue: #13636

mbencer requested a review from a team on August 9, 2024 14:25

mbencer changed the title from "[one-cmds] Add support of dynamic inputs/outputs to one-import-onnx" to "[one-cmds] Allow to specialize shape of input/output tensors during ONNX-Circle conversion" on Aug 12, 2024
type=str,
help=
'Set static shape for output tensors in comma-separated list format, like \'[1,2,3]\'.'
'If the model has multiple inputs, tensor names should be provided as well.'
Member commented:
Suggested change:
- 'If the model has multiple inputs, tensor names should be provided as well.'
+ 'If the model has multiple outputs, tensor names should be provided as well.'
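The help text above describes a comma-separated shape option but the PR's actual parser is not shown in this thread. As a rough illustration only, a value such as `"[1,2,3]"` (single tensor) or `"input_0[1,2,3],input_1[4,5]"` (named tensors) could be parsed like this — the `parse_shape_option` helper and the `name[dims]` grammar are assumptions, not the real one-import-onnx implementation:

```python
import re

def parse_shape_option(value):
    """Parse a shape option into {tensor_name: [dims]}.

    Hypothetical sketch: the exact grammar used by one-import-onnx is
    not shown in this thread. A single unnamed shape like "[1,2,3]"
    is returned under the key None.
    """
    shapes = {}
    # Match optional-name "[dims]" groups, e.g. "data[1,3,224,224]".
    for name, dims in re.findall(r'([\w.:/-]*)\[([\d,\s]+)\]', value):
        shapes[name or None] = [int(d) for d in dims.split(',')]
    return shapes
```

For example, `parse_shape_option("input_0[1,2,3],input_1[4,5]")` yields a two-entry map keyed by tensor name, which is the shape a map like `output_shapes_map` in the diff below would plausibly take.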

getattr(args, 'output_shapes'), output_shapes_map)
onnx_model = update_model_dims.update_inputs_outputs_dims(
onnx_model, input_shapes_map, output_shapes_map)
onnx_model = onnx.shape_inference.infer_shapes(onnx_model)
Member commented:
Shouldn't we use strict_mode and handle errors?

https://onnx.ai/onnx/api/shape_inference.html

mbencer (Author) commented:

Good observation, but IMHO infer_shapes from onnx is optional here. Even if something goes wrong, we still have the shape inference provided by ONE itself. My assumption was to treat it as an improvement for some edge cases ;-)

Member commented:
My point is about not just dying when something goes wrong, but recognizing and dealing with the situation. If someone is running a multi-step toolchain and it crashes, make sure that the person using the toolchain knows why it crashed and what to do about it.

Remember that not everyone using the toolchain is a brilliant programmer like you.

mbencer (Author) commented:
How about enabling strict_mode, catching the exception, and printing a warning if the shapes calculated by the onnx library were not applied?

> If someone is running a multi-step toolchain and it crashes, make sure that the person using the toolchain knows why it crashed and what to do about it.

I am also OK with stopping the conversion if onnx shape inference fails. You are right that I don't have much experience with the whole related toolchain ;)

Member commented:
> How about enabling strict_mode, catching the exception, and printing a warning if the shapes calculated by the onnx library were not applied?

Is it worth continuing execution of the toolchain after this? Is there any chance that the toolchain execution will end successfully?

If not, wouldn't it be better to display an error message explaining the problem and abort execution?

mbencer (Author) commented:
Completely agree. onnx.shape_inference.infer_shapes is not needed for the models this PR is intended to support.

My proposal is to remove it now and add it back separately if really needed ;)

lemmaa (Member) commented Aug 20, 2024:
> How about enabling strict_mode, catching the exception, and printing a warning if the shapes calculated by the onnx library were not applied?

> Is it worth continuing execution of the toolchain after this? Is there any chance that the toolchain execution will end successfully?

To clarify my understanding:

I agree with using strict_mode. However, when catching an exception, it is appropriate to terminate with an error rather than proceeding with a warning. Also, it is necessary to clearly inform the user of the cause of the error.

If you agree, please proceed carefully to avoid regression. :)
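The behavior agreed on above (use strict_mode, and on failure terminate with an error that tells the user the cause) might be sketched as follows. The wrapper name and message wording are assumptions; `onnx.shape_inference.infer_shapes(model, strict_mode=True)` is the real onnx API linked earlier in the thread, but here the inference call is injected as a callable so the sketch runs without onnx installed:

```python
def infer_shapes_or_abort(onnx_model, infer_fn):
    """Run ONNX shape inference and terminate with a clear error on failure.

    infer_fn is expected to be something like
        lambda m: onnx.shape_inference.infer_shapes(m, strict_mode=True)
    It is injected here so this sketch has no hard onnx dependency.
    """
    try:
        return infer_fn(onnx_model)
    except Exception as err:
        # Fail fast with an actionable message instead of letting a later
        # toolchain step crash for an unclear reason.
        raise SystemExit(
            f"one-import-onnx: ONNX shape inference failed: {err}. "
            "Check that the requested input/output shapes are "
            "consistent with the model.")
```

On success the inferred model is returned unchanged; on any inference error the conversion stops immediately with a message naming the failing step, rather than continuing with a warning.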

mbencer (Author) commented:
Ok, let's proceed in this way ;)

mbencer (Author) commented:
@lemmaa Is it OK for you now? Can I start creating PRs? ;-)

mbencer (Author) commented:
@seanshpark Can I start introducing this feature? Is the current design acceptable to you? ;-)

mbencer requested a review from lemmaa on August 13, 2024 15:00

mbencer requested a review from seanshpark on August 28, 2024 09:18
mbencer (Author) commented Mar 13, 2025:

Shape specialization can be done by circle-resizer

mbencer closed this Mar 13, 2025

mbencer deleted the mbencer/OnnxDynShape branch May 2, 2025 08:12
lemmaa (Member) commented Jun 10, 2025:

@mbencer , I think continuing with this PR might help resolve this problem. :)
Thank you in advance!

mbencer reopened this Jun 10, 2025

mbencer (Author) commented Jun 10, 2025:

> @mbencer , I think continuing with this PR might help resolve this problem. :) Thank you in advance!

Ok, I've reopened the branch and synchronized it with master. Let's continue the discussion ;-)
