[one-cmds] Allow specializing the shape of input/output tensors during ONNX-Circle conversion #13638
mbencer wants to merge 8 commits into Samsung:master
Conversation
This commit adds support for fixing the shape of dynamic inputs and outputs in one-import-onnx. As a result, a user can specialize a model for specific input/output shapes. In addition, infer_shapes from the onnx library can be used to improve shape inference support. To check shapes during tests, the circle-operator tool was extended. ONE-DCO-1.0-Signed-off-by: Mateusz Bencer <m.bencer@partner.samsung.com>
compiler/one-cmds/one-import-onnx
Outdated
type=str,
help=
'Set static shape for output tensors in comma-separated list format, like \'[1,2,3]\'.'
'If the model has multiple inputs, tensor names should be provided as well.'
Suggested change:
- 'If the model has multiple inputs, tensor names should be provided as well.'
+ 'If the model has multiple outputs, tensor names should be provided as well.'
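The help text above describes shapes given as a comma-separated list like '[1,2,3]', optionally prefixed with a tensor name when the model has several outputs. A minimal sketch of a parser for that format is below; the function name and the exact grammar accepted by one-import-onnx are assumptions for illustration, not the tool's actual code.

```python
import re

def parse_shapes_arg(arg):
    # Hypothetical parser for the documented format: a bare shape like
    # '[1,2,3]', or name-prefixed shapes such as 'out1:[1,2],out2:[3]'
    # when the model has multiple outputs. The real one-import-onnx
    # grammar may differ.
    shapes = {}
    for match in re.finditer(r'(?:(\w+):)?\[(\d+(?:,\d+)*)\]', arg):
        name = match.group(1)  # None when no tensor name was given
        shapes[name] = [int(d) for d in match.group(2).split(',')]
    return shapes
```

For example, `parse_shapes_arg('out1:[1,2],out2:[3]')` maps each named tensor to its dimension list, while a bare `'[1,2,3]'` yields a single unnamed entry.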
compiler/one-cmds/one-import-onnx
Outdated
getattr(args, 'output_shapes'), output_shapes_map)
onnx_model = update_model_dims.update_inputs_outputs_dims(
    onnx_model, input_shapes_map, output_shapes_map)
onnx_model = onnx.shape_inference.infer_shapes(onnx_model)
Good observation, but IMHO infer_shapes from onnx is optional here. Even if something goes wrong, we still have the shape inference provided by ONE itself. My assumption was to treat it as an improvement for some edge cases ;-)
My point is about not just dying when something goes wrong, but recognizing and dealing with the situation. If someone is running a multi-step toolchain and it crashes, make sure that the person using the toolchain knows why it crashed and what to do about it.
Remember that not everyone using the toolchain is a brilliant programmer like you.
How about enabling strict_mode, catching the exception, and printing a warning if the shapes calculated by the onnx lib were not applied?
> If someone is running a multi-step toolchain and it crashes, make sure that the person using the toolchain knows why it crashed and what to do about it.
I am also OK with stopping the conversion if onnx shape inference fails. You are right that I don't have much experience with the whole related toolchain ;)
There was a problem hiding this comment.
> How about enabling strict_mode, catching exception and printing warning if shapes calculated by onnx lib were not applied?
Is it worth continuing execution of the toolchain after this? Is there any chance that the toolchain execution will end successfully?
If not, wouldn't it be better to display an error message explaining the problem and abort execution?
Completely yes. onnx.shape_inference.infer_shapes is not needed for the models this PR is meant to support.
My proposal is to remove it now and add it separately if really needed ;)
> How about enabling strict_mode, catching exception and printing warning if shapes calculated by onnx lib were not applied?

> Is it worth continuing execution of the toolchain after this? Is there any chance that the toolchain execution will end successfully?
To clarify my understanding:
I agree with using strict_mode. However, when catching an exception, it is appropriate to terminate with an error rather than proceed with a warning. Also, the user must be clearly informed of the cause of the error.
If you agree, please proceed carefully to avoid regressions. :)
Ok, let's proceed this way ;)
@lemmaa Is it ok for you now? Can I start creating PRs? ;-)
@seanshpark Can I start introducing this feature? Is the current design acceptable to you? ;-)
Shape specialization can be done by circle-resizer
This commit adds the possibility to fix the shape of dynamic inputs and outputs in one-import-onnx. In addition, infer_shapes from the onnx library can be used to improve shape inference support. To check shapes during tests, the circle-operator tool was extended.
ONE-DCO-1.0-Signed-off-by: Mateusz Bencer m.bencer@partner.samsung.com
Issue: #13636