
Conversation


@nicklasb nicklasb commented Jan 13, 2025

Without this, nn.Hardswish() will be selected in ConvBNLayer, which bumps the required opset version to 14 and, in my case, breaks ESP-DL ONNX quantization, which only supports opset 13.

I would also suspect this is more correct in other cases: if one supplies a specific activation function, one presumably wants it used everywhere.

Also, it aligns with how the act-parameter is handled in other cases.

(accidentally first did the PR against a release branch)

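For reference, the gist of the fix can be sketched in plain Python. This is illustrative only, not the actual PaddleDetection code; get_act_fn and the class body are stand-ins for the real helpers:

```python
def get_act_fn(act):
    """Map an activation name to a callable (illustrative stand-in)."""
    acts = {
        "relu6": lambda x: min(max(x, 0.0), 6.0),
        "hardswish": lambda x: x * min(max(x + 3.0, 0.0), 6.0) / 6.0,
    }
    return acts[act]


class ConvBNLayer:
    """Sketch of the affected layer (conv/BN details omitted).

    Buggy version: `act` was accepted but hardswish was hard-coded,
    e.g.  self.act = get_act_fn("hardswish")  # ignores `act`
    Fixed version below: honor the configured activation.
    """

    def __init__(self, act="hardswish"):
        self.act_name = act
        self.act = get_act_fn(act)  # pass the configured act through
```

With the fix, configuring act=relu6 actually yields a ReLU6 activation, so the exported graph no longer pulls in the opset-14 HardSwish operator.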

paddle-bot bot commented Jan 13, 2025

Thanks for your contribution!


CLAassistant commented Jan 13, 2025

CLA assistant check
All committers have signed the CLA.


@simoberny simoberny left a comment


These changes effectively allow passing the configured activation function. Without them, setting act=relu6 in LCPAN still results in a hardswish activation function in the exported ONNX.
I successfully exported an ONNX network with opset 11 for ESP-DL (which does not support hardswish activation and opset 14) using these changes.
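For anyone verifying the same thing, the exported model's declared opset can be checked programmatically. A small sketch, assuming the `onnx` package and its standard ModelProto fields; the file path is an example:

```python
def max_default_opset(model):
    """Highest opset version declared for the default ONNX domain."""
    return max(op.version
               for op in model.opset_import
               if op.domain in ("", "ai.onnx"))


# Illustrative usage (requires the `onnx` package and an exported model):
#   import onnx
#   model = onnx.load("inference.onnx")
#   assert max_default_opset(model) <= 13, "ESP-DL needs opset <= 13"
```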


nicklasb commented Mar 2, 2025

@LokeZhou @jzhang533
This causes issues and probably breaks the output in several use cases; please include this change.

@nicklasb

@LokeZhou @jzhang533 Isn't this an obvious bug? I have to correct this each time I use a new version of PaddleDetection.

@simoberny

> @LokeZhou @jzhang533 Isn't this an obvious bug? I have to correct this each time I use a new version of PaddleDetection.

I agree

@nicklasb nicklasb requested a review from simoberny May 20, 2025 21:57

nicklasb commented May 20, 2025

@zhangyubo0722 , @SigureMo , can you please review this?
It is very annoying to have to patch this with every new version.

@simoberny Sorry, accidentally requested a review from you.



4 participants