Removing ReLU activations brings the mAP of the final quantized model to zero! #2

Open
ambitious-octopus opened this issue Jul 24, 2024 · 2 comments

ambitious-octopus commented Jul 24, 2024

We want to remove the ReLU operations in the Detect head. However, the mean Average Precision (mAP) of the final quantized model drops to zero if the ReLU activations are removed.
It would be preferable to export the original model without any extra activations introduced. Is there a specific explanation for this behavior? It would certainly be useful to understand it.
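For context, here is a minimal sketch of the change we are trying to make. The wrapper, its name, and its outputs are hypothetical (not the actual Detect implementation): the export path appends an extra ReLU to the box-coordinate branch, and we would like to export with that ReLU removed.

```python
import torch.nn as nn


class DetectExportWrapper(nn.Module):
    """Hypothetical export-time wrapper around a Detect head (names are illustrative only)."""

    def __init__(self, detect_head: nn.Module, clamp_coords: bool = True):
        super().__init__()
        self.detect_head = detect_head
        self.clamp_coords = clamp_coords  # True = keep the extra ReLU; False = the change we want
        self.relu = nn.ReLU()

    def forward(self, x):
        # Assume the head returns box coordinates (expected to be >= 0) and class scores.
        boxes, scores = self.detect_head(x)
        if self.clamp_coords:
            boxes = self.relu(boxes)  # extra ReLU appended only for quantization purposes
        return boxes, scores
```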

@Idan-BenAmi

Yes, there is a specific explanation for this, but first let me note that, thanks to some recent improvements we've added to our tools, this modification will no longer be necessary.

The reason for the additional ReLU, in short, was to bound the quantization range to be always non-negative (since we know the coordinates are non-negative). But as I mentioned, in the next MCT version this modification will not be necessary.
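For illustration, here is a minimal sketch (not MCT's actual quantizer) of why forcing the range to be non-negative helps an 8-bit quantizer: for a tensor that is known to be non-negative, a signed symmetric range wastes half of the integer codes on negative values that never occur, roughly doubling the rounding error compared with an unsigned range over the same data.

```python
import torch


def fake_quantize(x, qmin, qmax, scale):
    """Round to the grid defined by `scale`, clamp to the integer range, and dequantize."""
    q = torch.clamp(torch.round(x / scale), qmin, qmax)
    return q * scale


# Non-negative box coordinates, e.g. pixel values in a 640x640 image (after the extra ReLU)
coords = torch.rand(10_000) * 640.0

# Signed symmetric 8-bit: the range [-640, 640] spends half the codes on values that never occur
scale_signed = 640.0 / 127
err_signed = (coords - fake_quantize(coords, -128, 127, scale_signed)).abs().mean()

# Unsigned 8-bit over [0, 640]: twice the resolution for the same data
scale_unsigned = 640.0 / 255
err_unsigned = (coords - fake_quantize(coords, 0, 255, scale_unsigned)).abs().mean()

print(f"signed 8-bit MAE:   {err_signed:.3f}")   # roughly twice the unsigned error
print(f"unsigned 8-bit MAE: {err_unsigned:.3f}")
```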

Thanks
Idan

