We want to remove the ReLU operations in the Detect Head. However, we get zero values for the mean Average Precision (mAP) after the quantization process if the ReLU activations are removed.
It would be ideal to be able to export the original model without any extra activations introduced. Is there a specific explanation for this behavior? Understanding it would certainly be useful.
Yes, there is a specific explanation for this, but first let me note that due to some recent improvements we've added to our tools, this modification will no longer be necessary.
In short, the additional ReLU was there to bound the quantization range so it is always positive (since we know the coordinates are non-negative). But as I mentioned, in the next MCT version this modification will not be necessary.
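To illustrate why bounding the range to non-negative values matters, here is a minimal sketch of a uniform quantizer (not MCT's actual quantizer; the `quantize_uniform` helper, bit-width, and ranges are hypothetical, chosen only to show the effect). When the quantization range is symmetric around zero but the coordinates are always non-negative, half of the grid points are wasted on values that never occur, which roughly doubles the quantization error:

```python
import numpy as np

def quantize_uniform(x, n_bits, x_min, x_max):
    """Uniformly quantize x onto a grid of 2**n_bits levels spanning [x_min, x_max]."""
    levels = 2 ** n_bits - 1
    scale = (x_max - x_min) / levels
    q = np.clip(np.round((x - x_min) / scale), 0, levels)
    return q * scale + x_min

# Example coordinate outputs: always non-negative, as in the Detect Head.
coords = np.array([0.1, 5.3, 120.7, 300.0])

# Signed range (no ReLU to hint positivity): half the grid covers negative values.
signed = quantize_uniform(coords, n_bits=8, x_min=-300.0, x_max=300.0)

# Positive range (ReLU-bounded): the full grid covers [0, 300].
unsigned = quantize_uniform(coords, n_bits=8, x_min=0.0, x_max=300.0)

print("max error, signed range:  ", np.abs(coords - signed).max())
print("max error, positive range:", np.abs(coords - unsigned).max())
```

In this toy example the positive-range quantizer has roughly half the maximum error of the signed-range one, which is the benefit the extra ReLU was providing before the range could be constrained directly.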