
Keras BatchNorm folding bug fix #2584

Closed

Conversation

@quic-ssayanta (Contributor) commented Dec 1, 2023

Fix for: #2586

Signed-off-by: Sayanta Mukherjee <quic_ssayanta@quicinc.com>
… normal Keras layer without needing layer_input, so add a check to sort the layer_input only if the layer is NOT a Lambda layer

Signed-off-by: Sayanta Mukherjee <quic_ssayanta@quicinc.com>
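The commit message above describes gating the input sort on the layer type. As a minimal sketch of that idea (the function name, sort key, and dummy layer classes here are illustrative, not AIMET's actual implementation), the check might look like this:

```python
def maybe_sort_layer_inputs(layer, layer_inputs, sort_key=None):
    """Sort a layer's inputs only when the layer is NOT a Lambda layer.

    Hypothetical sketch of the fix described in the commit message: a
    Lambda layer behaves like a normal Keras layer and does not need its
    layer_input re-sorted, so its inputs are returned untouched.
    """
    if type(layer).__name__ == "Lambda":
        # Lambda layers skip the layer_input bookkeeping entirely.
        return list(layer_inputs)
    # For all other layers, sort the inputs as before.
    return sorted(layer_inputs, key=sort_key)


# Stand-in layer classes for illustration only.
class Lambda:
    pass


class Dense:
    pass
```

With these stand-ins, `maybe_sort_layer_inputs(Lambda(), [3, 1, 2])` leaves the order alone, while `maybe_sort_layer_inputs(Dense(), [3, 1, 2])` returns the sorted inputs.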
@quic-ssayanta quic-ssayanta self-assigned this Dec 1, 2023
@quic-ssayanta quic-ssayanta changed the title BatchNorm Fold Bug Fix Keras BatchNorm folding bug fix Dec 1, 2023
@quic-ssayanta quic-ssayanta deleted the Samsung_bn_fold_bug_fix branch December 1, 2023 14:20