BatchNormalization throws error when change_ordering=True #172

Open
oconnor127 opened this issue Oct 23, 2023 · 0 comments

@oconnor127
Hey,
I cannot convert a model that uses BatchNormalization layers because of dimension mismatches. Assume the input tensor for BN is 48x112x112 (CHW): the parameters (e.g. gamma) then have a size of 112, which is obviously wrong and should be 48 (the HWC shape would be 112x112x48).
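For context, here is a minimal, self-contained sketch (plain tf.keras, not the converter's code) showing why gamma's size follows the axis argument, so the axis must point at the channel dimension after reordering:

    import tensorflow as tf

    # Channels-first input (N, C, H, W): axis=1 is the channel dim.
    bn_chw = tf.keras.layers.BatchNormalization(axis=1)
    bn_chw.build((None, 48, 112, 112))
    print(bn_chw.gamma.shape)  # (48,) -- one gamma per channel

    # Channels-last input (N, H, W, C): axis=3 is the channel dim.
    bn_hwc = tf.keras.layers.BatchNormalization(axis=3)
    bn_hwc.build((None, 112, 112, 48))
    print(bn_hwc.gamma.shape)  # (48,) -- same channel count

If the axis is left at 1 after change_ordering=True has turned the input into HWC, gamma ends up with size 112 (the height), which is exactly the mismatch above.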
However, until this problem is fixed properly, I circumvent it by modifying onnx_to_keras() in converter.py. Under the existing check at line 229 (if layer['config'] and 'axis' in layer['config']) I added:

    if "epsilon" in layer['config']:
        layer['config']['axis'][0] = 3

to swap the axis when it is a BN layer (indicated by an "epsilon" or e.g. a "gamma_initializer" key in the config).
Just to let people who hit the same problem in the future know.
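For anyone applying the same workaround, here is a hedged sketch of where this could sit in the conversion pass (the loop structure and the keras_config variable name are my assumptions, not onnx2keras's verbatim code):

    # Sketch only: 'keras_config' and the surrounding loop are assumed,
    # not the library's actual code around line 229 of converter.py.
    for layer in keras_config['layers']:
        if layer['config'] and 'axis' in layer['config']:
            # BatchNormalization configs can be recognized by keys such
            # as 'epsilon' or 'gamma_initializer'.
            if 'epsilon' in layer['config']:
                # change_ordering=True reorders NCHW -> NHWC, so the
                # channel axis moves from 1 to 3.
                layer['config']['axis'][0] = 3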
