Hey,

I cannot convert a model that uses BatchNormalization layers because of dimension mismatches. Assume the input tensor for BN is 48x112x112 (CHW): the parameters (e.g. gamma) have a size of 112, which is obviously wrong and should be 48 (the HWC shape would be 112x112x48). Until this problem is fixed properly, I circumvent it by modifying `onnx_to_keras()` in `converter.py`. After the check in line 229 (`if layer['config'] and 'axis' in layer['config']`) I added

```python
if "epsilon" in layer['config']:
    layer['config']['axis'][0] = 3
```

to swap the axis when it's a BN layer (indicated by `"epsilon"`, `"gamma_initializer"`, etc. appearing in the config).

Just to let people with the same problem know in the future.
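For reference, the workaround can be sketched as a small standalone helper. This is an illustrative reconstruction, not code from onnx2keras: the `fix_bn_axis` function name is hypothetical, and the config dicts below only mimic the Keras BatchNormalization layer config (`axis`, `epsilon`) described above.

```python
def fix_bn_axis(layer):
    """Force a BatchNormalization layer config onto the channels-last axis.

    Hypothetical helper mirroring the patch described in the issue: a BN
    layer is recognized by BN-specific config keys such as 'epsilon'
    (or 'gamma_initializer'), and its normalization axis is rewritten
    to 3, i.e. the channel axis in NHWC layout.
    """
    config = layer.get('config')
    if config and 'axis' in config and 'epsilon' in config:
        config['axis'][0] = 3  # axis 3 == channels in HWC/NHWC
    return layer

# A BN config produced for a CHW model wrongly normalizes over axis 1.
bn_layer = {
    'class_name': 'BatchNormalization',
    'config': {'axis': [1], 'epsilon': 1e-5},
}
fix_bn_axis(bn_layer)
print(bn_layer['config']['axis'])  # [3]

# Layers without BN-specific keys are left untouched.
conv_layer = {'class_name': 'Conv2D', 'config': {'axis': [1]}}
fix_bn_axis(conv_layer)
print(conv_layer['config']['axis'])  # [1]
```

Keying on `'epsilon'` is a heuristic: it avoids matching on the class name but will rewrite any layer whose config happens to contain that key, so checking `class_name == 'BatchNormalization'` would be a stricter alternative.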