Add anchor_generator, box_matcher and non_max_suppression #1849
Conversation
Closed the previous PR and made changes per the new keras-hub rename.
Looks good! Minor comments. Also, a heads up: @fchollet is quickly adding bounding box support to core Keras, so soon we will be able to move our implementation there, though some porting might be required to the final API Francois cooked up.
@mattdangerw wherever necessary, if there are general arrays created inside a layer, which dtype should they use?
Not sure I totally get the question, but in general, layers should make sure they are doing computation with the compute dtype. This will often happen automatically: variables and inputs are automatically cast to the compute dtype in `call`. But when, say, making a new array of floats inside `call` for whatever reason, that should usually be done with `self.compute_dtype`.
Thanks @mattdangerw for the clarification. I wanted to know what to do when we declare new arrays in a layer's `call`; that's clear now.
Add the layers below for RetinaNet: anchor_generator, box_matcher, non_max_suppression.