
Question about saving the model weights in int #16

Open
reiinakano opened this issue Jan 5, 2019 · 3 comments

Comments

reiinakano commented Jan 5, 2019

I've been playing around a bit with the world models code and using it in my own applications, and I noticed that your model saving code converts the float weights to integers by multiplying by 10000. Do you have any special reason for saving to ints, and for the 10000 value? I find that in some cases my model (I've adapted the json saving code from here) behaves incorrectly when I reload it from the saved weights. I suspect it's because of this loss of precision, but I'm not 100% sure. Just wondering if you've experienced any weirdness from this in the past.
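(For a sense of how much precision the multiply-by-10000 scheme keeps, here is a tiny illustrative round trip with a made-up weight value; this is not taken from the actual world models code:)

```python
import numpy as np

w = np.float32(0.123456789)            # a made-up example weight
w_int = int(round(float(w) * 10000))   # -> 1235, the integer that gets stored
w_restored = w_int / 10000.0           # -> 0.1235 when loaded back
error = abs(float(w) - w_restored)     # ~4e-5, i.e. about 4 significant digits kept
```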

hardmaru (Contributor) commented Jan 9, 2019

To decrease the size of the model served over the web, I quantized the weights to roughly 4 significant digits by dumping them into the JSON as an INT16 array (base64-encoded). When loading them into tensorflow.js (well, a really old version of it, back when it was still called deeplearn.js), I convert the values back to floats by dividing each one by 10000.
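(A minimal sketch of that kind of scheme in Python; the helper names, the dict layout, and the numpy usage are illustrative assumptions, not the project's actual saving code:)

```python
import base64
import json
import numpy as np

SCALE = 10000  # keeps roughly 4 significant digits

def weights_to_json(weights):
    """Quantize float weights to int16 and store them as a base64 string inside JSON."""
    arr = np.asarray(weights, dtype=np.float64)
    # Note: values outside roughly +/-3.2767 would overflow int16 at this scale.
    quantized = np.round(arr * SCALE).astype(np.int16)
    return json.dumps({
        "shape": list(arr.shape),
        "b64": base64.b64encode(quantized.tobytes()).decode("ascii"),
    })

def weights_from_json(blob):
    """Decode the base64 int16 array and divide by the scale to recover floats."""
    obj = json.loads(blob)
    flat = np.frombuffer(base64.b64decode(obj["b64"]), dtype=np.int16)
    return flat.reshape(obj["shape"]).astype(np.float32) / SCALE
```

Decoding simply mirrors the divide-by-10000 step described above. Besides the ~5e-5 rounding error, any weight with magnitude larger than about 3.2767 would not fit in int16 at this scale, which is worth checking if a reloaded model misbehaves.)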

I think with the newer tools available in tf.js, there are now much better ways to serve models efficiently over the web, and I believe there are plans to add these quantization schemes to the library itself.

reiinakano (Author)

Ah, that makes sense. I'll stick with the original floats for stopping and restarting long-running training, then. Thanks!

hardmaru (Contributor) commented Jan 9, 2019 via email
