Remove Keras dependency (#2937)
* remove keras dependency
- relegate keras exports to FAQ: https://github.com/RaRe-Technologies/gensim/wiki/Recipes-&-FAQ#q13-how-do-i-export-a-trained-word2vec-model-to-keras

* remove forgotten notebook with keras

* Update CHANGELOG.md

Co-authored-by: Michael Penkov <m@penkov.dev>
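The FAQ recipe linked above amounts to building the Embedding layer by hand from the trained vectors. A minimal sketch of that replacement, assuming TensorFlow 2.x and its bundled tf.keras (the toy corpus and all names below are illustrative, not part of this commit); the removed helper passed weights=[weights] to the layer, while a Constant initializer achieves the same and also works on newer Keras:

from gensim.models import Word2Vec
from tensorflow.keras.initializers import Constant
from tensorflow.keras.layers import Embedding

# Toy corpus just to get a trained model; any Word2Vec/KeyedVectors works.
sentences = [["human", "interface", "computer"], ["survey", "user", "computer"]]
model = Word2Vec(sentences, vector_size=8, min_count=1, seed=1)

weights = model.wv.vectors  # (vocab_size, vector_size) numpy array
layer = Embedding(
    input_dim=weights.shape[0],
    output_dim=weights.shape[1],
    embeddings_initializer=Constant(weights),
    trainable=False,  # frozen, like the old train_embeddings=False default
)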
piskvorky and mpenkov authored Sep 10, 2020
1 parent 9cd72f5 commit bb947b3
Showing 5 changed files with 2 additions and 472 deletions.
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -11,6 +11,7 @@ This release contains a major refactoring.
* No more wheels for x32 platforms (if you need x32 binaries, please build them yourself).
(__[menshikh-iv](https://github.com/menshikh-iv)__, [#6](https://github.com/RaRe-Technologies/gensim-wheels/pull/6))
* Speed up random number generation in word2vec model (PR [#2864](https://github.com/RaRe-Technologies/gensim/pull/2864), __[@zygm0nt](https://github.com/zygm0nt)__)
+* Remove Keras dependency (PR [#2937](https://github.com/RaRe-Technologies/gensim/pull/2937), __[@piskvorky](https://github.com/piskvorky)__)

### :books: Tutorial and doc improvements

273 changes: 0 additions & 273 deletions docs/notebooks/keras_wrapper.ipynb

This file was deleted.

41 changes: 1 addition & 40 deletions gensim/models/keyedvectors.py
@@ -1588,55 +1588,16 @@ def intersect_word2vec_format(self, fname, lockf=0.0, binary=False, encoding='ut
self.vectors_lockf[self.get_index(word)] = lockf # lock-factor: 0.0=no changes
logger.info("merged %d vectors into %s matrix from %s", overlap_count, self.wv.vectors.shape, fname)

-    def get_keras_embedding(self, train_embeddings=False):
-        """Get a Keras 'Embedding' layer with weights set as the Word2Vec model's learned word embeddings.
-
-        Parameters
-        ----------
-        train_embeddings : bool
-            If False, the weights are frozen and stopped from being updated.
-            If True, the weights can/will be further trained/updated.
-
-        Returns
-        -------
-        `keras.layers.Embedding`
-            Embedding layer.
-
-        Raises
-        ------
-        ImportError
-            If `Keras <https://pypi.org/project/Keras/>`_ not installed.
-
-        Warnings
-        --------
-        Current method work only if `Keras <https://pypi.org/project/Keras/>`_ installed.
-
-        """
-        try:
-            from keras.layers import Embedding
-        except ImportError:
-            raise ImportError("Please install Keras to use this function")
-        weights = self.vectors
-
-        # set `trainable` as `False` to use the pretrained word embedding
-        # No extra mem usage here as `Embedding` layer doesn't create any new matrix for weights
-        layer = Embedding(
-            input_dim=weights.shape[0], output_dim=weights.shape[1],
-            weights=[weights], trainable=train_embeddings
-        )
-        return layer
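Call sites that previously did layer = model.wv.get_keras_embedding(train_embeddings=False) can switch to the manual construction sketched under the commit message above. A quick hypothetical sanity check that such a frozen layer reproduces the stored vectors (model and layer as in that sketch):

import numpy as np
import tensorflow as tf

idx = model.wv.key_to_index["computer"]           # row of the word in the weight matrix
looked_up = layer(tf.constant([idx])).numpy()[0]  # the forward pass also builds the layer
np.testing.assert_allclose(looked_up, model.wv["computer"], rtol=1e-6)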

    def _upconvert_old_d2vkv(self):
        """Convert a deserialized older Doc2VecKeyedVectors instance to latest generic KeyedVectors"""

        self.vocab = self.doctags
        self._upconvert_old_vocab()  # destroys 'vocab', fills 'key_to_index' & 'extras'
        for k in self.key_to_index.keys():
            old_offset = self.get_vecattr(k, 'offset')
            true_index = old_offset + self.max_rawint + 1
            self.key_to_index[k] = true_index
        del self.expandos['offset']  # no longer needed
-        if(self.max_rawint > -1):
+        if self.max_rawint > -1:
            self.index_to_key = list(range(0, self.max_rawint + 1)) + self.offset2doctag
        else:
            self.index_to_key = self.offset2doctag
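The remapping above merges two old doctag namespaces: plain-int doctags keep their raw values as indices 0 through max_rawint, and each string doctag is shifted past them to offset + max_rawint + 1. A standalone illustration with made-up values (max_rawint, offset2doctag and the tag names are hypothetical):

# Hypothetical pre-conversion state of an old Doc2VecKeyedVectors.
max_rawint = 2                    # int doctags 0, 1, 2 were used
offset2doctag = ["docA", "docB"]  # string doctags, stored by offset

# key_to_index after the loop: each string key moves past the int range.
key_to_index = {
    tag: offset + max_rawint + 1
    for offset, tag in enumerate(offset2doctag)
}
assert key_to_index == {"docA": 3, "docB": 4}

# index_to_key, as built by the `if self.max_rawint > -1` branch.
index_to_key = list(range(0, max_rawint + 1)) + offset2doctag
assert index_to_key == [0, 1, 2, "docA", "docB"]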
(Diffs for the 2 remaining changed files not loaded here.)
