Train floret embedding iteratively #11414
-
I'm trying to train a word embedding model using floret embeddings. However, I have the constraint that I must train the model in batches of data (the batches are large, ~1M notes/batch). I have already trained a Word2Vec model using Gensim. Now I would like to train a floret model iteratively, but I haven't found any documentation for this, and the code in the repository doesn't seem to support training from pre-trained embeddings. Could someone help me find out whether there is a way to train floret embeddings iteratively?
Replies: 1 comment
-
There currently isn't a way to do this. The original fastText code doesn't support it either, but you can find some related issues and open PRs with more info in the fastText GitHub repo.
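Since incremental training isn't supported, the usual workaround is to stream all batches into a single corpus file and run one full training pass over it. Below is a minimal sketch of that approach. The corpus-merging part is plain Python; the training call (commented out) assumes the `floret` Python package, whose `train_unsupervised` API mirrors the fastText Python bindings, and the parameter values shown are illustrative, not recommendations.

```python
# Workaround sketch: floret can't resume from pre-trained embeddings,
# so concatenate every batch into one corpus file and train once.
import os
import tempfile


def merge_batches(batches, corpus_path):
    """Write each batch of notes to a single one-note-per-line corpus file."""
    with open(corpus_path, "w", encoding="utf8") as out:
        for batch in batches:
            for note in batch:
                out.write(note.strip() + "\n")
    return corpus_path


corpus = merge_batches(
    [["first note here", "second note"], ["a note from batch two"]],
    os.path.join(tempfile.gettempdir(), "corpus.txt"),
)

# import floret  # pip install floret
# model = floret.train_unsupervised(
#     corpus,
#     model="cbow",
#     mode="floret",   # floret-specific hashed-subword mode
#     hashCount=2,
#     bucket=50000,
#     minn=3,
#     maxn=5,
# )
# model.save_model("notes.floret.bin")
```

If the batches arrive over time, you can keep appending to the corpus file and periodically retrain from scratch; the model itself cannot be updated in place.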