Word embeddings can be broadly classified into two categories:

Frequency-based Embedding: three types of vectors are generally encountered in this category:
- Count Vector
- TF-IDF Vector
- Co-Occurrence Vector
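The three frequency-based vectors above can be sketched in plain Python. This is a minimal illustration on a hypothetical two-document corpus (the corpus, window size, and function names are assumptions, not part of the original text):

```python
from collections import Counter
import math

# Toy corpus; each document is one string (hypothetical example data).
corpus = [
    "the cat sat on the mat",
    "the dog sat on the log",
]

# Fixed vocabulary built from the whole corpus.
vocab = sorted({w for doc in corpus for w in doc.split()})

def count_vector(doc):
    """Count Vector: raw frequency of each vocabulary word in the document."""
    counts = Counter(doc.split())
    return [counts[w] for w in vocab]

def tfidf_vector(doc):
    """TF-IDF Vector: term frequency weighted by inverse document frequency."""
    words = doc.split()
    counts = Counter(words)
    n_docs = len(corpus)
    vec = []
    for w in vocab:
        tf = counts[w] / len(words)
        df = sum(1 for d in corpus if w in d.split())
        idf = math.log(n_docs / df) if df else 0.0
        vec.append(tf * idf)
    return vec

def cooccurrence_matrix(window=2):
    """Co-Occurrence Vector: for each word, counts of neighbours seen
    within a fixed context window across the corpus."""
    index = {w: i for i, w in enumerate(vocab)}
    mat = [[0] * len(vocab) for _ in vocab]
    for doc in corpus:
        words = doc.split()
        for i, w in enumerate(words):
            lo, hi = max(0, i - window), min(len(words), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    mat[index[w]][index[words[j]]] += 1
    return mat
```

Note how words that appear in every document (here "the", "sat", "on") get a TF-IDF weight of zero, which is exactly the down-weighting of uninformative common words that motivates TF-IDF over raw counts.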
Prediction-based Embedding: Word2vec is not a single algorithm but a combination of two techniques:
- CBOW (Continuous Bag of Words)
- Skip-gram model
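The difference between the two techniques is easiest to see in the training pairs they generate: CBOW uses the surrounding context to predict the centre word, while skip-gram uses the centre word to predict each context word. A minimal sketch of pair generation (the function name, window size, and sample sentence are assumptions for illustration; a real Word2vec implementation then trains a shallow network on these pairs):

```python
def training_pairs(sentence, window=2):
    """Generate (context, target) pairs for CBOW and (target, context)
    pairs for skip-gram from one tokenized sentence."""
    words = sentence.split()
    cbow, skipgram = [], []
    for i, target in enumerate(words):
        lo, hi = max(0, i - window), min(len(words), i + window + 1)
        context = [words[j] for j in range(lo, hi) if j != i]
        cbow.append((context, target))      # CBOW: context -> centre word
        for c in context:
            skipgram.append((target, c))    # skip-gram: centre word -> context word
    return cbow, skipgram

cbow, sg = training_pairs("the quick brown fox")
```

For the same window, skip-gram produces more training pairs than CBOW (one per context word rather than one per position), which is one reason it tends to do better on rare words at the cost of slower training.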