Embeddings in Natural Language Processing
Authors: Mohammad Taher Pilehvar, Jose Camacho-Collados
Provides a high-level synthesis of the main embedding techniques in NLP, broadly construed. The book begins with conventional word vector space models and word embeddings (e.g., Word2Vec and GloVe), then moves on to other types of embeddings, such as word sense, sentence, document, and graph embeddings.
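A minimal sketch of the vector space idea the book opens with: words are represented as numeric vectors, and semantic similarity is measured geometrically, typically with cosine similarity. The three-dimensional vectors below are invented for illustration; real embeddings such as Word2Vec or GloVe typically have hundreds of dimensions learned from corpora.

```python
import math

def cosine(u, v):
    # Cosine similarity: dot product divided by the product of vector norms
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical 3-dimensional word vectors (illustrative values only)
vectors = {
    "king":  [0.90, 0.80, 0.10],
    "queen": [0.85, 0.75, 0.20],
    "apple": [0.10, 0.20, 0.90],
}

# Semantically related words should score higher than unrelated ones
print(cosine(vectors["king"], vectors["queen"]))  # high (near 1.0)
print(cosine(vectors["king"], vectors["apple"]))  # noticeably lower
```

In practice the same computation is applied to learned embeddings, where nearest neighbors in the vector space tend to be semantically related words.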