Word Embeddings

Word embeddings are numeric representations of words in a multi-dimensional vector space. They capture the semantic relationships between a word and the other words in the vocabulary.

In this video, you will explain what word embeddings are and why they are important. You will learn how to build them using a co-occurrence matrix of raw counts or frequencies.
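As a rough illustration of the co-occurrence idea, here is a minimal Python sketch that builds a matrix of raw counts from a toy corpus with a sliding context window. The corpus, the window size of 2, and the variable names are illustrative assumptions, not part of the lesson itself.

```python
from collections import defaultdict
import numpy as np

# Toy corpus and context window size (both are illustrative choices).
corpus = [
    "i like deep learning",
    "i like nlp",
    "i enjoy flying",
]
window = 2

# Build the vocabulary and assign each word an index.
tokens = [sentence.split() for sentence in corpus]
vocab = sorted({word for sent in tokens for word in sent})
index = {word: i for i, word in enumerate(vocab)}

# Fill the co-occurrence matrix with raw counts:
# counts[i, j] = how often word j appears within `window` words of word i.
counts = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for pos, word in enumerate(sent):
        lo, hi = max(0, pos - window), min(len(sent), pos + window + 1)
        for ctx_pos in range(lo, hi):
            if ctx_pos != pos:
                counts[index[word], index[sent[ctx_pos]]] += 1

# Each row of `counts` can serve as a vector for that word; dividing each
# row by its sum gives the frequency-based variant instead of raw counts.
frequencies = counts / counts.sum(axis=1, keepdims=True)
print(vocab)
print(counts[index["like"]])
```

In this sketch, the row for a word is its vector: words that appear in similar contexts end up with similar rows, which is the basic intuition behind count-based embeddings.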