Introduction to word embedding methods: word2vec, GloVe, and BERT.
Word embeddings are a way to let a computer learn the meanings of words. We start from the one-hot encoding representation and move on to an accidental yet very important finding behind word2vec. Then, we demonstrate the engineering thinking of GloVe. Finally, we briefly go through the cornerstone of current research on word embeddings, BERT. A small sketch of the contrast between one-hot and dense representations follows below.
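As a minimal sketch (assuming Python with NumPy; the toy vocabulary, embedding dimension, and random initialization are illustrative, not part of the original text), this contrasts a one-hot representation with the kind of dense embedding lookup that word2vec and GloVe learn:

```python
import numpy as np

# Toy vocabulary; in practice this would be built from a corpus.
vocab = ["king", "queen", "man", "woman"]
word_to_id = {w: i for i, w in enumerate(vocab)}

def one_hot(word: str) -> np.ndarray:
    """Sparse one-hot vector: all zeros except a 1 at the word's index."""
    vec = np.zeros(len(vocab))
    vec[word_to_id[word]] = 1.0
    return vec

# Dense embedding table (randomly initialized here; word2vec/GloVe learn
# these values so that related words end up close together).
embedding_dim = 3
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Dense embedding lookup: a low-dimensional real-valued vector."""
    return embeddings[word_to_id[word]]

print(one_hot("king"))  # length-|vocab| vector with a single 1; every pair of words is equally distant
print(embed("king"))    # length-3 dense vector; distances become meaningful once trained
```

The point of the contrast: one-hot vectors grow with the vocabulary and carry no notion of similarity, whereas dense embeddings can be compared with dot products or distances, which is what word2vec, GloVe, and (contextually) BERT each learn in their own way.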
Research Interest:
Work in Progress