Word Embeddings

Introduction to word embeddings methods: word2vec, GloVe, and BERT.

  • 0 Rating
  • 0 Reviews
  • 7 Students Enrolled
  • Free
Tags: P2P

Courselet Content

1 courselet • 2 courselet components

Requirements

  • None

Description

Word embeddings are a way to let a computer learn the meanings of words. We start from the one-hot encoding representation and move on to word2vec (w2v), an accidental yet very important discovery. Then we demonstrate the engineering thinking behind GloVe. Finally, we briefly go through BERT, the cornerstone of current research on word embeddings.
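
The step from one-hot vectors to dense vectors is the core idea the courselet builds on. Below is a minimal Python sketch, not taken from the courselet materials: the toy vocabulary and the random vectors are illustrative stand-ins for what word2vec or GloVe would actually learn. It contrasts the two representations and shows the cosine similarity commonly used to compare embeddings.

```python
# Minimal sketch: sparse one-hot vectors vs. dense embedding lookup.
# Vocabulary and vectors are made up for illustration only.
import numpy as np

vocab = ["king", "queen", "man", "woman"]          # toy vocabulary (assumed)
word_to_id = {w: i for i, w in enumerate(vocab)}

# One-hot: each word is a vocabulary-sized vector with a single 1.
# All such vectors are orthogonal, so they carry no notion of word similarity.
one_hot = np.eye(len(vocab))
print(one_hot[word_to_id["king"]])                 # [1. 0. 0. 0.]

# Dense embeddings: each word maps to a short real-valued vector.
# In word2vec/GloVe these values are learned from a corpus;
# here they are random placeholders.
rng = np.random.default_rng(0)
embedding_dim = 8
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def cosine(u, v):
    """Cosine similarity, the usual way to compare embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# With trained vectors, related words such as "king" and "queen" would score
# higher than unrelated pairs; with random vectors the number is meaningless.
print(cosine(embeddings[word_to_id["king"]], embeddings[word_to_id["queen"]]))
```

With trained embeddings the same lookup-and-compare pattern applies; only the values in the embedding matrix change.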

About the Instructor

Research Interests:

  • Robust hedging and trading methods
  • Cryptocurrency derivatives
  • Quantitative finance 

Work in Progress

  • Hedging Cryptos with Bitcoin Futures
  • Crypto-backed Peer-to-Peer Lending
  • On dynamics of CP2P Interest Rate