Word Embeddings

Introduction to word embedding methods: word2vec, GloVe, and BERT.

  • 0 Rating
  • 0 Reviews
  • 141 Students Enrolled
  • Free
Tags:
P2P



Courselet Content

2 components

Requirements

  • None

General Overview

Description

Word embeddings are a way to let computers learn the meanings of words. We start from the one-hot encoding representation and move on to an accidental yet very important finding of word2vec. Then, we demonstrate the engineering thinking behind GloVe. Finally, we briefly go through the cornerstone of current research on word embeddings, BERT.
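To make the progression concrete, here is a minimal sketch in Python, assuming the gensim library and a tiny made-up corpus (both are illustrative assumptions, not material from the courselet). It contrasts one-hot vectors, which treat every pair of distinct words as equally unrelated, with the dense vectors that word2vec's skip-gram model learns from context:

```python
import numpy as np
from gensim.models import Word2Vec

# Hypothetical toy corpus -- stands in for real training text.
sentences = [
    ["king", "rules", "the", "kingdom"],
    ["queen", "rules", "the", "kingdom"],
    ["king", "and", "queen", "wear", "crowns"],
]

# One-hot encoding: each word is a sparse vector with a single 1.
# Distinct words are orthogonal, so these vectors encode no similarity.
vocab = sorted({w for s in sentences for w in s})
one_hot = {w: np.eye(len(vocab))[i] for i, w in enumerate(vocab)}
print(one_hot["king"] @ one_hot["queen"])  # 0.0 by construction

# word2vec (skip-gram, sg=1): learn dense vectors by predicting context
# words, so words that appear in similar contexts get similar vectors.
model = Word2Vec(sentences, vector_size=16, window=2, min_count=1, sg=1, seed=1)
print(model.wv["king"].shape)                # (16,) -- dense, low-dimensional
print(model.wv.similarity("king", "queen"))  # cosine similarity; noisy on a
                                             # toy corpus, high on real text
```

On a realistic corpus, words used in similar contexts such as "king" and "queen" end up with a high learned similarity, something one-hot vectors can never express.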

Meet the Instructor

About the Instructor

Research Interest:

  • Robust hedging and trading methods
  • Cryptocurrency derivatives
  • Quantitative finance 

Work in Progress

  • Hedging Cryptos with Bitcoin Futures
  • Crypto-backed Peer-to-Peer Lending
  • On dynamics of CP2P Interest Rate