Word Embeddings Applications: Part 2
TL;DR Word embeddings lie at the heart of many generative AI (GenAI) applications. In Part 1 of this series, I showed how to construct word embeddings, starting from sparse embeddings built on the term-document and term-term matrices (or their weighted versions using TF-IDF and PMI) and moving on to learned dense embeddings, both static and contextual. This post is all about applications, so you can follow all of the calculations in this post.
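To make the recap concrete before diving into applications, here is a minimal sketch of the sparse-embedding idea from Part 1: build a TF-IDF-weighted term-document matrix and compare two words by the cosine similarity of their document vectors. It assumes scikit-learn, and the toy corpus and word choices are my own illustration, not taken from the post.

```python
# Sketch: sparse word embeddings from a TF-IDF weighted term-document matrix.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy corpus (illustrative only).
corpus = [
    "the cat chased the mouse",
    "the dog chased the cat",
    "the dog barked at the mailman",
]

vectorizer = TfidfVectorizer()
doc_term = vectorizer.fit_transform(corpus)   # shape: (n_docs, n_terms)
term_doc = doc_term.T                         # term-document matrix: one row per word

vocab = vectorizer.vocabulary_                # word -> row index after transpose
cat_vec = term_doc[vocab["cat"]]
dog_vec = term_doc[vocab["dog"]]

# Cosine similarity between the sparse vectors for "cat" and "dog".
print(cosine_similarity(cat_vec, dog_vec))
```

The same similarity computation carries over unchanged to the dense (static or contextual) embeddings used in the applications below; only the way the vectors are produced differs.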