C1. Implement Natural Language Processing for Word Embedding (Axel Sirota, 2022)

1. Course Overview
   1. Course Overview (00:00:00)
2. Why Process Text
   1. Why Should We Process Text (00:01:40)
   2. Demo - Introducing Globomantics Case Study (00:04:22)
   3. Getting the Best out of This Course (00:05:25)
   4. Version Check (00:08:03)
   5. Outline of the Course (00:10:18)
3. Training Word Representations
   1. How to Represent Words (00:11:47)
   2. First Embedding - One Hot Encoding (00:14:50)
   3. Demo - Using OHE (00:17:31)
   4. Demo - Analyzing Sentiment with OHE (00:27:52)
   5. Training Embeddings with Networks - CBOW and Skip-gram (00:34:59)
   6. Demo - Training a CBOW Embedding (00:40:35)
   7. Demo - Reanalyze Sentiment with a Network-based Embedding (00:53:42)
   8. What Comes Next (01:01:06)
4. Fine-tuning Word Representations
   1. Why Would We Fine Tune Existing Models (01:03:05)
   2. Demo - Fine Tuning Glove and FastText (01:05:13)
   3. Demo - Making Word Clusters (01:16:19)
   4. Demo - Debiase Word Embeddings (01:23:00)
   5. Key Takeaways and Tips (01:31:13)
   6. Where to Go Next (01:32:14)
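The sketches below illustrate, in broad strokes, several techniques named in the outline; they are independent minimal examples, not the course's own code or data. Module 3 opens with one-hot encoding (OHE), where each word becomes a vector of zeros with a single 1 at its vocabulary index. A minimal NumPy sketch, assuming a toy two-sentence corpus (the course's demos use the Globomantics case-study data instead):

```python
import numpy as np

# Toy corpus, purely illustrative.
corpus = ["the movie was great", "the plot was dull"]

# Vocabulary: every unique word gets one index.
vocab = sorted({word for sentence in corpus for word in sentence.split()})
word_to_index = {word: i for i, word in enumerate(vocab)}

def one_hot(word: str) -> np.ndarray:
    """All zeros except a single 1 at the word's vocabulary index."""
    vector = np.zeros(len(vocab))
    vector[word_to_index[word]] = 1.0
    return vector

print(one_hot("movie"))  # [0. 0. 1. 0. 0. 0.] with this vocabulary ordering
```

One-hot vectors grow with the vocabulary and encode no similarity between words, which is the motivation the outline's later lessons pick up with learned embeddings.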
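Module 3's CBOW lessons train an embedding by predicting a center word from the average of its context words' vectors. A minimal Keras sketch of that architecture, with illustrative hyperparameters (vocabulary size, embedding width, and window are assumptions, not the course's values):

```python
import tensorflow as tf

vocab_size = 5000     # illustrative; depends on the corpus
embedding_dim = 100   # illustrative embedding width
window = 2            # context words on each side of the center word

# Input: the 2*window context-word indices surrounding one center word.
context = tf.keras.Input(shape=(2 * window,), dtype="int32")
# One shared embedding table; this layer's weights become the word vectors.
embedded = tf.keras.layers.Embedding(vocab_size, embedding_dim)(context)
# CBOW averages the context embeddings into a single vector...
averaged = tf.keras.layers.GlobalAveragePooling1D()(embedded)
# ...and predicts the center word from it.
center = tf.keras.layers.Dense(vocab_size, activation="softmax")(averaged)

model = tf.keras.Model(context, center)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()

# After model.fit on (context_indices, center_index) pairs, the learned
# embedding matrix can be read back out:
# word_vectors = model.layers[1].get_weights()[0]  # (vocab_size, embedding_dim)
```

Skip-gram, also named in that lesson, inverts the objective: it predicts the context words from the center word.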
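For the fine-tuning demo in module 4, one plausible approach is continued training of a FastText model with gensim; this sketch trains a tiny model from scratch and then extends it with new in-domain sentences, all of which are invented here for illustration (the course may use different libraries, data, and pretrained files; pretrained GloVe vectors typically load as frozen KeyedVectors rather than trainable models):

```python
from gensim.models import FastText

# Illustrative in-domain sentences; the course fine-tunes on its own corpus.
sentences = [
    ["globomantics", "support", "tickets", "arrive", "daily"],
    ["customers", "praise", "the", "new", "dashboard"],
]

# A small FastText model trained from scratch; fine-tuning a pretrained model
# works the same way once it is loaded (e.g. via
# gensim.models.fasttext.load_facebook_model on a .bin file).
model = FastText(vector_size=50, window=3, min_count=1)
model.build_vocab(sentences)
model.train(sentences, total_examples=len(sentences), epochs=10)

# Continued training on new text: extend the vocabulary, then train again.
new_sentences = [["dashboard", "latency", "improved", "after", "the", "patch"]]
model.build_vocab(new_sentences, update=True)
model.train(new_sentences, total_examples=len(new_sentences), epochs=10)

print(model.wv.most_similar("dashboard", topn=3))
```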
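The "Making Word Clusters" demo presumably groups words whose embeddings lie close together. A minimal scikit-learn sketch, assuming a stand-in random embedding matrix in place of real trained vectors:

```python
import numpy as np
from sklearn.cluster import KMeans

# Stand-in embedding matrix; in practice these would be the trained or
# fine-tuned word vectors from the previous steps.
rng = np.random.default_rng(0)
words = ["cat", "dog", "fish", "car", "bus", "train"]
vectors = rng.normal(size=(len(words), 50))

# Group words whose vectors lie close together in embedding space.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(vectors)

for word, label in zip(words, kmeans.labels_):
    print(f"{word}: cluster {label}")
```

With real embeddings, semantically related words (animals vs. vehicles here) tend to land in the same cluster.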
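The debiasing demo likely follows the general idea of hard debiasing (Bolukbasi et al., 2016): estimate a bias direction from a definitional word pair and remove each word's projection onto it. A minimal sketch with stand-in random vectors; the word list and single-pair bias estimate are simplifying assumptions, not the course's procedure:

```python
import numpy as np

# Stand-in vectors; in practice these come from a trained embedding.
rng = np.random.default_rng(0)
emb = {w: rng.normal(size=50) for w in ["he", "she", "doctor", "nurse"]}

# Estimate a bias direction from a definitional pair, then remove each
# word's component along it (the "neutralize" step of hard debiasing).
bias = emb["he"] - emb["she"]
bias /= np.linalg.norm(bias)

def neutralize(vector: np.ndarray, direction: np.ndarray) -> np.ndarray:
    """Subtract the component of `vector` along the (unit) `direction`."""
    return vector - np.dot(vector, direction) * direction

for word in ["doctor", "nurse"]:
    emb[word] = neutralize(emb[word], bias)
    # After neutralizing, the word is numerically orthogonal to the bias axis.
    print(word, round(float(np.dot(emb[word], bias)), 6))
```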