CS224W: Machine Learning with Graphs | 2021 | Lecture 3.2 - Random Walk Approaches for Node Embeddings
For more information about Stanford’s Artificial Intelligence professional and graduate programs, visit:
Jure Leskovec
Computer Science, PhD
In this video we look at a more effective similarity function: the probability of node co-occurrence in random walks on the graph. We introduce the intuition behind random walks, the objective function we will be optimizing, and how we can perform the optimization efficiently. We then introduce node2vec, which combines BFS-like and DFS-like exploration to generalize the concept of random walks.
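The biased walk described in the lecture can be sketched in a few lines. This is a minimal illustration, not the course's reference implementation: it assumes an unweighted, undirected graph stored as an adjacency dict, and uses the node2vec return parameter p (bias toward revisiting the previous node, BFS-like) and in-out parameter q (bias toward moving farther away, DFS-like).

```python
import random

def node2vec_walk(adj, start, length, p=1.0, q=1.0, rng=random):
    """One node2vec-style biased random walk of the given length.

    adj: dict mapping node -> set of neighbor nodes (undirected graph).
    p:   return parameter; small p makes revisiting the previous node likely.
    q:   in-out parameter; small q pushes the walk outward (DFS-like).
    """
    walk = [start]
    while len(walk) < length:
        cur = walk[-1]
        neighbors = list(adj[cur])
        if not neighbors:          # dead end: stop early
            break
        if len(walk) == 1:         # first step is a uniform choice
            walk.append(rng.choice(neighbors))
            continue
        prev = walk[-2]
        weights = []
        for nbr in neighbors:
            if nbr == prev:            # distance 0 from prev: weight 1/p
                weights.append(1.0 / p)
            elif nbr in adj[prev]:     # distance 1 from prev: weight 1
                weights.append(1.0)
            else:                      # distance 2 from prev: weight 1/q
                weights.append(1.0 / q)
        walk.append(rng.choices(neighbors, weights=weights, k=1)[0])
    return walk
```

Walks collected this way would then be fed to a skip-gram-style objective (with negative sampling, as in the lecture) to learn the node embeddings.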
To follow along with the course schedule and syllabus, visit:
0:00 Introduction
0:12 Notation
3:27 Random-Walk Embeddings
4:34 Why Random Walks?
5:18 Unsupervised Feature Learning
6:07 Feature Learning as Optimization
7:12 Random Walk Optimization
11:07 Negative Sampling
13:37 Stochastic Gradient Descent
15:59 Random Walks: Summary
16:49 How should we randomly walk?
17:29 Overview of node2vec
19:41 BFS vs. DFS
19:57 Interpolating BFS and DFS
20:52 Biased Random Walks
23:47 node2vec algorithm
24:50 Other Random Walk Ideas
25:46 Summary so far