061: Interpolation, Extrapolation and Linearisation (Prof. Yann LeCun, Dr. Randall Balestriero)
We are now sponsored by Weights and Biases! Please visit our sponsor link:
Yann LeCun thinks it is specious to say that neural network models are interpolating, because in high dimensions everything is extrapolation. Recently, Dr. Randall Balestriero, Dr. Jérôme Pesenti and Prof. Yann LeCun released their paper "Learning in High Dimension Always Amounts to Extrapolation". This discussion has completely changed how we think about neural networks and their behaviour.
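A minimal sketch (our illustration, not from the episode or the paper's code) of the definition at stake: a new sample "interpolates" only if it lies inside the convex hull of the training set, and the fraction of new samples inside that hull collapses as the dimension grows. The hull test is phrased as a linear-programming feasibility check.

```python
# Assumption: interpolation = membership in the convex hull of the training data.
import numpy as np
from scipy.optimize import linprog

def in_convex_hull(x, X):
    """True if x is a convex combination of the rows of X (LP feasibility)."""
    n = X.shape[0]
    # Find lambda >= 0 with X.T @ lambda = x and sum(lambda) = 1.
    A_eq = np.vstack([X.T, np.ones(n)])
    b_eq = np.concatenate([x, [1.0]])
    res = linprog(c=np.zeros(n), A_eq=A_eq, b_eq=b_eq,
                  bounds=(0, None), method="highs")
    return res.success

rng = np.random.default_rng(0)
for d in (2, 4, 8, 16, 32):
    X = rng.standard_normal((1000, d))  # 1000 Gaussian "training" points
    hits = sum(in_convex_hull(rng.standard_normal(d), X) for _ in range(200))
    print(f"d={d:2d}: fraction of new samples inside the hull = {hits / 200:.2f}")
```

Even with 1000 training points, the in-hull fraction drops rapidly with dimension, which is the sense in which learning in high dimension "always amounts to extrapolation".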
[00:00:00] Pre-intro
[00:11:58] Intro Part 1: On linearisation in NNs
[00:28:17] Intro Part 2: On interpolation in NNs
[00:47:45] Intro Part 3: On the curse
[00:57:41] LeCun intro
[00:58:18] Why is it important to distinguish between interpolation and extrapolation?
[01:03:18] Can DL models reason?
[01:06:23] The ability to change your mind
[01:07:59] Interpolation - LeCun steelman argument against NNs
[01:14:11] Should extrapolation be over all dimensions
[01:18:54] On the morphing of MNIST digits, is that interpolation?