Theoretical Deep Learning. The Information Bottleneck method. Part 2
In this class we continue discussing how we can use the information bottleneck framework to study neural networks. In particular, we learn how to prevent NNs from overfitting by introducing a specific penalty term into the loss function, and we derive an objective very similar to the evidence lower bound (ELBO) from Bayesian statistics.
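The penalty term mentioned above comes from the information bottleneck Lagrangian. A brief sketch (the notation is assumed for illustration, not taken verbatim from the lecture):

```latex
% Information bottleneck objective: compress the input X into a
% representation Z while preserving information about the target Y;
% \beta controls the compression--prediction trade-off.
\mathcal{L}_{\mathrm{IB}} = I(Z;\, Y) \;-\; \beta\, I(X;\, Z)

% A variational bound on this objective (per training example) is a
% cross-entropy term plus a KL penalty on the encoder:
\mathbb{E}_{p(z \mid x)}\!\left[ -\log q(y \mid z) \right]
  \;+\; \beta\, \mathrm{KL}\!\left( p(z \mid x) \,\|\, r(z) \right)
```

Here q(y|z) is a variational decoder and r(z) an assumed prior over representations; with β = 1 the second expression has the same form as a negative ELBO, which is the connection to Bayesian statistics discussed in the class.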
Find out more:
Our open-source framework to develop and deploy conversational assistants:
Related videos:
- Theoretical neuroscience and deep learning theory - Surya Ganguli (01:33:01, 8 years ago)
- Theoretical Deep Learning. Spin-glass model (01:20:37, 6 years ago)
- A Theoretical Framework for Deep Learning Networks (Tomaso Poggio, MIT) (00:29:20, 8 years ago)
- Sanjeev Arora: Toward Theoretical Understanding of Deep Learning (00:31:06, 5 years ago)
- Theoretical Deep Learning #2: Worst-case bounds. Part 4 (00:29:00, 5 years ago)
- Theoretical Deep Learning #2: PAC-Bayesian bounds. Part 5 (01:07:56, 5 years ago)
- Theoretical Deep Learning. The Information Bottleneck method. Part 1 (01:37:11, 6 years ago)
- Theoretical Deep Learning #2: Worst-case bounds. Part 2 (01:15:16, 5 years ago)
- Theoretical Deep Learning #2: Worst-case bounds. Part 3 (01:26:21, 5 years ago)
- Theoretical Deep Learning. The Information Bottleneck method. Part 2 (01:37:56, 6 years ago)
- One of Three Theoretical Puzzles: Generalization in Deep Networks (01:04:42, 5 years ago)
- Theoretically Speaking — Deep Nets and the Primate Visual System (01:29:46, 3 years ago)
- IACS Seminar: Deep Learning: Theoretical Building Blocks From the Study of Wide Networks, 9/18/20 (01:26:26, 4 years ago)
- Sanjeev Arora: Toward Theoretical Understanding of Deep Learning (ICML 2018 tutorial) (02:13:57, 6 years ago)
- Talk: Theoretical Aspects of Gradient Methods in Deep Learning (00:20:41, 3 years ago)
- Margins, perceptrons, and deep networks - Matus Telgarsky (01:22:27, 5 years ago)
- Naftali Tishby: Deep Neural Networks: An Information Theoretic Perspective (00:49:12, 9 years ago)
- Geometric deep learning for functional protein design - Michael Bronstein (01:02:45, 5 years ago)
- Theoretical Foundations of Graph Neural Networks (01:12:20, 4 years ago)
- Understanding Deep Neural Networks: From Generalization to Interpretability - Gitta Kutyniok (01:02:04, 5 years ago)
- Some Statistical Results on Deep Learning: Interpolation, Optimality and Sparsity - Guang Cheng (01:01:19, 5 years ago)
- Gitta Kutyniok: "An Information Theoretic Approach to Validate Deep Learning-Based Algorithms" (00:46:08, 5 years ago)