--------------------------------------------------------------------------------
An Infinite-Feature Extension for Bayesian ReLU Nets That Fixes Their Asymptotic Overconfidence
Agustinus Kristiadi, Matthias Hein, and Philipp Hennig
Advances in Neural Information Processing Systems (NeurIPS) 2021
--------------------------------------------------------------------------------
► Paper:
► Code:
A Bayesian treatment can mitigate overconfidence in ReLU nets around the training data. But far away from them, ReLU Bayesian neural networks (BNNs) can still underestimate uncertainty and thus be asymptotically overconfident. This issue arises since the output variance of a BNN with finitely many features is quadratic in the distance from the data region. Meanwhile, Bayesian linear models with ReLU features converge, in the infinite-width limit, to a particular Gaussian process (GP) with a variance that grows cubically, so that no asymptotic overconfidence can occur.
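The quadratic-versus-cubic contrast can be illustrated with a toy one-dimensional Bayesian linear model using ReLU features max(0, x - c). The sketch below is only an illustration of that growth-rate argument, not the paper's implementation; the function names, threshold placement, and constants are assumptions chosen for the example.

```python
# Illustrative sketch (not the paper's code): compare the prior output variance
# of a finite ReLU-feature Bayesian linear model with the variance of the
# infinite-feature (cubic-spline-like) limit kernel.
import numpy as np

def finite_relu_variance(x, thresholds, prior_var=1.0):
    """Prior output variance with finitely many ReLU features:
    prior_var * sum_i max(0, x - c_i)^2.
    Far from the thresholds this behaves like (num. features) * x^2."""
    feats = np.maximum(0.0, x - thresholds)
    return prior_var * np.sum(feats ** 2)

def infinite_relu_variance(x, c_min=-1.0):
    """Variance in the infinite-feature limit with ReLU thresholds spread
    with unit density over [c_min, infinity): the integral of (x - c)^2 dc
    equals (x - c_min)^3 / 3, i.e. it grows cubically in x."""
    return max(0.0, x - c_min) ** 3 / 3.0

# Finite features concentrated near the "data region" [-1, 1] (assumed setup).
thresholds = np.linspace(-1.0, 1.0, 100)
for x in [2.0, 10.0, 100.0, 1000.0]:
    v_fin = finite_relu_variance(x, thresholds)
    v_inf = infinite_relu_variance(x)
    print(f"x = {x:7.1f}   finite-feature var ~ {v_fin:.3e}   "
          f"infinite-feature var ~ {v_inf:.3e}")
# The finite-feature variance scales like x^2, the infinite-feature one like x^3,
# so the latter dominates far from the data -- the property the paper exploits
# against asymptotic overconfidence.
```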