Gradients are Not All You Need (Machine Learning Research Paper Explained)
#deeplearning #backpropagation #simulation
More and more systems are being made differentiable, meaning that exact gradients of their dynamics can be computed via backpropagation. While this development has led to many advances, there are distinct situations where backpropagation is a very bad idea. This paper characterizes a few such systems in the domain of iterated dynamical systems, often including some source of stochasticity, resulting in chaotic behavior. In these systems, black-box gradient estimators often work better than computing the gradients exactly.
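To see why exact gradients can misbehave in chaotic iterated systems, here is a minimal sketch (my own illustration, not code from the paper) using the logistic map x_{t+1} = r·x_t·(1−x_t), which is chaotic for r near 4. The state stays bounded in [0, 1], yet the exact derivative dx_T/dr, computed by unrolling the chain rule exactly as backpropagation would, grows roughly exponentially with the number of iterations:

```python
def iterate_with_gradient(r, x0, steps):
    """Iterate the logistic map and carry the exact derivative dx_T/dr
    along via forward-mode chain rule (equivalent to what backprop
    through the unrolled system would compute)."""
    x, dx_dr = x0, 0.0
    for _ in range(steps):
        # d/dr [r * x * (1 - x)] = x*(1-x) + r*(1-2x) * dx/dr
        dx_dr = x * (1.0 - x) + r * (1.0 - 2.0 * x) * dx_dr
        x = r * x * (1.0 - x)
    return x, dx_dr

if __name__ == "__main__":
    # r = 3.9 puts the map in its chaotic regime
    for steps in (10, 50, 100):
        x, g = iterate_with_gradient(r=3.9, x0=0.5, steps=steps)
        print(f"T={steps:3d}  x_T={x:.4f}  |dx_T/dr|={abs(g):.3e}")
```

The multiplicative factor r·(1−2x) has magnitude greater than 1 on average in the chaotic regime (a positive Lyapunov exponent), so the exact gradient blows up even though the loss landscape it describes is perfectly bounded. This is the core failure mode the paper discusses, and why a black-box estimator that only queries function values can be the better choice.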
OUTLINE:
0:00 - Foreword
1:15 - Intro & Overview
3:40 - Backpropagation through iterated systems
12:10 - Connection to the spectrum of the Jacobian
15:35 - The Reparameterization Trick
21:30 - Problems of reparameterization
26:35 - Example 1: Policy Learning in Simulation
33:05 - Example 2: Meta-Learning Optimizers
36:15 - Example 3: Disk packing
37:45 - Analysis of Jacobians
40:20 - What can be done?
45:40 - Just use Black-Box Methods