Backpropagation calculus | Chapter 4, Deep learning
Help fund future projects:
An equally valuable form of support is to simply share some of the videos.
Special thanks to these supporters:
Written/interactive form of this series:
This one is a bit more symbol-heavy, and that’s actually the point. The goal here is to express, in somewhat more formal terms, the intuition for how backpropagation works that was presented in part 3 of the series, hopefully providing a connection between that video and other texts/code you come across later.
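As a rough illustration of the chain-rule bookkeeping the video walks through, here is a minimal sketch for the simplest possible network: one neuron per layer, with z = w·a_prev + b, a = σ(z), and cost C = (a − y)². The function and variable names are illustrative, not taken from the video; the analytic gradient is checked against a finite-difference estimate.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost_and_grads(w, b, a_prev, y):
    """One neuron per layer: z = w*a_prev + b, a = sigmoid(z), C = (a - y)^2."""
    z = w * a_prev + b
    a = sigmoid(z)
    C = (a - y) ** 2
    # Chain rule, read right to left:
    #   dC/dw = (dz/dw) * (da/dz) * (dC/da)
    dC_da = 2.0 * (a - y)              # derivative of the squared cost
    da_dz = a * (1.0 - a)              # derivative of the sigmoid
    dz_dw = a_prev                     # z depends linearly on w
    dz_db = 1.0                        # z depends linearly on b
    dC_dw = dz_dw * da_dz * dC_da
    dC_db = dz_db * da_dz * dC_da
    return C, dC_dw, dC_db

# Sanity check: compare the chain-rule gradient for w against a
# finite-difference estimate of dC/dw.
w, b, a_prev, y = 0.5, -0.3, 0.8, 1.0
C, dC_dw, dC_db = cost_and_grads(w, b, a_prev, y)
eps = 1e-6
C_plus, _, _ = cost_and_grads(w + eps, b, a_prev, y)
numeric_dC_dw = (C_plus - C) / eps
print(dC_dw, numeric_dC_dw)  # the two estimates should agree closely
```

The same three-factor product generalizes to full layers: each weight's sensitivity is a product of local derivatives along the path from that weight to the cost, which is exactly what the video's notation tracks.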
For more on backpropagation:
Music by Vincent Rubinetti:
------------------
Video timeline
0:00 - Introduction
0:38 - The Chain Rule in networks
3:56 - Computing relevant derivatives
4:45 - What do the derivatives mean?
5:39 - Sensitivity to weights/biases
6:42 - Layers with additional neurons
9:13 - Recap
------------------
3blue1brown is a channel about animating math, in all senses of the word animate. And you know the drill with YouTube, if you want to stay posted on new videos, subscribe, and click the bell to receive notifications (if you’re into that):
If you are new to this channel and want to see more, a good place to start is this playlist:
Various social media stuffs:
Website:
Twitter:
Patreon:
Facebook:
Reddit: