Dive Into Deep Learning, Lecture 2: PyTorch Automatic Differentiation (requires_grad and backward())
In this video, we discuss PyTorch’s automatic differentiation engine, autograd, which powers neural network training via stochastic gradient descent. You will gain a conceptual understanding of how autograd computes the gradients of multivariable functions. We start with derivatives, partial derivatives, and the definition of the gradient. We then show how to compute gradients by setting requires_grad=True and calling the backward() method, covering the classes and functions that implement automatic differentiation of arbitrary scalar-valued and non-scalar-valued functions, and we also discuss the Jacobian matrix in PyTorch. Differentiation is a crucial step in nearly all machine learning and deep learning optimization algorithms. While the calculations for these derivatives are straightforward, working out the updates by hand quickly becomes painful and tedious.
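As a minimal sketch of the scalar-valued workflow described above (illustrative code, not the lecture's exact listing): we mark a tensor with requires_grad=True so autograd records operations on it, build a scalar output, and call backward() to populate the .grad field.

```python
import torch

# Mark x so that autograd records operations on it.
x = torch.arange(4.0, requires_grad=True)   # x = [0., 1., 2., 3.]

# A scalar-valued function of x: f(x) = sum_i x_i^2.
y = (x ** 2).sum()

# Backpropagate: fills x.grad with df/dx = 2 * x.
y.backward()
print(x.grad)                               # tensor([0., 2., 4., 6.])
```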
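For non-scalar-valued functions, backward() requires a gradient argument v and computes the vector-Jacobian product vᵀJ rather than the Jacobian J itself; the full Jacobian can be obtained with torch.autograd.functional.jacobian. Another minimal sketch, again assumed code for illustration:

```python
import torch
from torch.autograd.functional import jacobian

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = x * x                       # non-scalar output, y_i = x_i^2

# backward() on a non-scalar output needs a vector v; it computes v^T J.
v = torch.ones_like(y)
y.backward(gradient=v)
print(x.grad)                   # tensor([2., 4., 6.]) = v^T J

# The full Jacobian of f(x) = x * x is diagonal with entries 2 * x_i.
J = jacobian(lambda t: t * t, torch.tensor([1.0, 2.0, 3.0]))
print(J)                        # tensor([[2., 0., 0.],
                                #         [0., 4., 0.],
                                #         [0., 0., 6.]])
```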
#Autograd #PyTorch #DeepLearning