Lecture 2 of the online course Deep Learning Systems: Algorithms and Implementation.
This lecture is a refresher on the basic principles of (supervised) machine learning, as exemplified by the softmax regression algorithm. We walk through the derivation of the softmax regression method and of stochastic gradient descent, which is used to train this class of models.
Sign up for the course for free at .
Contents:
00:00 - Introduction
01:08 - Machine learning and data-driven programming
05:34 - Three ingredients of a machine learning algorithm
08:40 - Multi-class classification setting
12:04 - Linear hypothesis function
16:52 - Matrix batch notation
22:34 - Loss function #1: classification error
26:44 - Loss function #2: softmax / cross-entropy loss
35:28 - The softmax regression optimization problem
39:16 - Optimization: gradient descent
50:35 - Stochastic gradient descent
55:26 - The gradient of the softmax objective
1:08:16 - The slide I’m embarrassed to include...
1:16:49 - Putting it all together
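
As a companion to the "putting it all together" segment, here is a minimal NumPy sketch of softmax regression trained with minibatch SGD, using the gradient derived in the lecture, ∇Θ = Xᵀ(softmax(XΘ) − I_y)/m. The function name, hyperparameters, and synthetic data are illustrative, not taken from the lecture itself.

```python
import numpy as np

def softmax_regression_sgd(X, y, num_classes, lr=0.5, batch_size=50, epochs=10):
    """Train a linear softmax (multinomial logistic) classifier with SGD.

    X: (m, n) inputs; y: (m,) integer labels in [0, num_classes).
    Returns the weight matrix Theta of shape (n, num_classes).
    """
    m, n = X.shape
    theta = np.zeros((n, num_classes))
    for _ in range(epochs):
        for i in range(0, m, batch_size):
            Xb, yb = X[i:i + batch_size], y[i:i + batch_size]
            # Linear hypothesis: logits = X @ Theta, normalized via softmax
            logits = Xb @ theta
            Z = np.exp(logits - logits.max(axis=1, keepdims=True))
            Z /= Z.sum(axis=1, keepdims=True)
            # Gradient of the cross-entropy loss: X^T (Z - I_y) / batch size
            Iy = np.zeros_like(Z)
            Iy[np.arange(Xb.shape[0]), yb] = 1
            theta -= lr * (Xb.T @ (Z - Iy)) / Xb.shape[0]
    return theta

# Tiny synthetic 3-class problem: well-separated Gaussian clusters,
# with a ones column appended so the linear model has a bias term.
rng = np.random.default_rng(0)
centers = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])
y = rng.integers(0, 3, size=300)
X = centers[y] + rng.normal(scale=0.5, size=(300, 2))
X = np.hstack([X, np.ones((300, 1))])

theta = softmax_regression_sgd(X, y, num_classes=3)
err = np.mean((X @ theta).argmax(axis=1) != y)  # loss #1: classification error
print(f"training error: {err:.3f}")
```

On clusters this well separated, the learned linear classifier drives the classification error close to zero within a few epochs.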