Long Short-Term Memory (LSTM) - Sepp Hochreiter | Podcast #76
🧠 Science Academy:
🌎 Website:
📝 Sepp’s Diploma Thesis [German]: ~juergen/
🌎 More Info about Sepp: ~juergen/
📝 Deep Learning: Our Miraculous Year 1990-1991:
Sepp Hochreiter heads the Institute for Machine Learning, the LIT AI Lab, and the deep learning center at the Johannes Kepler University of Linz in Austria, and is director of the Institute of Advanced Research in Artificial Intelligence (IARAI). Sepp Hochreiter is a pioneer of Deep Learning.
His contributions to the Long Short-Term Memory (LSTM) and his analysis of the vanishing gradient problem are viewed as milestones and key moments in the history of both machine learning and Deep Learning. These seminal works laid the foundations for, and were the starting points of, what later became known as Deep Learning.
LSTM has been overwhelmingly successful in handwriting recognition and generation, language modeling and identification, machine translation, speech recognition, audio analysis, as well as the analysis, annotation, and description of video data.
ONLINE PRESENCE
================
🌍 My website -
💌 My weekly science newsletter -
📸 Instagram -
🐦 Twitter -
💻 Github:
SUPPORT MY WORK
=================
👕 Science Merch:
👉 Subscribe to my Science Blog: #/portal/signup/monthly
👉 Become my Patreon:
🧠 Subscribe for more free videos:
RECOMMENDATIONS
==================
🎥 1 Month Free of Skillshare Premium:
📕 Books on AI, Data Science & Programming: - use code "podengineered20" to get a 40% discount!
CONTACT:
-----------------
If you need help, have any questions, or want to collaborate, feel free to reach out to me via email: support@
#LSTM
#machinelearning
#deeplearning
Disclaimer: Some of these links are affiliate links which make me earn a small commission when you make a purchase at no additional cost to you.
TIME STAMPS
-----------------------
0:00 : Intro
3:20 : Sepp’s interest for Mathematics
6:48 : RNNs & LSTMs
10:55 : The Vanishing Gradient Problem in RNNs
14:25 : Blow Up Problem of RNNs
15:56 : Sepp’s thesis and working with Jürgen Schmidhuber
25:12 : LSTMs Solving Intractable Physics
27:45 : What is a LSTM?
28:49 : Where are LSTMs used?
32:20 : Hopfield Networks
35:36 : The Amazon Story - A Mojito for 1 Billion Dollars
38:18 : The Problem with AI Knowledge
41:52 : Talents moving out of Europe
46:37 : Will AGI come?
50:41 : How to become an AI expert
52:39 : Motivating words from Sepp
Podcast Recorded: January 12th, 2022 - Subscriber Release Count: 17,260.