Continual Learning in Neural Networks: on Catastrophic Forgetting and Beyond [in Russian]
Slides:
Speaker: Polina Kirichenko, New York University
Continually learning new tasks on a constantly changing data distribution, without forgetting previously learned ones, is essential for real-world applications but remains challenging for modern deep learning. Deep learning models suffer from catastrophic forgetting: when trained on a sequence of tasks, a deep neural network can successfully learn each new task, but its performance on earlier tasks degrades.
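To make the phenomenon concrete, here is a minimal sketch (not from the talk; the toy tasks, architecture, and hyperparameters are illustrative assumptions) that trains one network sequentially on two synthetic tasks. Accuracy on the first task typically drops to near chance after training on the second:

```python
# Illustrative sketch of catastrophic forgetting on two toy tasks.
# All task definitions and hyperparameters here are assumptions for
# demonstration, not taken from the talk.
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(mean):
    # Two-class Gaussian blobs centered at +mean and -mean: a toy "task".
    x = torch.cat([torch.randn(200, 2) + mean, torch.randn(200, 2) - mean])
    y = torch.cat([torch.zeros(200, dtype=torch.long),
                   torch.ones(200, dtype=torch.long)])
    return x, y

def train(model, x, y, steps=200):
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.cross_entropy(model(x), y)
        loss.backward()
        opt.step()

def accuracy(model, x, y):
    return (model(x).argmax(dim=1) == y).float().mean().item()

model = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 2))
task_a = make_task(torch.tensor([3.0, 0.0]))  # separable along the x-axis
task_b = make_task(torch.tensor([0.0, 3.0]))  # separable along the y-axis

train(model, *task_a)
print("Task A accuracy after training on A:", accuracy(model, *task_a))  # ~1.0
train(model, *task_b)  # sequential training: no access to task A data
print("Task B accuracy after training on B:", accuracy(model, *task_b))  # ~1.0
print("Task A accuracy after training on B:", accuracy(model, *task_a))  # ~0.5
```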
In this talk, I will present an overview of continual learning algorithms, covering well-established methods as well as recent state-of-the-art approaches. We will discuss several continual learning scenarios (task-, class-, and domain-incremental learning), review the most common approaches to alleviating forgetting, and examine other challenges in the field beyond catastrophic forgetting, including forward and backward transfer and learning on continuously drifting data.
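As one concrete example of the rehearsal family of mitigations the abstract alludes to, here is a hedged sketch (again not from the talk; the buffer capacity, replay batch size, and equal loss weighting are illustrative assumptions) that keeps a small buffer of past examples and mixes a replay loss into each new-task update:

```python
# Illustrative sketch of experience replay (rehearsal), a classic
# mitigation for catastrophic forgetting. Details are assumptions
# for demonstration, not taken from the talk.
import random
import torch
import torch.nn as nn

class ReplayBuffer:
    def __init__(self, capacity=200):
        self.capacity = capacity
        self.data = []  # list of (x, y) examples from past tasks

    def add(self, xs, ys):
        for x, y in zip(xs, ys):
            if len(self.data) < self.capacity:
                self.data.append((x, y))
            else:
                # Randomly overwrite an old slot once full (simplified;
                # true reservoir sampling would track the total seen count).
                self.data[random.randrange(self.capacity)] = (x, y)

    def sample(self, n):
        batch = random.sample(self.data, min(n, len(self.data)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)

def train_with_replay(model, x, y, buffer, steps=200):
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.cross_entropy(model(x), y)
        if buffer.data:
            # Rehearse: add the loss on a batch of replayed old examples.
            rx, ry = buffer.sample(64)
            loss = loss + nn.functional.cross_entropy(model(rx), ry)
        loss.backward()
        opt.step()
    buffer.add(x, y)  # store this task's data for future rehearsal
```

Swapping `train` for `train_with_replay` (with a shared `ReplayBuffer`) in the previous sketch would typically keep task A accuracy high after training on task B, at the cost of storing a small sample of old data.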