ContinualAI Seminars: “Reducing Representation Drift in Online Continual Learning”

Abstract: In the online continual learning paradigm, agents must learn from a changing distribution while respecting memory and compute constraints. Previous work in this setting often tries to reduce catastrophic forgetting by limiting changes in the space of model parameters. In this work we instead focus on the change in representations of observed data that arises when previously unobserved classes appear in the incoming data stream, and new classes must be distinguished from previous ones. Starting from a popular approach, experience replay, we consider metric learning based loss functions which, when adjusted to appropriately select negative samples, can effectively nudge the learned representations to be more robust to new future classes. We show that this selection of negatives is in fact critical for reducing representation drift of previously observed data.
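To make the negative-selection idea concrete, here is a minimal PyTorch sketch of a supervised-contrastive-style replay loss in which incoming anchors draw negatives only from the incoming batch, so replayed old-class representations are not pushed around by new classes. The function name, signature, and the exact masking scheme are illustrative assumptions based on the abstract, not the authors' reference implementation.

```python
import torch

def asymmetric_contrastive_loss(z_new, y_new, z_buf, y_buf, temperature=0.2):
    """Contrastive loss on incoming samples with asymmetric negative selection.

    z_new, y_new: L2-normalized features and labels of the incoming batch.
    z_buf, y_buf: features and labels replayed from the memory buffer.
    Buffer features may serve as positives (same-class matches), but negatives
    for each incoming anchor come only from the incoming batch itself -- one
    way to shield old-class representations from drift.  (Hypothetical sketch.)
    """
    n_new = z_new.size(0)
    z_all = torch.cat([z_new, z_buf])           # positive candidates: new + buffer
    y_all = torch.cat([y_new, y_buf])
    sim = z_new @ z_all.T / temperature         # (n_new, n_new + n_buf)

    # Positives: same-class pairs anywhere, excluding the anchor itself.
    pos_mask = y_new.unsqueeze(1) == y_all.unsqueeze(0)
    idx = torch.arange(n_new)
    pos_mask[idx, idx] = False

    # Negatives: different-class pairs restricted to the incoming batch.
    neg_mask = torch.zeros_like(pos_mask)
    neg_mask[:, :n_new] = y_new.unsqueeze(1) != y_new.unsqueeze(0)

    # Softmax denominator runs over positives plus the restricted negatives.
    denom_mask = pos_mask | neg_mask
    log_prob = sim - torch.logsumexp(
        sim.masked_fill(~denom_mask, -1e9), dim=1, keepdim=True)

    # Average log-likelihood over each anchor's positives (0 if it has none).
    loss = -(log_prob * pos_mask).sum(1) / pos_mask.sum(1).clamp(min=1)
    return loss.mean()

# Toy usage with random normalized features (purely illustrative):
z_new = torch.nn.functional.normalize(torch.randn(8, 128), dim=1)
z_buf = torch.nn.functional.normalize(torch.randn(16, 128), dim=1)
loss = asymmetric_contrastive_loss(
    z_new, torch.randint(0, 5, (8,)), z_buf, torch.randint(0, 5, (16,)))
```

In replay methods of this kind, the buffered samples are typically still trained with a standard objective in parallel; the asymmetry above applies only to how incoming anchors select their negatives.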