Integrating Past and Present in Continual Learning
Abstract: Continual learning aims to bridge the gap between typical human and machine-learning environments. The continual setting has no separate training and testing phases; instead, models are evaluated online while learning novel concepts. As in the real world, where spatiotemporal context helps us retrieve skills learned in the past, a realistic online learning setting also features an underlying context that changes over time. Object classes are correlated within a context, and inferring the correct context can lead to better performance. I will describe a novel memory model that can make use of spatiotemporal information from the recent past. I will also present a unifying framework we have formulated for unsupervised continual learning, which disentangles learning objectives that are specific to the present and the past data. Finally, I will consider open issues and challenges in continual learning.