Author: Li, Xuan
Date accessioned: 2022-06-25
Date available: 2022-06-25
Date issued: 2022-06-24
Identifier: vt_gsexam:34838
URI: http://hdl.handle.net/10919/110936
Title: Referencing Unlabelled World Data to Prevent Catastrophic Forgetting in Class-incremental Learning
Type: Thesis (ETD)
Language: en
Rights: In Copyright
Subjects: Continual Learning; Lifelong Learning; Class-incremental Learning; Memory Efficiency

Abstract: This thesis presents a novel strategy for addressing "catastrophic forgetting" in deep continual-learning systems, the severe degradation of performance on older tasks that occurs as a system learns new tasks presented sequentially. Most previous techniques emphasize preserving existing knowledge while learning new tasks, in some cases relying on a memory buffer that grows in proportion to the number of tasks. We offer another perspective: mitigating over-fitting to the current (local) task during learning is as important as attempting to preserve existing knowledge. We posit the existence of a consistent, unlabelled world environment that the system can use as an easily accessible reference, helping it avoid favoring spurious properties over more generalizable ones. Based on this assumption, we develop a novel method called Learning with Reference (LwR), which delivers substantial performance gains over its state-of-the-art counterparts. The approach requires no growing memory buffer and therefore scales better. We present extensive empirical evaluation on real-world datasets.
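The abstract does not specify how LwR consults the unlabelled reference data, so the following is only a minimal sketch of the general idea under stated assumptions: a distillation-style consistency term computed on an unlabelled reference batch against a frozen snapshot of the model taken before the current task, added to the usual supervised loss. All names here (train_step, ref_batch, alpha, temperature) are illustrative and are not taken from the thesis.

    # Minimal sketch (not the thesis's implementation): one class-incremental
    # training step that regularizes the new-task update with a consistency
    # (distillation) loss on unlabelled "world" reference data, using a frozen
    # copy of the model from before the current task.
    import copy
    import torch
    import torch.nn.functional as F

    def train_step(model, prev_model, task_batch, ref_batch, optimizer,
                   alpha=1.0, temperature=2.0):
        """One update: supervised loss on the current task's labelled batch plus
        a KL consistency term on an unlabelled reference batch (hypothetical names)."""
        x, y = task_batch        # labelled data from the current task
        x_ref = ref_batch        # unlabelled reference ("world") data

        # Supervised cross-entropy on the current task.
        logits = model(x)
        loss_task = F.cross_entropy(logits, y)

        # Consistency with the previous model's predictions on reference data,
        # discouraging the update from latching onto task-specific spurious features.
        with torch.no_grad():
            ref_targets = F.softmax(prev_model(x_ref) / temperature, dim=1)
        ref_log_probs = F.log_softmax(model(x_ref) / temperature, dim=1)
        loss_ref = F.kl_div(ref_log_probs, ref_targets, reduction="batchmean")

        loss = loss_task + alpha * loss_ref
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

    # Before starting each new task, freeze a snapshot of the current model:
    # prev_model = copy.deepcopy(model).eval()
    # for p in prev_model.parameters():
    #     p.requires_grad_(False)

Because the reference set is unlabelled and fixed in size, a scheme of this shape needs no per-task memory buffer, which is consistent with the scalability claim in the abstract; the actual LwR objective should be taken from the thesis itself.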