Paper Title

Energy-based Latent Aligner for Incremental Learning

Paper Authors

K J Joseph, Salman Khan, Fahad Shahbaz Khan, Rao Muhammad Anwer, Vineeth N Balasubramanian

Paper Abstract

Deep learning models tend to forget their earlier knowledge while incrementally learning new tasks. This behavior emerges because the parameter updates optimized for the new tasks may not align well with the updates suitable for older tasks. The resulting latent representation mismatch causes forgetting. In this work, we propose ELI: Energy-based Latent Aligner for Incremental Learning, which first learns an energy manifold for the latent representations such that previous task latents will have low energy and the current task latents have high energy values. This learned manifold is used to counter the representational shift that happens during incremental learning. The implicit regularization that is offered by our proposed methodology can be used as a plug-and-play module in existing incremental learning methodologies. We validate this through extensive evaluation on CIFAR-100, ImageNet subset, ImageNet 1k and Pascal VOC datasets. We observe consistent improvement when ELI is added to three prominent methodologies in class-incremental learning, across multiple incremental settings. Further, when added to the state-of-the-art incremental object detector, ELI provides over 5% improvement in detection accuracy, corroborating its effectiveness and complementary advantage to existing art.
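
Below is an illustrative sketch of the idea described in the abstract, assuming a PyTorch setting: a small energy network is trained so that latents from the frozen previous-task backbone receive low energy while latents from the incrementally updated backbone receive high energy, and at inference the updated backbone's latents are nudged down this learned energy manifold by a few gradient steps before reaching the old task head. The names (EnergyNet, energy_loss, align_latents) and the margin-based training objective are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn


class EnergyNet(nn.Module):
    """Scores a latent vector z with a scalar energy E_psi(z). (Illustrative, not the paper's architecture.)"""

    def __init__(self, latent_dim: int, hidden_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        return self.net(z).squeeze(-1)


def energy_loss(energy: EnergyNet, z_old: torch.Tensor, z_new: torch.Tensor,
                margin: float = 1.0) -> torch.Tensor:
    """Assumed margin objective: previous-task latents get low energy,
    current-task latents get high energy."""
    return torch.relu(margin + energy(z_old) - energy(z_new)).mean()


def align_latents(energy: EnergyNet, z: torch.Tensor,
                  steps: int = 10, step_size: float = 0.1) -> torch.Tensor:
    """Counter the representational shift: descend the learned energy manifold
    so the updated model's latents move toward the low-energy (old-task) region."""
    z = z.detach().clone().requires_grad_(True)
    for _ in range(steps):
        grad, = torch.autograd.grad(energy(z).sum(), z)
        z = (z - step_size * grad).detach().requires_grad_(True)
    return z.detach()


if __name__ == "__main__":
    # Toy usage: z_old would come from the frozen previous-task backbone and
    # z_new from the incrementally updated backbone, on the same inputs.
    energy = EnergyNet(latent_dim=64)
    optimizer = torch.optim.Adam(energy.parameters(), lr=1e-3)

    z_old, z_new = torch.randn(32, 64), torch.randn(32, 64) + 0.5
    loss = energy_loss(energy, z_old, z_new)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # At inference, align the new model's latents before passing them to the old task head.
    aligned = align_latents(energy, z_new)
    print(aligned.shape)  # torch.Size([32, 64])
```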
