Class-Incremental Learning with Rectified Feature-Graph Preservation

Cheng Hsun Lei, Yi Hsin Chen, Wen Hsiao Peng, Wei Chen Chiu*


Research output: Conference contribution, peer-reviewed


In this paper, we address the problem of distillation-based class-incremental learning with a single head. A central theme of this task is to learn new classes that arrive in sequential phases over time while keeping the model's capability of recognizing seen classes with only limited memory for preserving seen data samples. Many regularization strategies have been proposed to mitigate the phenomenon of catastrophic forgetting. To better understand the essence of these regularizations, we introduce a feature-graph preservation perspective. Insights into their merits and faults motivate our weighted-Euclidean regularization for old knowledge preservation. We further propose rectified cosine normalization and show how it can work with binary cross-entropy to increase class separation for effective learning of new classes. Experimental results on both CIFAR-100 and ImageNet datasets demonstrate that our method outperforms the state-of-the-art approaches in reducing classification error, easing catastrophic forgetting, and encouraging evenly balanced accuracy over different classes. Our project page is at:
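The abstract pairs rectified cosine normalization with a per-class binary cross-entropy loss. A minimal sketch of that combination is given below; it assumes "rectified" means applying a ReLU to both the feature vector and the class weights before computing cosine similarity, and `eta` stands in for a learnable scale factor. Both interpretations are assumptions for illustration, not the paper's exact formulation.

```python
import math

def rectified_cosine_logit(features, weights, eta=1.0):
    # Assumed rectification: ReLU applied to features and class weights
    # before cosine normalization, so similarities lie in [0, 1].
    f = [max(0.0, v) for v in features]
    w = [max(0.0, v) for v in weights]
    fn = math.sqrt(sum(v * v for v in f)) or 1.0
    wn = math.sqrt(sum(v * v for v in w)) or 1.0
    cos = sum(a * b for a, b in zip(f, w)) / (fn * wn)
    return eta * cos  # eta: hypothetical learnable scale

def bce_loss(logits, targets):
    # Binary cross-entropy applied per class to sigmoid(logit),
    # rather than a softmax over all classes.
    total = 0.0
    for z, t in zip(logits, targets):
        p = 1.0 / (1.0 + math.exp(-z))
        total += -(t * math.log(p) + (1 - t) * math.log(1 - p))
    return total / len(logits)
```

Treating each class as an independent binary decision (BCE over sigmoids) avoids the softmax's winner-take-all coupling between old and new classes, which is one plausible reason such a pairing helps class separation in the incremental setting.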

Title of host publication: Computer Vision – ACCV 2020 - 15th Asian Conference on Computer Vision, 2020, Revised Selected Papers
Editors: Hiroshi Ishikawa, Cheng-Lin Liu, Tomas Pajdla, Jianbo Shi
Publisher: Springer Science and Business Media Deutschland GmbH
Publication status: Published - February 2021
Event: 15th Asian Conference on Computer Vision, ACCV 2020 - Virtual, Online
Duration: 30 November 2020 - 4 December 2020


Series: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12627 LNCS


Conference: 15th Asian Conference on Computer Vision, ACCV 2020
City: Virtual, Online

