SupportNet: solving catastrophic forgetting in class incremental learning with support data.
2018
A plain well-trained deep learning model often does not have the ability to learn new knowledge without forgetting the previously learned knowledge, which is known as catastrophic forgetting. Here we propose a novel method, SupportNet, to solve the catastrophic forgetting problem in the class incremental learning scenario efficiently and effectively. SupportNet combines the strengths of deep learning and the support vector machine (SVM): the SVM is used to identify the support data from the old data, which are fed to the deep learning model together with the new data for further training, so that the model can review the essential information of the old data when learning the new information. Two powerful consolidation regularizers are applied to ensure the robustness of the learned model. Comprehensive experiments on various tasks, including enzyme function prediction, subcellular structure classification and breast tumor classification, show that SupportNet drastically outperforms the state-of-the-art incremental learning methods and even reaches similar performance as the deep learning model trained from scratch on both old and new data. Our program is accessible at: this https URL
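The core idea of the abstract — using an SVM's support vectors to pick a small "support data" subset of the old classes for rehearsal — can be sketched as follows. This is a minimal illustration using scikit-learn's `SVC`, not the paper's implementation; in SupportNet the SVM operates on the deep model's learned feature representations, and the function and variable names below are hypothetical.

```python
import numpy as np
from sklearn.svm import SVC

def select_support_data(old_features, old_labels):
    """Illustrative sketch: fit an SVM on (features of) the old data and
    return the indices of its support vectors. These boundary examples
    are taken as the 'support data' that carry the essential information
    of the old classes."""
    svm = SVC(kernel="linear")
    svm.fit(old_features, old_labels)
    return svm.support_  # indices of support vectors within old_features

# Toy usage: two well-separated old classes in a 2-D feature space.
rng = np.random.default_rng(0)
old_X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(4, 1, (50, 2))])
old_y = np.array([0] * 50 + [1] * 50)

support_idx = select_support_data(old_X, old_y)

# The support data (a small subset of the old data) would then be replayed
# together with the new-class data when training the deep model further.
rehearsal_X = old_X[support_idx]
rehearsal_y = old_y[support_idx]
```

The design rationale is that support vectors lie closest to the class boundaries, so rehearsing only them preserves the old decision structure at a fraction of the storage cost of the full old dataset.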