Incremental similarity learning for knowledge retention in deep neural networks
Date
2022
Authors
Huo, Jiahao
Abstract
The study investigates the open research problem of overcoming catastrophic forgetting during incremental similarity learning. It examines the extent of forgetting during regular training on four well-known similarity learning loss functions: angular, contrastive, triplet, and center loss. We adapted three state-of-the-art techniques for incremental classification learning to incremental similarity learning, and we present a novel incremental learning technique inspired by previous research, which we compared against the three existing techniques. Finally, we examine the forgetting problem on datasets ranging from simple to complex and investigate different architecture setups to assess comprehensively how the different methods fare. The results confirm that catastrophic forgetting does occur during incremental similarity learning, and they demonstrate the importance of good mining techniques for similarity learning loss functions in reducing it. Our approach outperformed the three existing techniques on average for incremental similarity learning, and it retained the highest ratio of base knowledge regardless of the training setup, dataset, or network architecture. Although other methods learn new knowledge better, our approach yields better average knowledge performance across all experiments and can also improve performance on unseen classes during incremental similarity learning. The results further show that regularization techniques do not work as well as exemplar retention techniques in most of our experiments, and that different combinations of training setups affect how effectively each technique reduces catastrophic forgetting.
Further investigation is required into the relationship between transfer learning and similarity learning, and into protecting the intermediate-layer embedding space against catastrophic forgetting.
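To make the abstract's two recurring ingredients concrete, the sketch below shows a standard triplet loss together with a semi-hard negative mining filter. This is a minimal illustration of the general technique, not code from the study itself; the function names, embeddings, and the margin value of 0.2 are assumptions chosen for the example.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Triplet loss: pull anchor-positive pairs together and push
    anchor-negative pairs apart by at least `margin` (squared L2)."""
    d_pos = np.sum((anchor - positive) ** 2, axis=-1)
    d_neg = np.sum((anchor - negative) ** 2, axis=-1)
    return np.maximum(0.0, d_pos - d_neg + margin)

def semi_hard_mask(d_pos, d_neg, margin=0.2):
    """Semi-hard mining: keep negatives that are farther than the
    positive but still inside the margin, i.e.
    d_pos < d_neg < d_pos + margin. These yield informative,
    non-zero gradients without being pathologically hard."""
    return (d_neg > d_pos) & (d_neg < d_pos + margin)

# Toy 2-D embeddings chosen for illustration only.
a = np.array([[0.0, 0.0]])   # anchor
p = np.array([[0.1, 0.0]])   # positive: close to the anchor
n = np.array([[1.0, 0.0]])   # easy negative: far from the anchor
print(triplet_loss(a, p, n))  # easy negative -> zero loss
```

With an easy negative the loss is zero and contributes no gradient, which is why mining matters: training only makes progress on triplets whose negatives fall inside (or violate) the margin.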
Description
A research report submitted to the School of Computer Science and Applied Mathematics, Faculty of Science, University of the Witwatersrand, in partial fulfilment of the requirements for the degree of Master of Science in Computer Science, 2022.