
Incremental Learning With Sample Generation From Pretrained Networks

Abstract: In the last decade, deep learning-based models have revolutionized machine learning and computer vision. However, these models are data-hungry, and training them is time-consuming. In addition, when deep neural networks are updated to augment their prediction space with new data, they suffer from catastrophic forgetting: the model forgets previously learned knowledge as it overfits to the newly available data. Incremental learning algorithms enable deep neural networks to prevent catastrophic forgetting by retaining knowledge of previously observed data while also learning from newly available data.

This thesis presents three models for incremental learning: (i) an algorithm for generative incremental learning using a pre-trained deep neural network classifier; (ii) a hashing-based clustering algorithm for efficient incremental learning; and (iii) a student-teacher coupled neural network that distills knowledge for incremental learning. The proposed algorithms were evaluated on popular vision datasets for classification tasks. The thesis concludes with a discussion of the feasibility of using these techniques to transfer information between networks and for incremental learning applications.

Dissertation/Thesis / Masters Thesis Computer Science 2020
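The third contribution relies on knowledge distillation between a teacher and a student network. As an illustrative sketch only (the thesis's actual architecture and loss are not given here), the standard distillation loss of Hinton et al. — a temperature-softened KL divergence between teacher and student outputs — can be written as follows; the function names and the temperature value are assumptions for the example:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T.
    p = softmax(teacher_logits, T)  # teacher "soft targets"
    q = softmax(student_logits, T)  # student predictions
    return float(T * T * np.sum(p * (np.log(p) - np.log(q)), axis=-1).mean())

# A student that matches the teacher exactly incurs zero loss.
teacher = np.array([[2.0, 0.5, -1.0]])
print(distillation_loss(teacher, teacher, T=2.0))  # → 0.0
```

In incremental-learning settings this term is typically combined with an ordinary cross-entropy loss on the new data, so the student tracks the teacher's behavior on old classes while fitting the new ones.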

Identifier: oai:union.ndltd.org:asu.edu/item:57207
Date: January 2020
Contributors: Patil, Rishabh (Author), Venkateswara, Hemanth (Advisor), Panchanathan, Sethuraman (Advisor), McDaniel, Troy (Committee member), Arizona State University (Publisher)
Source Sets: Arizona State University
Language: English
Detected Language: English
Type: Masters Thesis
Format: 55 pages
Rights: http://rightsstatements.org/vocab/InC/1.0/
