Reducing catastrophic forgetting in neural networks using slow learning

This thesis describes a connectionist approach to learning and long-term memory consolidation, inspired by empirical studies of the roles of the hippocampus and neocortex in the brain. The existence of complementary learning systems reflects the demands that the nature of our experiences places on our cognitive system. It has been shown that dual-network architectures which successfully transfer information between networks can avoid the catastrophic forgetting that arises in multiple-sequence learning. The experiments involve a Reverberated Simple Recurrent Network trained on multiple sequences, with memory reinforced by means of self-generated pseudopatterns. My focus is on how differentiated learning speed affects the level of forgetting, without explicit training on the data used to form the existing memory.
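The mechanism the abstract describes can be sketched roughly as follows: a fast-learning network acquires a new sequence, and a slow-learning network consolidates it while rehearsing pseudopatterns obtained by probing its own weights with random inputs. The NumPy sketch below is an illustrative assumption, not the thesis code: the class names, layer sizes, and learning rates are invented for the example, and a pair of plain feed-forward networks stands in for the Reverberated Simple Recurrent Network.

```python
# Minimal sketch of pseudopattern-based consolidation between a fast
# "hippocampal" network and a slow "neocortical" network. All names and
# hyperparameters here are hypothetical, chosen only to illustrate the idea.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MLP:
    """One-hidden-layer network trained by plain gradient descent."""
    def __init__(self, n_in, n_hid, n_out, lr):
        self.W1 = rng.normal(0, 0.5, (n_in, n_hid))
        self.W2 = rng.normal(0, 0.5, (n_hid, n_out))
        self.lr = lr  # slow learning = small lr on the consolidation net

    def forward(self, x):
        self.h = sigmoid(x @ self.W1)
        self.y = sigmoid(self.h @ self.W2)
        return self.y

    def train_step(self, x, t):
        y = self.forward(x)
        dy = (y - t) * y * (1 - y)                    # output delta
        dh = (dy @ self.W2.T) * self.h * (1 - self.h)  # hidden delta
        self.W2 -= self.lr * np.outer(self.h, dy)
        self.W1 -= self.lr * np.outer(x, dh)

def pseudopatterns(net, n, n_in):
    """Probe the network with random inputs; its responses approximate its
    stored memory and can be rehearsed without the original training data."""
    xs = rng.integers(0, 2, (n, n_in)).astype(float)
    return [(x, net.forward(x).copy()) for x in xs]

N_IN, N_HID, N_OUT = 8, 16, 8
fast = MLP(N_IN, N_HID, N_OUT, lr=0.5)    # fast learner: new sequence
slow = MLP(N_IN, N_HID, N_OUT, lr=0.05)   # slow learner: long-term store

new_items = [(rng.integers(0, 2, N_IN).astype(float),
              rng.integers(0, 2, N_OUT).astype(float)) for _ in range(5)]

# 1. Acquire the new sequence quickly in the fast network.
for _ in range(500):
    for x, t in new_items:
        fast.train_step(x, t)

# 2. Consolidate: interleave transfers from the fast net with the slow
#    net's own pseudopatterns, so existing memory is reinforced without
#    explicit training on the data that originally formed it.
for _ in range(500):
    for x, _ in new_items:
        slow.train_step(x, fast.forward(x))      # transfer new knowledge
    for x, t in pseudopatterns(slow, 5, N_IN):   # rehearse old knowledge
        slow.train_step(x, t)
```

The design choice mirrored here is the asymmetric learning speed: the consolidation network learns slowly, which is the variable the thesis relates to the level of forgetting.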

Identifier: oai:union.ndltd.org:UPSALLA/oai:DiVA.org:ntnu-8702
Date: January 2006
Creators: Vik, Mikael Eikrem
Publisher: Norwegian University of Science and Technology, Department of Computer and Information Science, Institutt for datateknikk og informasjonsvitenskap
Source Sets: DiVA Archive at Upsalla University
Language: English
Detected Language: English
Type: Student thesis, text