This thesis investigates how information about the structure of a set of training data with 'natural' characteristics may be used to improve the design of associative memory neural network models of the Hopfield type, with a view to reducing the level of connectivity in models of this type. There are three strands to this work. Firstly, an empirical evaluation of the implementation of existing theory is given. Secondly, a number of existing theories are combined to produce novel network models and training regimes. Thirdly, new strategies for constructing and training associative memories based on knowledge of the structure of the training data are proposed. The first conclusion of this work is that, under certain circumstances, performance benefits may be gained by establishing the connectivity in a non-random fashion, guided by knowledge of the structure of the training data. These improvements are measured relative to networks in which sparse connectivity is established purely at random; in both cases the dilution takes place before the network is trained. Secondly, it is verified that, as existing theory predicts, targeted post-training dilution of network connectivity yields better performance than removing connections at random. Finally, an existing tool for analysing the attractor performance of neural networks of this type has been modified and improved, and a novel, comprehensive performance analysis tool is proposed.
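As a rough illustration of the two dilution strategies the abstract contrasts, the minimal Python sketch below trains a Hopfield network with the standard Hebbian (outer-product) rule and then removes half of its connections either uniformly at random or by targeting the weakest weights first. This is not the thesis's own code: the network size, pattern set, asynchronous update schedule, and the use of weight magnitude |w_ij| as the "targeted" pruning criterion are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
N, P = 100, 10                        # units and stored patterns (assumed sizes)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian (outer-product) training; no self-connections.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def dilute_random(W, frac):
    """Remove a fraction of connections uniformly at random, keeping symmetry."""
    W = W.copy()
    mask = rng.random(W.shape) < frac
    mask = np.triu(mask, 1)
    mask = mask | mask.T
    W[mask] = 0.0
    return W

def dilute_targeted(W, frac):
    """Remove the weakest connections first (smallest |w_ij|), keeping symmetry."""
    W = W.copy()
    iu = np.triu_indices_from(W, k=1)
    order = np.argsort(np.abs(W[iu]))          # weakest connections first
    kill = order[: int(frac * len(order))]
    rows, cols = iu[0][kill], iu[1][kill]
    W[rows, cols] = W[cols, rows] = 0.0
    return W

def recall_overlap(W, pattern, flips=10):
    """Corrupt a stored pattern, run asynchronous updates, return final overlap."""
    s = pattern.copy()
    s[rng.choice(N, flips, replace=False)] *= -1
    for _ in range(5 * N):                     # a few asynchronous sweeps
        i = rng.integers(N)
        s[i] = 1 if W[i] @ s >= 0 else -1
    return (s @ pattern) / N

for name, dilute in [("random", dilute_random), ("targeted", dilute_targeted)]:
    Wd = dilute(W, frac=0.5)
    m = np.mean([recall_overlap(Wd, p) for p in patterns])
    print(f"{name} dilution, 50% removed: mean overlap {m:.3f}")

Under these assumptions, the mean overlap between the recalled state and the stored pattern gives a simple proxy for attractor performance; the thesis's own analysis tools are considerably more comprehensive than this single measure.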
Identifier | oai:union.ndltd.org:bl.uk/oai:ethos.bl.uk:275143
Date | January 2003 |
Creators | Turvey, Simon Paul |
Publisher | University of Hertfordshire |
Source Sets | Ethos UK |
Detected Language | English |
Type | Electronic Thesis or Dissertation |
Source | http://hdl.handle.net/2299/14113 |