Memory and optimisation in neural network models

A numerical study of two classes of neural network models is presented. The performance of Ising spin neural networks as content-addressable memories for the storage of bit patterns is analysed. By studying systems of increasing size, the basins of attraction of the stored patterns in the Hopfield model are shown to exhibit finite-size scaling behaviour characteristic of a first-order phase transition. A local iterative learning algorithm is then developed for these models and is shown to achieve perfect storage of nominated patterns with near-optimal content-addressability. Similar scaling behaviour of the associated basins of attraction is observed. For both this learning algorithm and the Hopfield model, extrapolation to the thermodynamic limit yields estimates of the critical minimum overlap which an input pattern must have with a stored pattern in order to retrieve it successfully. The role of a neural network as a tool for optimising cost functions of binary-valued variables is also studied. The particular application considered is the restoration of binary images which have been corrupted by noise. Restorations are achieved by representing the array of pixel intensities as a network of analogue neurons. The performance of the network is shown to compare favourably with that of two other deterministic methods, a gradient descent on the same cost function and a majority-rule scheme, both in terms of restoring images and in terms of minimising the cost function. All of the computationally intensive simulations exploit the inherent parallelism in the models: both SIMD (the ICL DAP) and MIMD (the Meiko Computing Surface) machines are used.
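
As a concrete illustration of the retrieval experiment the abstract describes, the following Python sketch stores random bit patterns in a Hopfield network with the Hebb rule and measures the overlap between a corrupted input and the target pattern after zero-temperature dynamics. The network size, pattern count, corruption level, and update schedule are illustrative assumptions, not the parameters used in the thesis.

# Minimal Hopfield content-addressable memory sketch (assumed parameters,
# not the thesis code).
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10                          # neurons and stored patterns (assumed sizes)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian couplings J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, with zero diagonal
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)

def overlap(s, xi):
    # Overlap m = (1/N) * sum_i s_i xi_i between a state and a pattern
    return float(s @ xi) / N

def retrieve(s, sweeps=20):
    # Zero-temperature asynchronous dynamics: s_i <- sign(sum_j J_ij s_j)
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

# Start from pattern 0 with initial overlap 0.8 (flip 10% of the spins)
target = patterns[0]
start = target.copy()
start[rng.choice(N, size=N // 10, replace=False)] *= -1
print("initial overlap:", overlap(start, target))
print("final overlap:  ", overlap(retrieve(start), target))

Repeating this for a range of initial overlaps and system sizes is one plausible way to probe the basin of attraction and the critical minimum overlap mentioned above.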
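The majority-rule baseline named in the abstract can likewise be sketched in a few lines. The 3x3 neighbourhood, edge padding, and pass count below are assumptions rather than the exact scheme benchmarked in the thesis.

# Minimal majority-rule restoration sketch for a binary (+/-1) image
# (assumed neighbourhood and pass count).
import numpy as np

def majority_rule(img, passes=3):
    # Each pixel takes the majority sign of its 3x3 neighbourhood,
    # including itself; with an odd window the sum is never zero,
    # so the tie branch below is only a safeguard.
    img = img.copy()
    h, w = img.shape
    for _ in range(passes):
        padded = np.pad(img, 1, mode="edge")
        # Sum the 3x3 window around every pixel via shifted views
        s = sum(padded[i:i + h, j:j + w]
                for i in range(3) for j in range(3))
        img = np.where(s > 0, 1, np.where(s < 0, -1, img))
    return img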

Identifier oai:union.ndltd.org:bl.uk/oai:ethos.bl.uk:384164
Date January 1988
Creators Forrest, B. M.
Publisher University of Edinburgh
Source Sets Ethos UK
Detected Language English
Type Electronic Thesis or Dissertation
