The interaction between coding and learning rules in neural nets (NNs), and between coding and genetic operators in genetic algorithms (GAs), is discussed. The underlying principle advocated is that similar things in "the world" should have similar codes. Similarity metrics are suggested for the coding of images and numerical quantities in neural nets, and for the coding of neural network structures in genetic algorithms. A principal component analysis of natural images yields receptive fields resembling horizontal and vertical edge and bar detectors. The orientation sensitivity of the "bar detector" components is found to match a psychophysical model, suggesting that the brain may make some use of principal components in its visual processing. Experiments are reported on the effects of different input and output codings on the accuracy of neural nets handling numeric data. It is found that simple analogue and interpolation codes are most successful. Experiments on the coding of image data demonstrate the sensitivity of final performance to the internal structure of the net. The interaction between the coding of the target problem and the reproduction operators of mutation and recombination in GAs is discussed and illustrated. The possibilities for using GAs to adapt aspects of NNs are considered. The permutation problem, which affects attempts to use GAs both to train net weights and to adapt net structures, is illustrated, and methods to reduce it are suggested. Empirical tests using a simulated net design problem to reduce evaluation times indicate that the permutation problem may not be as severe as has been thought, but suggest the utility of a sorting recombination operator that matches hidden units according to the number of connections they have in common. A number of experiments using GAs to design network structures are reported, both to specify a net to be trained from random weights and to prune a pre-trained net. Three different coding methods are tried, and various sorting recombination operators are evaluated. The results indicate that appropriate sorting can be beneficial, but the effects are problem-dependent. It is shown that the GA tends to overfit the net to the particular set of test criteria, to the possible detriment of wider generalisation ability. A method of testing the ability of a GA to make progress in the presence of noise, by adding a penalty flag, is described.
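The sorting recombination operator mentioned above can be pictured with a minimal sketch, not the thesis's actual implementation. The encoding below (each hidden unit represented as the set of input indices it connects to), the greedy matching strategy, and the names sort_to_match and crossover are illustrative assumptions; the only point carried over from the abstract is that hidden units of one parent are aligned with those of the other by the number of connections they share before crossover, which mitigates the permutation problem.

    # Illustrative sketch only: hidden units are aligned by shared input
    # connections before crossover. Encoding and names are assumptions.
    import random

    # A net structure is represented, for illustration, as a list of hidden
    # units, each unit being the set of input indices it connects to.
    # Both parents are assumed to have the same number of hidden units.

    def sort_to_match(parent_a, parent_b):
        """Greedily reorder parent_b's hidden units so each lines up with the
        parent_a unit sharing the most connections."""
        remaining = list(range(len(parent_b)))
        ordered = []
        for unit_a in parent_a:
            # Choose the as-yet-unmatched unit of parent_b with largest overlap.
            best = max(remaining, key=lambda j: len(unit_a & parent_b[j]))
            ordered.append(parent_b[best])
            remaining.remove(best)
        return ordered

    def crossover(parent_a, parent_b):
        """Uniform crossover on position-aligned hidden units."""
        aligned_b = sort_to_match(parent_a, parent_b)
        return [set(random.choice(pair)) for pair in zip(parent_a, aligned_b)]

    if __name__ == "__main__":
        random.seed(0)
        a = [{0, 1, 2}, {3, 4}, {5, 6, 7}]
        b = [{5, 6}, {0, 1, 2, 3}, {4, 7}]  # similar units, permuted
        print(crossover(a, b))

Without the sorting step, a naive positional crossover of these two parents would pair unrelated units (e.g. {0, 1, 2} with {5, 6}), which is exactly the disruption the permutation problem causes.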
Identifier | oai:union.ndltd.org:bl.uk/oai:ethos.bl.uk:387350
Date | January 1993 |
Creators | Hancock, Peter J. B. |
Contributors | Smith, Leslie S. |
Publisher | University of Stirling |
Source Sets | Ethos UK |
Detected Language | English |
Type | Electronic Thesis or Dissertation |
Source | http://hdl.handle.net/1893/39 |