
Neuron-adaptive neural network models and applications

Artificial Neural Networks have been widely studied by researchers worldwide to address problems such as function approximation and data simulation. This thesis deals with Feed-forward Neural Networks (FNNs) equipped with a new neuron activation function, the Neuron-adaptive Activation Function (NAF), and with Feed-forward Higher Order Neural Networks (HONNs) using the same activation function. We have designed a new neural network model, the Neuron-Adaptive Neural Network (NANN), and mathematically proved that a single NANN can approximate any piecewise continuous function to any desired accuracy; in the existing neural network literature, only Zhang has proved the universal approximation ability of an FNN group for any piecewise continuous function. Next, we have established the approximation properties of Neuron-Adaptive Higher Order Neural Networks (NAHONNs), a combination of HONNs and the NAF, with respect to any continuous function, functional, and operator. Finally, we have created a software program called MASFinance, which runs on the Solaris system, for the approximation of continuous or discontinuous functions and for the simulation of continuous or discontinuous data (especially financial data). Our work distinguishes itself from previous work in the following ways: we use a new neuron-adaptive activation function, whereas the neuron activation functions in most existing work are fixed and cannot be tuned to suit different approximation problems; we use only one NANN to approximate any piecewise continuous function, whereas previous research required a neural network group; we combine HONNs with the NAF and investigate the resulting approximation properties for any continuous function, functional, and operator; and we present a new software program, MASFinance, for function approximation and data simulation. Experiments with MASFinance indicate that the proposed NANNs offer several advantages over traditional fixed-neuron networks (greatly reduced network size, faster learning, and smaller simulation errors), and that they approximate piecewise continuous functions more effectively than neural network groups. Experiments also indicate that NANNs are especially suitable for data simulation / Doctor of Philosophy (PhD)
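To make the central idea concrete, the sketch below shows one plausible form of a neuron-adaptive activation function: each neuron owns trainable shape parameters that are adjusted during training alongside the ordinary connection weights. The abstract does not state the exact functional form used in the thesis, so the blend of a sigmoid and a sine term, and every name in the sketch, are illustrative assumptions rather than the author's actual formulation.

```python
# Minimal sketch of a neuron-adaptive activation function (NAF).
# ASSUMED form: a1*sigmoid(b1*x) + a2*sin(b2*x) per neuron, with all
# four parameter vectors trained by gradient descent; the thesis's
# actual NAF is not specified in this abstract.

import numpy as np


class NeuronAdaptiveActivation:
    """Per-neuron activation with trainable shape parameters a1, b1, a2, b2."""

    def __init__(self, n_neurons, rng=None):
        rng = rng or np.random.default_rng(0)
        # One free parameter set per neuron, all adapted during training.
        self.a1 = rng.normal(1.0, 0.1, n_neurons)
        self.b1 = rng.normal(1.0, 0.1, n_neurons)
        self.a2 = rng.normal(0.1, 0.05, n_neurons)
        self.b2 = rng.normal(1.0, 0.1, n_neurons)

    def __call__(self, x):
        # Elementwise: each neuron applies its own tuned activation shape.
        sig = 1.0 / (1.0 + np.exp(-self.b1 * x))
        return self.a1 * sig + self.a2 * np.sin(self.b2 * x)

    def grads(self, x, upstream):
        """Gradients of the shape parameters, for updating them together
        with the ordinary connection weights during backpropagation."""
        sig = 1.0 / (1.0 + np.exp(-self.b1 * x))
        dsig = sig * (1.0 - sig)
        return {
            "a1": upstream * sig,
            "b1": upstream * self.a1 * dsig * x,
            "a2": upstream * np.sin(self.b2 * x),
            "b2": upstream * self.a2 * np.cos(self.b2 * x) * x,
        }


if __name__ == "__main__":
    act = NeuronAdaptiveActivation(n_neurons=3)
    x = np.array([0.5, -1.0, 2.0])
    print(act(x))
```

Because the activation shape itself is learned per neuron, a single adaptive neuron can cover behaviour that would otherwise require several fixed-sigmoid neurons, which is consistent with the reduced network size and smaller simulation errors reported in the abstract.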

Identifier: oai:union.ndltd.org:ADTP/235438
Date: January 1999
Creators: Xu, Shuxiang, University of Western Sydney, Faculty of Informatics, Science and Technology
Source Sets: Australasian Digital Theses Program
Language: English
Detected Language: English
Source: THESIS_FIST_XXX_Xu_S.xml
