
Regularized Radial Basis Function Networks: Theory and Applications to Probability Estimation, Classification, and Time Series Prediction

<p>In this thesis, we study both theoretical and practical aspects of the regularized strict interpolation radial basis function network (SIRBFN) estimate or neural network. From a theoretical perspective, we show that the regularized SIRBFN can be globally mean-square (m.s.) consistent whenever the Nadaraya-Watson regression estimate is and the regularization parameter sequence for the SIRBFN is chosen to be asymptotically optimal in the mean-squared fitting error. Hence we prove the Bayes risk consistency of the approximate Bayes decision rules formed from (m.s.-consistent) regularized SIRBFN posterior probability estimates. Similarly, we prove the m.s.-consistency of the regularized SIRBFN predictor for the class of Markovian nonlinear autoregressive time series generated by an i.i.d. noise process. In a one-step-ahead prediction experiment with a phonetically-balanced suite of male and female speech waveforms, the proposed predictor offers an average 2.2 dB improvement in prediction SNR over corresponding exponentially-weighted RLS predictors. We also show that linearly combining an ensemble of three such proposed predictors via RLS filtering can yield an average 4.2 dB improvement over the previous standard RLS predictors, and develop recursive algorithms to update the proposed predictor on-line with reduced computational complexity in certain situations. Two emerging application areas are then considered. The first is the regression-based approach to nonlinear filtering or state estimation, where the proposed network provides comparable performance to a recurrent MLP-based solution. The second is the dynamic reconstruction of chaotic systems from noisy observational data, where the reconstructed system is shown to generate sequences whose estimated long- and short-term dynamical invariants agree closely with those of the original, noise-free system.
Taken together, these theoretical and practical results point to the regularized SIRBFN as a principled design choice for RBF neural networks.</p>

Degree: Doctor of Philosophy (PhD)
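The abstract's central estimator can be illustrated with a short sketch. This is an illustrative reconstruction under simplifying assumptions, not the thesis's implementation: a strict-interpolation RBF network places one Gaussian centre at every training point, and regularization adds a scaled identity to the interpolation matrix before solving for the weights. The kernel width and regularization parameter here are arbitrary choices for demonstration.

```python
import numpy as np

def gaussian_kernel(X, Z, width):
    # Matrix of Gaussian basis responses: entry (i, j) is
    # exp(-||X_i - Z_j||^2 / (2 * width^2)).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_sirbfn(X, y, lam, width):
    # Strict interpolation: one centre per training sample.
    # Regularization parameter lam shifts the interpolation
    # matrix away from singularity and smooths the fit.
    G = gaussian_kernel(X, X, width)
    return np.linalg.solve(G + lam * np.eye(len(X)), y)

def predict_sirbfn(X_train, w, width, X_new):
    # Network output is a weighted sum of the Gaussian basis
    # functions centred at the training points.
    return gaussian_kernel(X_new, X_train, width) @ w

# Toy regression example (hypothetical data, not from the thesis):
# recover a sine wave from noisy samples.
rng = np.random.default_rng(0)
X = np.linspace(0.0, 2.0 * np.pi, 40)[:, None]
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(40)
w = fit_sirbfn(X, y, lam=1e-2, width=0.5)
pred = predict_sirbfn(X, w, 0.5, X)
```

In the thesis's theoretical results, the regularization sequence is chosen to be asymptotically optimal in the mean-squared fitting error; in practice `lam` and `width` would be selected by cross-validation or a similar criterion.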

Identifier: oai:union.ndltd.org:mcmaster.ca/oai:macsphere.mcmaster.ca:11375/5719
Date: January 1998
Creators: Yee, Van Paul
Contributors: Haykin, Simon, Electrical and Computer Engineering
Source Sets: McMaster University
Detected Language: English
Type: thesis
