
Study on Additive Generalized Radial Basis Function Networks

In this thesis, we propose a new class of learning models, the additive generalized radial basis function networks (AGRBFNs), for general nonlinear regression problems. This class of learning machines combines the generalized radial basis function networks (GRBFNs) commonly used in machine learning with the additive models (AMs) frequently encountered in semiparametric regression. In statistical regression theory, the AM is a good compromise between the linear model and the fully nonparametric model. To obtain a more general network structure capable of handling a wider range of data sets, the AMs are embedded in the output layer of the GRBFNs to form the AGRBFNs. Simple weight-update rules based on incremental gradient descent are derived. Several illustrative examples compare the performance of the classical GRBFNs with that of the proposed AGRBFNs. Simulation results show that, with proper selection of the hidden nodes and of the bandwidth of the kernel smoother used in the additive output layer, AGRBFNs can give better fits than classical GRBFNs. Furthermore, for a given learning problem, AGRBFNs usually need fewer hidden nodes than GRBFNs to reach the same level of accuracy.
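The abstract is the only technical description available in this record, so the following Python sketch is an illustration of the architecture it describes, not the thesis's implementation. It shows a classical Gaussian GRBFN whose output weights are trained by incremental (sample-by-sample) gradient descent, plus a crude additive output layer built from one univariate Nadaraya-Watson smoother per hidden unit. All function names, hyperparameters (n_hidden, sigma, lr, bandwidth), and the one-pass averaging of the smoothers are assumptions; a proper additive model would typically be fitted by backfitting, and the thesis derives its own update rules.

```python
import numpy as np

def gaussian_rbf(X, centers, sigma):
    """Hidden-layer activations: a Gaussian kernel around each center."""
    # Pairwise squared distances between samples (n, d) and centers (m, d).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_grbfn(X, y, n_hidden=10, sigma=1.0, lr=0.05, epochs=50, seed=0):
    """Classical GRBFN baseline: linear output layer trained by
    incremental gradient descent on squared error (illustrative only)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_hidden, replace=False)]
    w, b = np.zeros(n_hidden), 0.0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            phi = gaussian_rbf(X[i:i + 1], centers, sigma)[0]
            err = (w @ phi + b) - y[i]
            w -= lr * err * phi   # incremental gradient step per sample
            b -= lr * err
    return centers, w, b

def additive_output(phi_train, y, phi_query, bandwidth=0.2):
    """Assumed stand-in for the additive output layer: sum (here, average)
    of per-hidden-unit Nadaraya-Watson smoothers, skipping backfitting."""
    n_hidden = phi_train.shape[1]
    pred = np.zeros(len(phi_query))
    for j in range(n_hidden):
        u = (phi_query[:, j][:, None] - phi_train[:, j][None, :]) / bandwidth
        K = np.exp(-0.5 * u ** 2)                  # Gaussian smoothing kernel
        pred += (K @ y) / (K.sum(axis=1) + 1e-12) / n_hidden
    return pred

# Hypothetical usage on a 1-D toy regression problem.
X = np.linspace(-3, 3, 200)[:, None]
y = np.sin(2 * X[:, 0]) + 0.1 * np.random.default_rng(1).standard_normal(200)
centers, w, b = train_grbfn(X, y)
phi = gaussian_rbf(X, centers, sigma=1.0)
y_additive = additive_output(phi, y, phi, bandwidth=0.2)
```

The bandwidth of the smoothing kernel plays the role the abstract attributes to the kernel smoother in the additive output layer: too small overfits each hidden-unit coordinate, too large washes the additive components out toward a constant.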

Identifier: oai:union.ndltd.org:NSYSU/oai:NSYSU:etd-0618109-174414
Date: 18 June 2009
Creators: Liao, Shih-hui
Contributors: Ker-Wei Yu, Shiang-Hwua Yu, Yih-Lon Lin, Jer-Guang Hsieh
Publisher: NSYSU
Source Sets: NSYSU Electronic Thesis and Dissertation Archive
Language: English
Detected Language: English
Type: text
Format: application/pdf
Source: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0618109-174414
Rights: withheld, Copyright information available at source archive