
Study on Least Trimmed Squares Artificial Neural Networks

In this thesis, we study least trimmed squares artificial neural networks (LTS-ANNs), which generalize the least trimmed squares (LTS) estimators frequently used in robust linear parametric regression to nonparametric artificial neural networks (ANNs) for nonlinear regression problems.
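For context, a standard formulation of the LTS criterion (not quoted from the thesis; the network notation f(x; w) is an assumption): with trimming parameter h, only the h smallest squared residuals enter the objective,

\min_{w} \sum_{i=1}^{h} r_{(i)}^{2}(w), \qquad r_{(1)}^{2}(w) \le \cdots \le r_{(n)}^{2}(w), \qquad r_i(w) = y_i - f(x_i; w),

where n is the number of training samples, h \le n is fixed by the trimming constant, and f(\cdot; w) is the ANN output with weights w. Because the largest residuals are trimmed away, gross outliers contribute nothing to the fit.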
Two training algorithms are proposed in this thesis. The first is an incremental gradient descent algorithm. To speed up convergence, a second training algorithm based on recursive least squares (RLS) is proposed.
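As an illustration only (the thesis itself is not reproduced here, so the function name ltsann_igd, the one-hidden-layer tanh network, and all hyperparameters are assumptions, not the author's code): a minimal Python sketch of incremental gradient descent on a trimmed squared-error objective, where each epoch ranks the squared residuals and then updates the weights sample by sample using only the h best-fitting points.

# Hypothetical sketch of LTS-style incremental gradient descent training.
# Network form, names, and hyperparameters are assumptions for illustration.
import numpy as np

def ltsann_igd(X, y, n_hidden=10, trim_frac=0.8, lr=0.01, epochs=200, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    h = int(np.ceil(trim_frac * n))       # number of retained (untrimmed) samples
    W1 = rng.normal(scale=0.5, size=(d, n_hidden))
    b1 = np.zeros(n_hidden)
    w2 = rng.normal(scale=0.5, size=n_hidden)
    b2 = 0.0

    def forward(x):
        z = np.tanh(x @ W1 + b1)          # hidden-layer activations
        return z @ w2 + b2, z

    for _ in range(epochs):
        preds = np.array([forward(x)[0] for x in X])
        sq_res = (y - preds) ** 2
        keep = np.argsort(sq_res)[:h]     # indices of the h smallest squared residuals
        for i in rng.permutation(keep):   # incremental (per-sample) gradient updates
            out, z = forward(X[i])
            err = out - y[i]
            grad_hidden = err * w2 * (1.0 - z ** 2)
            W1 -= lr * np.outer(X[i], grad_hidden)
            b1 -= lr * grad_hidden
            w2 -= lr * err * z
            b2 -= lr * err
    return W1, b1, w2, b2

An RLS-based variant would replace these per-sample gradient steps with recursive least-squares updates, which is what the thesis proposes to accelerate convergence; that variant is not sketched here.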
Three illustrative examples are provided to test the robustness against outliers of the classical ANNs and the LTS-ANNs. Simulation results show that, with proper selection of the trimming constant of the learning machines, LTS-ANNs are quite robust against outliers compared with the classical ANNs.

Identifier: oai:union.ndltd.org:NSYSU/oai:NSYSU:etd-0623108-160018
Date: 23 June 2008
Creators: Cheng, Wen-Chin
Contributors: Chang-Hua Lien, Jeng-Yih Juang, Jer-Guang Hsieh, Tsu-Tian Lee, Jyh-Horng Jeng
Publisher: NSYSU
Source Sets: NSYSU Electronic Thesis and Dissertation Archive
Language: English
Detected Language: English
Type: text
Format: application/pdf
Source: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0623108-160018
Rights: unrestricted, Copyright information available at source archive
