In this thesis, we study least trimmed squares artificial neural networks (LTS-ANNs), which generalize the least trimmed squares (LTS) estimators frequently used in robust linear parametric regression to nonparametric artificial neural networks (ANNs) for nonlinear regression problems.
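For context, the LTS criterion fits the model to only the smallest squared residuals; a standard statement (the notation below is assumed for illustration, not taken from the thesis) is

$$\hat{\theta} = \arg\min_{\theta} \sum_{i=1}^{h} r_{(i)}^2(\theta), \qquad r_{(1)}^2(\theta) \le r_{(2)}^2(\theta) \le \cdots \le r_{(n)}^2(\theta),$$

where $r_i(\theta) = y_i - f(x_i; \theta)$ is the residual of the regression function $f$ at sample $i$, the $r_{(i)}^2(\theta)$ are the squared residuals sorted in ascending order, and the number of retained samples $h \le n$ is set by the trimming constant. In the classical linear LTS estimator $f(x; \theta) = \theta^\top x$; in an LTS-ANN, $f$ is a neural network.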
Two training algorithms are proposed in this thesis. The first is an incremental gradient descent algorithm. To speed up convergence, a second training algorithm based on recursive least squares (RLS) is proposed.
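The abstract does not spell out the incremental algorithm, so the following is only a minimal sketch of how trimmed-loss training by incremental (per-sample) gradient descent could look, assuming a one-hidden-layer tanh network; all names, the architecture, and the default trimming level are illustrative, not the thesis's code.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, W1, b1, w2, b2):
    """One-hidden-layer network: y = w2 . tanh(W1 x + b1) + b2."""
    z = np.tanh(W1 @ x + b1)
    return w2 @ z + b2, z

def train_lts_ann(X, y, n_hidden=10, h=None, lr=0.01, epochs=200):
    """Train on only the h samples with the smallest squared residuals."""
    n, d = X.shape
    h = h if h is not None else int(0.8 * n)    # keep 80% of samples by default
    W1 = rng.normal(scale=0.5, size=(n_hidden, d))
    b1 = np.zeros(n_hidden)
    w2 = rng.normal(scale=0.5, size=n_hidden)
    b2 = 0.0
    for _ in range(epochs):
        # Rank samples by current squared residual and keep the h smallest.
        preds = np.array([forward(x, W1, b1, w2, b2)[0] for x in X])
        keep = np.argsort((y - preds) ** 2)[:h]
        # Incremental (sample-by-sample) gradient updates on kept samples only.
        for i in keep:
            out, z = forward(X[i], W1, b1, w2, b2)
            e = out - y[i]                       # residual
            grad_z = e * w2 * (1.0 - z ** 2)     # backprop through tanh
            w2 -= lr * e * z
            b2 -= lr * e
            W1 -= lr * np.outer(grad_z, X[i])
            b1 -= lr * grad_z
    return W1, b1, w2, b2
```

Re-ranking the residuals at the start of each epoch lets samples initially flagged as outliers re-enter training if the fit improves; the parameter h plays the role of the trimming constant.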
Three illustrative examples are provided to compare the robustness against outliers of the classical ANNs and the LTS-ANNs. Simulation results show that, with a properly selected trimming constant, the LTS-ANNs are considerably more robust against outliers than the classical ANNs.
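The RLS-based algorithm mentioned above is likewise not detailed in this abstract. As one common way such a scheme is realized, the sketch below (an assumption of this illustration, not necessarily the thesis's formulation) applies RLS with a forgetting factor to the linear output-layer weights, given hidden-layer features Z computed from the retained samples.

```python
import numpy as np

def rls_output_weights(Z, y, lam=0.99, delta=100.0):
    """Recursive least squares on hidden-layer features Z (n x m) vs targets y.

    lam   : forgetting factor (1.0 gives ordinary RLS)
    delta : scale of the initial inverse-covariance estimate
    """
    n, m = Z.shape
    w = np.zeros(m)                  # output-layer weights
    P = delta * np.eye(m)            # inverse covariance estimate
    for i in range(n):
        z = Z[i]
        k = P @ z / (lam + z @ P @ z)         # gain vector
        w = w + k * (y[i] - z @ w)            # update with a priori error
        P = (P - np.outer(k, z @ P)) / lam    # Riccati-type covariance update
    return w
```

Because each RLS step recursively solves a least squares subproblem rather than taking a small gradient step, the output weights typically converge much faster, consistent with the convergence motivation stated above.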
Identifier | oai:union.ndltd.org:NSYSU/oai:NSYSU:etd-0623108-160018
Date | 23 June 2008
Creators | Cheng, Wen-Chin |
Contributors | Chang-Hua Lien, Jeng-Yih Juang, Jer-Guang Hsieh, Tsu-Tian Lee, Jyh-Horng Jeng |
Publisher | NSYSU |
Source Sets | NSYSU Electronic Thesis and Dissertation Archive |
Language | English |
Detected Language | English |
Type | text |
Format | application/pdf |
Source | http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0623108-160018 |
Rights | unrestricted, Copyright information available at source archive |