Ensemble learning is a multiple-classifier machine learning approach that combines collections (ensembles) of statistical classifiers to build a classifier more accurate than the individual classifiers. Bagging, boosting and voting methods are basic examples of ensemble learning. In this thesis, a novel boosting technique is proposed that targets particular shortcomings of AdaBoost, a well-known boosting algorithm. The proposed system boosts a set of classifiers successively to form a combined classifier better than each ensembled classifier. The AdaBoost algorithm employs a greedy search over the hypothesis space to find a good suboptimal solution. In contrast, this work proposes an evolutionary search with genetic algorithms instead of the greedy search. Empirical results show that classification with boosted evolutionary computing outperforms AdaBoost in equivalent experimental environments.
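As background for the contrast the abstract draws, the following is a minimal illustrative sketch (not the thesis's actual implementation) of AdaBoost with one-dimensional decision stumps, plus a toy genetic-algorithm alternative to the greedy stump search. All function names, the GA operators, and the toy data are assumptions chosen for illustration.

```python
import math
import random

def train_stump(X, y, w):
    """Greedy step: exhaustively pick the (threshold, polarity) stump
    minimising the weighted error, as standard AdaBoost does."""
    best = None
    for thresh in sorted(set(X)):
        for polarity in (1, -1):
            err = sum(wi for wi, xi, yi in zip(w, X, y)
                      if (polarity if xi >= thresh else -polarity) != yi)
            if best is None or err < best[0]:
                best = (err, thresh, polarity)
    return best

def ga_stump(X, y, w, pop=20, gens=15, seed=0):
    """Hypothetical evolutionary alternative: evolve (threshold, polarity)
    pairs by truncation selection and Gaussian mutation of the threshold,
    instead of the exhaustive greedy scan above."""
    rng = random.Random(seed)
    lo, hi = min(X), max(X)
    def fitness(ind):
        t, p = ind
        return -sum(wi for wi, xi, yi in zip(w, X, y)
                    if (p if xi >= t else -p) != yi)
    popn = [(rng.uniform(lo, hi), rng.choice((1, -1))) for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=fitness, reverse=True)
        parents = popn[:pop // 2]                       # keep the fitter half
        children = [(t + rng.gauss(0, (hi - lo) * 0.1), p)  # mutate threshold
                    for t, p in parents]
        popn = parents + children
    err, t, p = (-fitness(max(popn, key=fitness)),) + max(popn, key=fitness)
    return err, t, p

def adaboost(X, y, rounds=10):
    """Boost stumps: each round reweights the data toward the points the
    current ensemble misclassifies."""
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        err, thresh, pol = train_stump(X, y, w)
        err = max(err, 1e-10)                # avoid log(0) / division by zero
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, thresh, pol))
        w = [wi * math.exp(-alpha * yi * (pol if xi >= thresh else -pol))
             for wi, xi, yi in zip(w, X, y)]
        s = sum(w)
        w = [wi / s for wi in w]             # renormalise the weights
    return ensemble

def predict(ensemble, x):
    """Sign of the alpha-weighted vote of all stumps."""
    score = sum(a * (p if x >= t else -p) for a, t, p in ensemble)
    return 1 if score >= 0 else -1
```

The greedy `train_stump` is optimal per round but myopic across rounds; the thesis's idea, as the abstract describes it, is to replace this greedy hypothesis search with an evolutionary one, for which `ga_stump` gives only the flavour.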
Identifier | oai:union.ndltd.org:METU/oai:etd.lib.metu.edu.tr:http://etd.lib.metu.edu.tr/upload/3/12608432/index.pdf |
Date | 01 June 2007 |
Creators | Yalabik, Ismet |
Contributors | Yarman-vural, Fatos Tunay |
Publisher | METU |
Source Sets | Middle East Technical Univ. |
Language | English |
Detected Language | English |
Type | M.S. Thesis |
Format | text/pdf |
Rights | To liberate the content for public access |