We develop a theoretical analysis of the generalization performance of regularized least-squares on reproducing kernel Hilbert spaces for supervised learning. We show that the effective dimension of an integral operator plays a central role in defining a criterion for choosing the regularization parameter as a function of the number of samples. A minimax analysis is then performed, showing that this criterion is asymptotically optimal.
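A minimal sketch of the objects named in the abstract, not the authors' implementation: it fits a regularized least-squares (kernel ridge regression) estimator in an RKHS and evaluates the empirical effective dimension N(lambda) = tr(T_n (T_n + lambda I)^{-1}), where T_n is the empirical kernel integral operator. The Gaussian kernel, the synthetic data, and the candidate lambda values are illustrative assumptions, not choices made in the paper.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and Y (illustrative choice)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def krr_fit(X, y, lam, sigma=1.0):
    """Regularized least-squares in the RKHS: solve (K + n*lam*I) alpha = y,
    giving the estimator f(x) = sum_i alpha_i k(x, x_i)."""
    n = len(y)
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def effective_dimension(K, lam):
    """Empirical effective dimension N(lam) = tr(T_n (T_n + lam*I)^{-1}),
    with T_n = K / n the empirical kernel integral operator."""
    n = K.shape[0]
    Tn = K / n
    return np.trace(Tn @ np.linalg.inv(Tn + lam * np.eye(n)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 200
    X = rng.uniform(-1, 1, size=(n, 1))
    y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(n)

    K = gaussian_kernel(X, X)
    alpha = krr_fit(X, y, lam=1e-2)
    # N(lambda) shrinks as lambda grows; balancing it against n drives the
    # sample-dependent choice of the regularization parameter discussed above.
    for lam in [1e-1, 1e-2, 1e-3]:
        print(f"lambda={lam:g}  N(lambda) = {effective_dimension(K, lam):.2f}")
```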
Identifier | oai:union.ndltd.org:MIT/oai:dspace.mit.edu:1721.1/30539 |
Date | 14 April 2005 |
Creators | Caponnetto, Andrea; De Vito, Ernesto |
Source Sets | M.I.T. Theses and Dissertations |
Language | en_US |
Detected Language | English |
Format | 25 p., 16130108 bytes, 833989 bytes, application/postscript, application/pdf |
Relation | Massachusetts Institute of Technology Computer Science and Artificial Intelligence Laboratory |