
Fast Rates for Regularized Least-squares Algorithm

We develop a theoretical analysis of the generalization performance of regularized least-squares on reproducing kernel Hilbert spaces for supervised learning. We show that the effective dimension of an integral operator plays a central role in defining a criterion for choosing the regularization parameter as a function of the number of samples. A minimax analysis then shows the asymptotic optimality of this criterion.
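The abstract refers to the regularized least-squares (kernel ridge regression) estimator on a reproducing kernel Hilbert space and to the effective dimension of the associated integral operator. The sketch below is a purely illustrative empirical counterpart, not taken from the report: it assumes a Gaussian kernel, coefficients obtained by solving (K + n*lambda*I) alpha = y, and the empirical effective dimension trace(K (K + n*lambda*I)^{-1}).

    import numpy as np

    def gaussian_kernel(X1, X2, sigma=1.0):
        # Gaussian (RBF) kernel matrix between two sets of points (illustrative choice).
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))

    def rls_fit(X, y, lam, sigma=1.0):
        # Regularized least-squares in the RKHS: solve (K + n*lam*I) alpha = y.
        n = X.shape[0]
        K = gaussian_kernel(X, X, sigma)
        alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
        return alpha, K

    def effective_dimension(K, lam):
        # Empirical effective dimension: trace(K (K + n*lam*I)^{-1}),
        # a sample-based analogue of the operator quantity in the abstract.
        n = K.shape[0]
        return np.trace(K @ np.linalg.inv(K + n * lam * np.eye(n)))

    # Toy usage: noisy sine regression, with a hypothetical lambda = 1e-2.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

    alpha, K = rls_fit(X, y, lam=1e-2)
    print("effective dimension N(lambda):", effective_dimension(K, 1e-2))

In this picture, the criterion discussed in the abstract amounts to letting lambda shrink with the sample size n at a rate governed by how fast the effective dimension grows as lambda decreases.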

Identifier: oai:union.ndltd.org:MIT/oai:dspace.mit.edu:1721.1/30539
Date: 14 April 2005
Creators: Caponnetto, Andrea; De Vito, Ernesto
Source Sets: M.I.T. Theses and Dissertations
Language: en_US
Detected Language: English
Format: 25 p., 16130108 bytes, 833989 bytes, application/postscript, application/pdf
Relation: Massachusetts Institute of Technology Computer Science and Artificial Intelligence Laboratory
