We show that recent results in [3] on risk bounds for regularized least-squares on reproducing kernel Hilbert spaces extend straightforwardly to the vector-valued regression setting. We first briefly introduce the central concepts of operator-valued kernels, and then show how the risk bounds can be expressed in terms of a generalization of the effective dimension.
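The abstract concerns regularized least-squares with operator-valued kernels. As a minimal illustrative sketch (not the report's own implementation), the following assumes the common separable case K(x, x') = k(x, x')·B, with k a Gaussian scalar kernel and B a symmetric positive-semidefinite T×T matrix; all function names and parameters here are hypothetical. The representer theorem gives f(x) = Σ_j k(x, x_j) B c_j, and the coefficient matrix C solves K C B + nλ C = Y, which diagonalizing B reduces to T independent scalar-kernel systems.

```python
import numpy as np

def rbf(X1, X2, gamma=1.0):
    # Gaussian scalar kernel k(x, x') = exp(-gamma * ||x - x'||^2)
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_vector_rls(X, Y, B, lam, gamma=1.0):
    """Vector-valued regularized least squares with the separable
    operator-valued kernel K(x, x') = k(x, x') B (B symmetric PSD).
    Solves K C B + n*lam*C = Y for the n x T coefficient matrix C."""
    n = X.shape[0]
    K = rbf(X, X, gamma)
    mu, V = np.linalg.eigh(B)          # B = V diag(mu) V^T
    Yt = Y @ V                         # rotate outputs into B's eigenbasis
    Ct = np.empty_like(Yt)
    for t in range(Yt.shape[1]):
        # per-component system: (mu_t K + n*lam I) c_t = y_t
        Ct[:, t] = np.linalg.solve(mu[t] * K + n * lam * np.eye(n), Yt[:, t])
    return Ct @ V.T                    # rotate coefficients back

def predict(Xnew, Xtr, C, B, gamma=1.0):
    # f(x) = sum_j k(x, x_j) B c_j, stacked over rows of Xnew
    return rbf(Xnew, Xtr, gamma) @ C @ B
```

When B is the identity this collapses to T independent scalar kernel ridge regressions; a non-trivial B couples the output components.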
Identifer | oai:union.ndltd.org:MIT/oai:dspace.mit.edu:1721.1/30543 |
Date | 16 May 2005 |
Creators | De Vito, Ernesto, Caponnetto, Andrea |
Source Sets | M.I.T. Theses and Dissertations |
Language | en_US |
Detected Language | English |
Format | 17 p., 12090406 bytes, 642646 bytes, application/postscript, application/pdf |
Relation | Massachusetts Institute of Technology Computer Science and Artificial Intelligence Laboratory |