Kernel methods are a well-studied approach for addressing regression problems by implicitly mapping input variables into possibly infinite-dimensional feature spaces, particularly in cases where standard linear regression fails to capture non-linear relationships in the data. Since the primal (standard) solution requires inverting a matrix whose size is governed by the number of features, while the dual (kernel) solution requires inverting a matrix whose size is governed by the number of training samples, the choice between standard ridge regression and kernel ridge regression can be seen as a tradeoff between constraints on the number of features and on the number of training samples. Our results show that the Gaussian kernel consistently achieves the lowest mean squared error for the largest considered training size, while standard ridge regression exhibits a higher mean squared error but a lower fit time. We have proven algebraically that the solutions of standard ridge regression and kernel ridge regression are mathematically equivalent.
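As a minimal sketch of the stated equivalence (illustrative only, not the thesis code), the primal ridge solution w = (XᵀX + λI)⁻¹Xᵀy and the dual kernel solution w = Xᵀ(XXᵀ + λI)⁻¹y can be compared numerically, for example with scikit-learn's Ridge and KernelRidge using a linear kernel and no intercept; the data and parameter values below are assumptions chosen for illustration.

```python
# Sketch: numerical check that ridge regression and kernel ridge
# regression with a linear kernel give the same predictions.
# Data, seed, and alpha are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))                    # 50 samples, 5 features
y = X @ rng.standard_normal(5) + 0.1 * rng.standard_normal(50)

alpha = 1.0
# Primal form: w = (X^T X + alpha*I)^{-1} X^T y  (solves a d x d system)
primal = Ridge(alpha=alpha, fit_intercept=False).fit(X, y)
# Dual form: w = X^T (X X^T + alpha*I)^{-1} y, via the linear kernel
# K = X X^T  (solves an n x n system)
dual = KernelRidge(alpha=alpha, kernel="linear").fit(X, y)

# Both formulations yield the same predictions up to floating-point error
print(np.allclose(primal.predict(X), dual.predict(X)))  # True
```

The primal fit scales with the number of features d, the dual fit with the number of samples n, which is the tradeoff described above.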
Identifier | oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:lnu-126158 |
Date | January 2023 |
Creators | Rossmann, Tom Lennart |
Publisher | Linnéuniversitetet, Institutionen för matematik (MA) |
Source Sets | DiVA Archive at Upsalla University |
Language | English |
Detected Language | English |
Type | Student thesis, info:eu-repo/semantics/bachelorThesis, text |
Format | application/pdf |
Rights | info:eu-repo/semantics/openAccess |