Support vector regression has been applied to stock market forecasting problems. However, the hyperparameters of the kernel functions usually have to be tuned manually. Multiple-kernel learning was developed to deal with this problem: the kernel matrix weights and Lagrange multipliers can be derived simultaneously through semidefinite programming. However, the time and space requirements of this approach are very demanding. We develop a two-stage multiple-kernel learning algorithm that incorporates sequential minimal optimization and the gradient projection method.
With this algorithm, the advantages of different hyperparameter settings can be combined and overall system performance can be improved. Moreover, the user need not specify hyperparameter settings in advance, so trial-and-error tuning of the hyperparameters can be avoided. Experimental results, obtained on datasets taken from the Taiwan Capitalization Weighted Stock Index, show that our method performs better than other methods.
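The two-stage idea in the abstract can be illustrated with a small sketch: with the kernel weights fixed, fit a regressor on the combined kernel matrix; then take a projected-gradient step on the weights over the probability simplex. This is not the thesis's exact algorithm — kernel ridge regression stands in for the SMO-based SVR solver, the weight-update rule is a common MKL heuristic, and all function names, hyperparameters, and the learning rate are our own assumptions for illustration.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    """Gaussian (RBF) kernel matrix between rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def project_simplex(v):
    """Euclidean projection onto the simplex {w >= 0, sum(w) = 1}."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1.0))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)
    return np.maximum(v - theta, 0.0)

def two_stage_mkl(X, y, gammas=(0.1, 1.0, 10.0), iters=30, lr=0.1, lam=1e-2):
    """Illustrative two-stage multiple-kernel learning (hypothetical settings)."""
    Ks = [rbf_kernel(X, X, g) for g in gammas]   # one kernel per gamma setting
    n = len(y)
    w = np.full(len(Ks), 1.0 / len(Ks))          # start from uniform weights
    beta = np.zeros(n)
    for _ in range(iters):
        # Stage 1: weights fixed -> fit the regressor on the combined kernel
        # (kernel ridge regression replaces the SMO-based SVR solver here).
        K = sum(wi * Ki for wi, Ki in zip(w, Ks))
        beta = np.linalg.solve(K + lam * np.eye(n), y)
        # Stage 2: projected-gradient step on the kernel weights; kernels with
        # larger beta^T K_m beta gain weight after projection onto the simplex.
        grad = np.array([-0.5 * beta @ Km @ beta for Km in Ks])
        w = project_simplex(w - lr * grad)
    return w, beta, Ks

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-2.0, 2.0, size=(60, 1))
    y = np.sin(2.0 * X[:, 0]) + 0.05 * rng.normal(size=60)
    w, beta, Ks = two_stage_mkl(X, y)
    K = sum(wi * Ki for wi, Ki in zip(w, Ks))
    print(w, float(np.sqrt(((K @ beta - y) ** 2).mean())))
```

Because the weights are re-projected onto the simplex after every step, the user never chooses a single gamma: poorly matched kernels are driven toward zero weight automatically, which is the trial-and-error avoidance the abstract refers to.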
Identifer | oai:union.ndltd.org:NSYSU/oai:NSYSU:etd-0805109-121651 |
Date | 05 August 2009 |
Creators | Huang, Chi-wei |
Contributors | Chaur-Heh Hsieh, Chung-Ming Kuo, Wen-Yang Lin, Shie-Jue Lee, Tzung-Pei Hong |
Publisher | NSYSU |
Source Sets | NSYSU Electronic Thesis and Dissertation Archive |
Language | Cholon |
Detected Language | English |
Type | text |
Format | application/pdf |
Source | http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0805109-121651 |
Rights | unrestricted, Copyright information available at source archive |