This thesis develops efficient algorithms for solving fundamental engineering problems in data science and machine learning. We investigate a variety of acceleration techniques for improving the convergence of optimization algorithms. First, we show how problem structure can be exploited to accelerate the solution of highly structured problems such as the generalized eigenvalue problem and elastic net regression. We then consider Anderson acceleration, a generic and parameter-free extrapolation scheme, and show how it can be adapted to accelerate practical convergence of proximal gradient methods for a broad class of non-smooth problems. For all the methods developed in this thesis, we design novel algorithms, analyze their convergence rates, and conduct experiments on real-world data sets.
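To make the Anderson-accelerated proximal gradient idea concrete, the sketch below applies window-limited Anderson extrapolation to the ISTA fixed-point map for a lasso problem. This is an illustrative sketch only, not the algorithm from the thesis: the lasso instance, the function names (soft_threshold, prox_grad_step, anderson_prox_grad), and the window size m are assumptions made for the example, and no safeguarding is included, whereas practical variants typically add safeguards to retain convergence guarantees.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_step(x, A, b, lam, step):
    """One proximal-gradient (ISTA) step for min 0.5*||Ax-b||^2 + lam*||x||_1."""
    return soft_threshold(x - step * A.T @ (A @ x - b), step * lam)

def anderson_prox_grad(A, b, lam, m=5, iters=200):
    """Anderson acceleration (window m) of the ISTA fixed-point map (illustrative)."""
    n = A.shape[1]
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1/L with L = ||A||_2^2
    x = np.zeros(n)
    G, F = [], []                               # histories of g(x) and residuals g(x)-x
    for _ in range(iters):
        gx = prox_grad_step(x, A, b, lam, step)
        G.append(gx); F.append(gx - x)
        G, F = G[-(m + 1):], F[-(m + 1):]       # keep a sliding window of m differences
        if len(F) > 1:
            # Small least-squares problem for the extrapolation weights.
            dF = np.column_stack([F[i + 1] - F[i] for i in range(len(F) - 1)])
            dG = np.column_stack([G[i + 1] - G[i] for i in range(len(G) - 1)])
            gamma, *_ = np.linalg.lstsq(dF, F[-1], rcond=None)
            x = G[-1] - dG @ gamma              # extrapolated iterate
        else:
            x = gx                              # plain ISTA step to warm up
    return x
```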
Identifier | oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:kth-263147
Date | January 2019 |
Creators | Van Mai, Vien |
Publisher | KTH, Reglerteknik, Stockholm |
Source Sets | DiVA Archive at Uppsala University
Language | English |
Detected Language | English |
Type | Licentiate thesis, monograph, info:eu-repo/semantics/masterThesis, text |
Format | application/pdf |
Rights | info:eu-repo/semantics/openAccess |
Relation | TRITA-EECS-AVL ; 2019:80 |