Nonparametric convex regression has been extensively studied over the last two decades. It has been shown that any Lipschitz convex function can be approximated to arbitrary accuracy by a maximum of linear functions. Using this framework, in this thesis we generalize convex regression to learning an arbitrary Bregman divergence and to learning a difference of convex functions. We provide approximation guarantees and sample complexity bounds for both of these extensions. Furthermore, we provide algorithms to solve the resulting optimization problems based on the 2-block alternating direction method of multipliers (ADMM). For this algorithm, we provide convergence guarantees with an iteration complexity of O(n√d/ε) for a dataset X ∈ ℝ^{n×d} and arbitrary positive ε. Finally, we provide experiments for both Bregman divergence learning and difference-of-convex-functions learning on UCI datasets that demonstrate state-of-the-art performance on regression and classification datasets.
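As a brief illustration of the max-of-linear-functions framework mentioned above (a minimal sketch, not code from the thesis), the snippet below approximates the convex function f(x) = x² on [-1, 1] by the pointwise maximum of tangent lines taken at a handful of anchor points; the function, interval, and number of pieces are illustrative assumptions chosen for the example.

    import numpy as np

    # Target convex function and its gradient (illustrative choice).
    def f(x):
        return x ** 2

    def f_grad(x):
        return 2 * x

    # Anchor points at which tangent (affine) pieces are taken.
    anchors = np.linspace(-1.0, 1.0, 8)

    def max_affine_approx(x):
        # Each tangent line: f(a) + f'(a) * (x - a); the approximation
        # is the pointwise maximum over all tangent lines.
        lines = f(anchors) + f_grad(anchors) * (x[:, None] - anchors)
        return lines.max(axis=1)

    x_test = np.linspace(-1.0, 1.0, 1000)
    err = np.max(np.abs(f(x_test) - max_affine_approx(x_test)))
    print(f"max approximation error with {len(anchors)} pieces: {err:.4f}")

Increasing the number of anchor points shrinks the worst-case error, which is the sense in which a Lipschitz convex function can be approximated arbitrarily well by a max of linear functions.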
Identifier | oai:union.ndltd.org:bu.edu/oai:open.bu.edu:2144/43719 |
Date | 26 January 2022 |
Creators | Siahkamari, Ali |
Contributors | Kulis, Brian, Saligrama, Venkatesh |
Source Sets | Boston University |
Language | en_US |
Detected Language | English |
Type | Thesis/Dissertation |