Modern data science applications demand solving large-scale optimization problems. The prevalent approaches are first-order methods, valued for their scalability. These methods are routinely applied to highly irregular problems where assumptions of convexity and smoothness are untenable.
To deepen the understanding of these methods, we study first-order methods with constant step size for minimizing locally Lipschitz tame functions. To do so, we propose notions of discrete Lyapunov stability for optimization methods. For common first-order methods, we provide necessary and sufficient conditions for stability. We also show that certain local minima can be unstable, even without additional noise in the method. Our analysis relies on the connection between the iterates of the first-order methods and continuous-time dynamics.
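A toy sketch (not taken from the thesis) of the setting studied: subgradient descent with a constant step size on the nonsmooth function f(x) = |x|. With a fixed step, the iterates do not settle at the minimizer x = 0 but eventually oscillate within a band whose width is on the order of the step size, illustrating why the stability of iterates near a minimizer is a nontrivial question for constant-step first-order methods.

```python
# Toy illustration (assumption: f(x) = |x|, not an example from the thesis).
# Constant-step subgradient method: x_{k+1} = x_k - step * g_k,
# where g_k is a subgradient of |x| at x_k.

def subgradient_step(x, step):
    """One constant-step subgradient update for f(x) = |x|."""
    g = 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)  # subgradient of |x|
    return x - step * g

x, step = 0.3, 0.25
trajectory = [x]
for _ in range(8):
    x = subgradient_step(x, step)
    trajectory.append(x)

# The iterates remain bounded near 0 but cycle instead of converging.
print(trajectory)
```

After the first step the iterates alternate between 0.05 and -0.2: they stay within one step size of the minimizer (a Lyapunov-stability-like property) yet never converge to it.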
Identifier | oai:union.ndltd.org:columbia.edu/oai:academiccommons.columbia.edu:10.7916/rp8t-sn05 |
Date | January 2024 |
Creators | Lai, Lexiao |
Source Sets | Columbia University |
Language | English |
Detected Language | English |
Type | Theses |