
Tractable relaxations and efficient algorithmic techniques for large-scale optimization

Kilinc-Karzan, Fatma (21 June 2011)
In this thesis, we develop tractable relaxations and efficient algorithms for large-scale optimization. Our developments are motivated by a recent paradigm, Compressed Sensing (CS), which consists of directly acquiring low-dimensional linear projections of signals, possibly corrupted with noise, and then using sophisticated recovery procedures for signal reconstruction. We start by analyzing how to utilize a priori information given in the form of sign restrictions on part of the entries. We propose necessary and sufficient conditions on the sensing matrix for exact recovery of sparse signals, utilize them in deriving error bounds under imperfect conditions, suggest verifiable sufficient conditions, and establish their limits of performance.

In the second part of this thesis, we study the CS synthesis problem: selecting the minimum number of rows from a given matrix so that the resulting submatrix possesses certifiably good recovery properties. We express the synthesis problem as the problem of approximating a given matrix by a matrix of specified low rank in the uniform norm, and we develop a randomized algorithm for this problem.

The third part is dedicated to efficient First-Order Methods (FOMs) for large-scale, well-structured convex optimization problems. We propose FOMs with stochastic oracles that come with exact guarantees on solution quality and achieve sublinear time behavior, and through extensive simulations we show considerable improvement over state-of-the-art deterministic FOMs.

In the last part, we examine a general sparse estimation problem: estimating a block-sparse linear transform of a signal from undersampled observations of the signal corrupted with nuisance and stochastic noise. We show that an extension of the earlier results to this more general framework is possible. In particular, we suggest estimators that have efficiently verifiable guarantees of performance, and we provide connections to well-known results in CS theory.
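The sign-restriction idea in the first part can be made concrete with a standard formulation that is not taken from the thesis itself: when some entries of the sparse signal are known a priori to be nonnegative, the usual l1-minimization recovery can be written as a linear program in which those entries are bounded below by zero. The minimal sketch below assumes an illustrative sensing matrix A, observations y, and index set nonneg_idx of sign-restricted entries, and solves the LP with scipy; it illustrates sign-constrained l1 recovery only, not the verifiable conditions or error bounds developed in the thesis.

# Hypothetical illustration (not the thesis's algorithm): l1-minimization
# recovery with a priori sign information, cast as a linear program.
import numpy as np
from scipy.optimize import linprog

def l1_recover_with_signs(A, y, nonneg_idx=()):
    """Solve  min ||x||_1  s.t.  A x = y,  x_i >= 0 for i in nonneg_idx.

    Variables are stacked as z = [x, t]; |x_i| <= t_i is enforced via
    the inequalities  x - t <= 0  and  -x - t <= 0.
    """
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(n)])       # minimize sum(t)
    A_eq = np.hstack([A, np.zeros((m, n))])              # A x = y
    I = np.eye(n)
    A_ub = np.vstack([np.hstack([I, -I]),                #  x - t <= 0
                      np.hstack([-I, -I])])              # -x - t <= 0
    b_ub = np.zeros(2 * n)
    restricted = set(nonneg_idx)
    bounds = [(0, None) if i in restricted else (None, None)
              for i in range(n)] + [(0, None)] * n       # t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
                  bounds=bounds, method="highs")
    return res.x[:n]

# Toy usage: a sparse nonnegative signal observed through a random matrix.
rng = np.random.default_rng(0)
n, m, k = 40, 20, 3
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.uniform(1, 2, k)
A = rng.standard_normal((m, n))
x_hat = l1_recover_with_signs(A, A @ x_true, nonneg_idx=range(n))
print("max recovery error:", np.abs(x_hat - x_true).max())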
