1

A level set approach to integer nonlinear optimization

Hübner, Ruth, 22 October 2013
No description available.
2

CONTINUOUS RELAXATION FOR COMBINATORIAL PROBLEMS - A STUDY OF CONVEX AND INVEX PROGRAMS

Adarsh Barik (15359902), 27 April 2023
In this thesis, we study optimization problems that have a combinatorial aspect to them. The search space for such problems grows exponentially with the problem dimension, so exhaustive search becomes intractable and good relaxations are needed to solve combinatorial problems efficiently. A further challenge arises from the high dimensionality of such problems combined with the lack of a large number of samples. Our aim is to develop approaches that solve these problems with polynomial time and sample complexity. We discuss three combinatorial optimization problems and provide continuous relaxations for them, involving both convex and nonconvex (invex) relaxations. Furthermore, we provide efficient first-order algorithms for a general class of invex problems with provable convergence-rate guarantees. The three combinatorial problems studied in this work are: learning the directed structure of a Bayesian network from blackbox data, fair sparse regression on a biased dataset where the bias depends on a hidden binary attribute, and mixed linear regression. We propose a convex relaxation for the first problem, while the other two are solved using invex relaxation. For the first problem, we introduce a novel notion of low-rank representation of the conditional probability tables of a Bayesian network and connect it to the Fourier transform of real-valued set functions to recover the exact structure of the Bayesian network. For the second problem, we propose a novel invex relaxation of the combinatorial version of sparse linear regression with fairness. For the final problem, we again use invex relaxation to learn a mixture of sparse linear regression models. We formally show the correctness of our proposed methods and provide provable guarantees on computational and sample complexity. We also develop efficient first-order algorithms for invex problems and give a convergence-rate analysis for the proposed methods. Finally, we discuss possible future research directions and the problems we want to tackle in future work.
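The abstract describes replacing an intractable combinatorial search with a continuous (convex or invex) relaxation and solving it with a first-order method. As a minimal sketch of that general idea only, and not of the thesis's own formulations, the example below relaxes the combinatorial L0-sparsity constraint in linear regression to a convex L1 penalty and solves it with proximal gradient descent (ISTA); the function names, regularization weight, and step-size choice are illustrative assumptions.

```python
# Illustrative sketch only: a generic convex relaxation of a combinatorial
# sparsity constraint (L0 -> L1), solved with a simple first-order method
# (ISTA / proximal gradient). This is NOT the thesis's invex formulation;
# the names, lambda, and step size below are assumptions for the example.
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of the L1 norm: shrink each coordinate toward zero."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def ista_lasso(X, y, lam=0.1, n_iter=500):
    """Minimize (1/2n)||y - Xw||^2 + lam * ||w||_1 via proximal gradient."""
    n, d = X.shape
    w = np.zeros(d)
    # Step size from the Lipschitz constant of the smooth least-squares term.
    L = np.linalg.norm(X, 2) ** 2 / n
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n               # gradient of the smooth part
        w = soft_threshold(w - grad / L, lam / L)  # prox step handles the L1 term
    return w

# Toy usage: recover a sparse coefficient vector from noisy linear measurements.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
w_true = np.zeros(50)
w_true[:5] = 1.0
y = X @ w_true + 0.01 * rng.standard_normal(200)
w_hat = ista_lasso(X, y, lam=0.05)
print(np.nonzero(np.abs(w_hat) > 1e-3)[0])  # support should concentrate on indices 0-4
```

The same pattern, a smooth gradient step followed by a proximal or projection step that enforces the relaxed constraint, underlies most first-order methods for continuously relaxed combinatorial problems.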
