1

Lower Bounds and Derandomization

Gardezi, Jaffer. January 2008.
A major open problem in complexity theory is to determine whether randomized complexity classes such as BPP, AM, and MA have any nontrivial derandomization. This thesis investigates the derandomization of two randomized versions of the polynomial hierarchy.
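For reference, the class BPP (bounded-error probabilistic polynomial time) is standardly defined as follows; this is textbook background, not a definition taken from the thesis:

    A language L is in BPP if there is a probabilistic polynomial-time
    Turing machine M such that
    \[
      x \in L \;\Rightarrow\; \Pr[M(x) \text{ accepts}] \ge \tfrac{2}{3},
      \qquad
      x \notin L \;\Rightarrow\; \Pr[M(x) \text{ accepts}] \le \tfrac{1}{3}.
    \]

Derandomization asks when such classes coincide with their deterministic counterparts, e.g. whether BPP = P; MA and AM are the analogous classes in which a prover (Merlin) interacts with a randomized verifier (Arthur).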
2

Improved Bi-criteria Approximation for the All-or-Nothing Multicommodity Flow Problem in Arbitrary Networks

January 2020.
This thesis addresses the following fundamental maximum-throughput routing problem: given an arbitrary edge-capacitated n-node directed network and a set of k commodities, with source-destination pairs (s_i, t_i) and demands d_i > 0, admit and route the largest possible number of commodities -- i.e., the maximum throughput -- so that their demands are fully satisfied. The main contributions of this thesis are threefold. First, a bi-criteria approximation algorithm is presented for this all-or-nothing multicommodity flow (ANF) problem. This algorithm is the first to achieve a constant approximation of the maximum throughput with an edge-capacity violation ratio that is at most logarithmic in n, with high probability. The approach is based on a version of randomized rounding that keeps flows splittable, rather than approximating them via a non-splittable path for each commodity; this allows it to work for arbitrary directed edge-capacitated graphs, unlike most prior work on the ANF problem. The algorithm also handles weighted throughput, where the benefit gained by fully satisfying the demand of commodity i is determined by a given weight w_i > 0. Second, a derandomization of the algorithm is presented that maintains the same approximation bounds, using novel pessimistic estimators for Bernstein's inequality. In addition, it is shown how the framework can be adapted to achieve a polylogarithmic fraction of the maximum throughput while maintaining a constant edge-capacity violation, provided the network capacity is large enough. Lastly, an important aspect of both the randomized and derandomized algorithms is their simplicity, which lends itself to efficient implementations in practice. Implementations of both algorithms are presented and demonstrate their efficiency in practice.
Masters Thesis, Computer Science, 2020.
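To make the rounding step concrete, the following minimal Python sketch shows the general shape of the randomized rounding described above; the data layout, the names, and the scaling constant are illustrative assumptions, not the thesis's actual implementation:

    import random

    def round_anf(fractional, flows, scale=0.5):
        # Randomized-rounding sketch for the all-or-nothing flow problem.
        # fractional[i] is the LP value x_i in [0, 1]: the fraction of
        # commodity i's demand routed by the LP relaxation.
        # flows[i] maps each edge to commodity i's flow on it in the
        # LP solution.
        admitted, load = [], {}
        for i, x in enumerate(fractional):
            # Admit commodity i independently with probability scale * x_i;
            # if admitted, satisfy its full demand by scaling its
            # (splittable) LP flow up by 1 / x_i.
            if x > 0 and random.random() < scale * x:
                admitted.append(i)
                for edge, f in flows[i].items():
                    load[edge] = load.get(edge, 0.0) + f / x
        # Concentration bounds are what keep load[e] within an O(log n)
        # factor of capacity with high probability.
        return admitted, load

The derandomized variant replaces the coin flips with deterministic choices guided by pessimistic estimators for Bernstein's inequality, as described above.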
3

Expandergraphen und Derandomisierung (Expander Graphs and Derandomization)

Fetsch, Christian. 20 October 2017.
This thesis works through in detail the article 'Derandomizing the Ahlswede-Winter matrix-valued Chernoff bound using pessimistic estimators, and applications' by Wigderson and Xiao. It presents the proof of the matrix-valued Chernoff inequality of Ahlswede and Winter. Using these results together with the method of pessimistic estimators, the proof of the Alon-Roichman theorem is then derandomized.
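The method of pessimistic estimators turns a probabilistic existence proof into a greedy deterministic construction. The Python sketch below shows the general shape of the technique for picking expander generators; the potential function, the parameter t, and the interface are illustrative assumptions, and details such as centering the matrices are omitted:

    import numpy as np
    from scipy.linalg import expm

    def greedy_generators(candidates, k, t=0.5):
        # candidates: symmetric n x n matrices (e.g. symmetrized
        # permutation matrices of group elements).
        # Greedily pick k of them, at each step choosing the candidate
        # that minimizes the trace-exponential potential
        # Tr exp(t * running_sum), a pessimistic estimator that
        # upper-bounds the chance of a large top eigenvalue of the sum.
        n = candidates[0].shape[0]
        total = np.zeros((n, n))
        chosen = []
        for _ in range(k):
            best = min(candidates,
                       key=lambda M: np.trace(expm(t * (total + M))))
            chosen.append(best)
            total = total + best
        return chosen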
4

Theoretical Aspects of Randomization in Computation

Vishnoi, Nisheeth Kumar. 12 July 2004.
Randomness has proved to be a powerful tool throughout computation. It is pervasive in areas such as networking, machine learning, computer graphics, optimization, and computational number theory, and it is "necessary" for cryptography. Although randomized algorithms and protocols assume access to "truly" random bits, in practice they rely on the output of "imperfect" sources of randomness such as pseudo-random number generators or physical sources. Hence, from a theoretical standpoint, it becomes important to view randomness as a resource and to study the following fundamental questions pertaining to it:
• Extraction: How do we generate "high quality" random bits from "imperfect" sources?
• Randomization: How do we use randomness to obtain efficient algorithms?
• Derandomization: How (and when) can we "remove" our dependence on random bits?
In this thesis, we consider important problems in these three prominent and diverse areas. In randomness extraction, we present extractors for "oblivious bit-fixing sources". In (a non-traditional use of) randomization, we obtain results in machine learning (learning juntas) and prove hardness of lattice problems. In derandomization, we present a deterministic algorithm for the fundamental problem of "identity testing". We also initiate a complexity-theoretic study of Hilbert's 17th problem, in which identity testing is used in an interesting manner. A common theme in this work is the use of tools from areas such as number theory in a variety of ways; often the techniques themselves are quite interesting.
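For context, the standard randomized algorithm for identity testing, which a deterministic algorithm replaces, is the Schwartz-Zippel test. A minimal Python sketch with an illustrative black-box interface:

    import random

    def probably_identical(p, q, num_vars, degree, trials=20):
        # p, q: black-box polynomials, callable on a tuple of integers.
        # degree: an upper bound on the total degree of p - q.
        # By the Schwartz-Zippel lemma, if p != q, a uniformly random
        # point from S^num_vars exposes the difference with probability
        # at least 1 - degree / len(S) per trial.
        S = range(100 * degree + 1)
        for _ in range(trials):
            point = tuple(random.choice(S) for _ in range(num_vars))
            if p(point) != q(point):
                return False  # found a disagreement: definitely not identical
        return True  # identical with high probability

For example, probably_identical(lambda v: (v[0] + v[1]) ** 2, lambda v: v[0] ** 2 + 2 * v[0] * v[1] + v[1] ** 2, num_vars=2, degree=2) always returns True, since the two polynomials agree identically.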
