
Lower bounds in communication complexity and learning theory via analytic methods

Sherstov, Alexander Alexandrovich (23 October 2009)
A central goal of theoretical computer science is to characterize the limits of efficient computation in a variety of models. We pursue this research objective in the contexts of communication complexity and computational learning theory. In the former case, one seeks to understand which distributed computations require a significant amount of communication among the parties involved. In the latter case, one aims to rigorously explain why computers cannot master some prediction tasks or learn from past experience. While communication and learning may seem to have little in common, they turn out to be closely related, and much insight into both can be gained by studying them jointly. Such is the approach pursued in this thesis. We answer several fundamental questions in communication complexity and learning theory and in so doing discover new relations between the two topics. A consistent theme in our work is the use of analytic methods to solve the problems at hand, such as approximation theory, Fourier analysis, matrix analysis, and duality.

We contribute a novel technique, the pattern matrix method, for proving lower bounds on communication. Using our method, we solve an open problem due to Krause and Pudlák (1997) on the comparative power of two well-studied circuit classes: majority circuits and constant-depth AND/OR/NOT circuits. Next, we prove that the pattern matrix method applies not only to classical communication but also to the more powerful quantum model. In particular, we contribute lower bounds for a new class of quantum communication problems, broadly subsuming the celebrated work by Razborov (2002), who used different techniques. In addition, our method has enabled considerable progress by a number of researchers in the area of multiparty communication.

Second, we study unbounded-error communication, a natural model with applications to matrix analysis, circuit complexity, and learning. We obtain essentially optimal lower bounds for all symmetric functions, giving the first strong results for unbounded-error communication in years. Next, we resolve a longstanding open problem due to Babai, Frankl, and Simon (1986) on the comparative power of unbounded-error communication and alternation, showing that Σ₂^cc ⊄ UPP^cc. The latter result also yields an unconditional, exponential lower bound for learning DNF formulas by a large class of algorithms, which explains why this central problem in computational learning theory remains open after more than 20 years of research.

We establish the computational intractability of learning intersections of halfspaces, a major unresolved challenge in computational learning theory. Specifically, we obtain the first exponential, near-optimal lower bounds for the learning complexity of this problem in Kearns’ statistical query model, Valiant’s PAC model (under standard cryptographic assumptions), and various analytic models. We also prove that the intersection of even two halfspaces on {0,1}^n cannot be sign-represented by a polynomial of degree less than Θ(√n), an exponential improvement on previous lower bounds that solves an open problem due to Klivans (2002).

We fully determine the relations and gaps among three key complexity measures of a communication problem: product discrepancy, sign-rank, and discrepancy. As an application, we solve an open problem due to Kushilevitz and Nisan (1997) on distributional complexity under product versus nonproduct distributions, and we separate the communication classes PP^cc and UPP^cc due to Babai, Frankl, and Simon (1986). We give interpretations of our results in purely learning-theoretic terms.
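As a reference point for the Θ(√n) statement above: a real polynomial p is said to sign-represent a Boolean function f: {0,1}^n → {-1,+1} if it agrees with f in sign on every input, and the threshold degree is the least degree of such a polynomial. The following is a minimal LaTeX rendering of this standard definition; the notation deg_± is common usage and is not taken from the thesis text itself.

\[
  \deg_{\pm}(f) \;=\; \min\bigl\{\, \deg p \;:\; p \ \text{a real polynomial},\ f(x)\,p(x) > 0 \ \text{for all } x \in \{0,1\}^n \,\bigr\}.
\]

In these terms, the quoted result says that the intersection of two halfspaces on {0,1}^n has threshold degree at least on the order of √n.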
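The three measures named in the final paragraph are standard quantities attached to the ±1 communication matrix M of a problem. The sketch below uses generic notation (M, R, μ, S, T) chosen for illustration rather than drawn from the thesis.

\[
  \operatorname{rk}_{\pm}(M) \;=\; \min\bigl\{\, \operatorname{rank} R \;:\; R_{xy}\,M_{xy} > 0 \ \text{for all } x, y \,\bigr\},
\]
\[
  \operatorname{disc}_{\mu}(M) \;=\; \max_{S \times T} \Bigl|\, \sum_{x \in S,\; y \in T} \mu(x,y)\, M_{xy} \,\Bigr|,
  \qquad
  \operatorname{disc}(M) \;=\; \min_{\mu} \operatorname{disc}_{\mu}(M),
\]

where the maximum ranges over combinatorial rectangles S × T and the minimum over distributions μ on input pairs; product discrepancy restricts that minimum to product distributions μ = μ_X × μ_Y. Roughly speaking, the logarithm of 1/disc(M) governs PP^cc-type communication and the logarithm of the sign-rank governs UPP^cc, which is the setting of the PP^cc versus UPP^cc separation mentioned above.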
