Analyzing the ever-increasing data of unprecedented scale, dimensionality, diversity, and complexity poses considerable challenges to conventional approaches to statistical modeling. Bayesian nonparametrics constitute a promising research direction, in that such techniques fit the data with models whose complexity can grow to match it. In this dissertation we consider nonparametric Bayesian modeling with completely random measures, a family of pure-jump stochastic processes with nonnegative increments. In particular, we study dictionary learning for sparse image representation using the beta process and the dependent hierarchical beta process, and we present the negative binomial process, a novel nonparametric Bayesian prior that unites the seemingly disjoint problems of count and mixture modeling. We show a wide variety of successful applications of our nonparametric Bayesian latent variable models to real problems in science and engineering, including count modeling, text analysis, image processing, compressive sensing, and computer vision.
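As an illustration of the two building blocks named in the abstract, the following is a minimal simulation sketch, assuming a standard finite (truncation-level K) approximation to the beta-Bernoulli process for latent feature assignment and the usual gamma-Poisson construction of a negative binomial count; the hyperparameter values (K, a, b, r, p) are illustrative choices, not taken from the dissertation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Finite approximation to the beta-Bernoulli process prior used for
# latent feature / dictionary-element assignment: K candidate atoms,
# each with a small Beta-distributed usage probability.
K, N = 1000, 50          # truncation level, number of data items (illustrative)
a, b = 2.0, 1.0          # illustrative concentration hyperparameters

pi = rng.beta(a / K, b * (K - 1) / K, size=K)   # atom usage probabilities
Z = rng.random((N, K)) < pi                     # binary feature-usage matrix

print("mean active features per item:", Z.sum(axis=1).mean())

# Gamma-Poisson mixture: integrating out the gamma rate yields a
# negative binomial count, the marginal behind negative binomial process
# count models.
r, p = 3.0, 0.4                                  # illustrative NB parameters
lam = rng.gamma(shape=r, scale=p / (1 - p), size=10000)
counts = rng.poisson(lam)
print("empirical mean:", counts.mean(), "NB mean:", r * p / (1 - p))
```

The first block shows the characteristic sparsity of beta-process feature models: even with a large truncation K, each item activates only a handful of atoms on average.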
Identifier | oai:union.ndltd.org:DUKE/oai:dukespace.lib.duke.edu:10161/7204
Date | January 2013 |
Creators | Zhou, Mingyuan |
Contributors | Carin, Lawrence |
Source Sets | Duke University |
Detected Language | English |
Type | Dissertation |