  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Linkage Based Dirichlet Processes

Song, Yuhyun 08 February 2017 (has links)
We live in the era of Big Data, with significantly richer computational resources than in the previous two decades. The concurrence of computational resources and large volumes of data has boosted researchers' desire to develop feasible Markov chain Monte Carlo (MCMC) algorithms for large parameter spaces. Dirichlet Process Mixture Models (DPMMs) have become a Bayesian mainstay for modeling heterogeneous structures, namely clusters, especially when the number of clusters is not known, using established MCMC methods. As opposed to many ad hoc clustering methods, using Dirichlet Processes (DPs) in models provides a flexible and probabilistic approach for automatically estimating both cluster structure and the number of clusters. Although DPs are nonparametric, they depend on both a base measure and a concentration parameter, which can heavily impact inferences. Determining the concentration parameter is critical, since it adjusts the a priori cluster expectation, but typical approaches for specifying this parameter are rather cavalier. In this work, we propose a new method for automatically and adaptively determining this parameter, which directly calibrates distances between clusters through an explicit link function within the DP. Furthermore, we extend our method to mixture models with Nested Dirichlet Processes (NDPs), which cluster multilevel data and depend on the specification of a vector of concentration parameters. We detail how to incorporate our method into Markov chain Monte Carlo algorithms, and illustrate our findings through a series of comparative simulation studies and applications. / Ph. D. / We live in the era of Big Data, with significantly richer computational resources than in the previous two decades. The concurrence of computational resources and large volumes of data has boosted researchers' desire to develop efficient Markov chain Monte Carlo (MCMC) algorithms for models such as the Dirichlet process mixture model. The Dirichlet process mixture model has become increasingly popular for clustering analyses because it provides a flexible, generative model that automatically determines both cluster structure and the number of clusters. However, the clustering solution inferred by a Dirichlet process mixture model is affected by two hyperparameters: a base measure and a concentration parameter. Determining the concentration parameter is critical, since it adjusts the a priori cluster expectation, but typical approaches for specifying this parameter are rather cavalier. In this work, we propose a new method for automatically and adaptively determining this parameter, which directly calibrates distances between clusters. Furthermore, we extend our method to mixture models with Nested Dirichlet Processes (NDPs), which cluster multilevel data and depend on the specification of a vector of concentration parameters. We present simulation studies that demonstrate the performance of the developed methods, along with applications such as modeling timelines from building construction data and clustering U.S. median household income data.
This work makes three contributions: 1) the developed methods are straightforward to incorporate into any Markov chain Monte Carlo algorithm; 2) the methods calibrate the concentration parameter using a probability distance between clusters, maximizing the information from the observations in the defined clusters; and 3) the methods extend to other generalizations of the Dirichlet process, for instance hierarchical Dirichlet processes or dependent Dirichlet processes.
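A rough illustration of the role the concentration parameter plays (a generic Dirichlet process sketch, not the linkage-based calibration developed in this thesis): under a DP with concentration parameter alpha, the prior expected number of clusters among n observations is sum_{i=0}^{n-1} alpha/(alpha + i), and the Chinese restaurant process makes this easy to simulate. The sample size and the alpha grid below are arbitrary illustrative choices.

    # Generic sketch of DP prior behaviour (not the method of this thesis):
    # the concentration parameter alpha sets the a priori expected number of
    # clusters, E[K] = sum_{i=0}^{n-1} alpha / (alpha + i).
    import numpy as np

    def expected_clusters(alpha: float, n: int) -> float:
        """Analytic prior expectation of the number of clusters under a DP."""
        return float(np.sum(alpha / (alpha + np.arange(n))))

    def crp_num_clusters(alpha: float, n: int, rng: np.random.Generator) -> int:
        """Number of clusters in one Chinese-restaurant-process draw of size n."""
        counts = []                                  # customers per table
        for _ in range(n):
            probs = np.array(counts + [alpha], dtype=float)
            probs /= probs.sum()
            table = rng.choice(len(probs), p=probs)
            if table == len(counts):
                counts.append(1)                     # open a new table (cluster)
            else:
                counts[table] += 1
        return len(counts)

    rng = np.random.default_rng(0)
    n = 500                                          # arbitrary sample size
    for alpha in (0.1, 1.0, 10.0):                   # arbitrary alpha grid
        draws = [crp_num_clusters(alpha, n, rng) for _ in range(200)]
        print(f"alpha={alpha:5.1f}  E[K]={expected_clusters(alpha, n):6.2f}  "
              f"mean simulated K={np.mean(draws):6.2f}")

Running this shows the simulated cluster counts tracking the analytic expectation across values of alpha, which is why a cavalier choice of the concentration parameter strongly biases the number of clusters favored a priori.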
12

Existence of a Sign-Changing Solution to a Superlinear Dirichlet Problem

Neuberger, John M. (John Michael) 08 1900 (has links)
We study the existence, multiplicity, and nodal structure of solutions to a superlinear elliptic boundary value problem. Under specific hypotheses on the superlinearity, we show that there exist at least three nontrivial solutions. Two of the solutions are of one sign (positive and negative, respectively), and the third solution changes sign exactly once. Our technique is variational, i.e., we study the critical points of the associated action functional to find solutions. First, we define a codimension 1 submanifold of a Sobolev space. This submanifold contains all weak solutions to our problem, and in our case, weak solutions are also classical solutions. We find nontrivial solutions which are local minimizers of our action functional restricted to various subsets of this submanifold. Additionally, if nondegenerate, the one-sign solutions are of Morse index 1 and the sign-changing solution has Morse index 2. We also establish that the action level of the sign-changing solution is bounded below by the sum of the two lesser levels of the one-sign solutions. Our results extend and complement the findings of Z. Q. Wang ([W]). We include a small sample of earlier works in the general area of superlinear elliptic boundary value problems.
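For orientation, the standard setting behind results of this kind (a sketch of the usual formulation; the precise superlinearity and growth hypotheses are those assumed in the thesis) is the boundary value problem

    -\Delta u = f(u) \quad \text{in } \Omega, \qquad u = 0 \quad \text{on } \partial\Omega,

with the associated action functional on the Sobolev space H_0^1(\Omega),

    J(u) = \frac{1}{2}\int_\Omega |\nabla u|^2 \, dx - \int_\Omega F(u)\, dx,
    \qquad F(t) = \int_0^t f(s)\, ds,

whose critical points are weak solutions. The codimension 1 submanifold mentioned in the abstract is, in this setting, typically a Nehari-type constraint set \{u \neq 0 : J'(u)\,u = 0\}, which contains every nontrivial weak solution.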
13

On the Dirichlet Prior and Bayesian Regularization

Steck, Harald, Jaakkola, Tommi S. 01 September 2002 (has links)
A common objective in learning a model from data is to recover its network structure, while the model parameters are of minor interest. For example, we may wish to recover regulatory networks from high-throughput data sources. In this paper we examine how Bayesian regularization using a Dirichlet prior over the model parameters affects the learned model structure in a domain with discrete variables. Surprisingly, a weak prior in the sense of smaller equivalent sample size leads to a strong regularization of the model structure (sparse graph) given a sufficiently large data set. In particular, the empty graph is obtained in the limit of a vanishing strength of prior belief. This is diametrically opposite to what one may expect in this limit, namely the complete graph from an (unregularized) maximum likelihood estimate. Since the prior affects the parameters as expected, the prior strength balances a "trade-off" between regularizing the parameters or the structure of the model. We demonstrate the benefits of optimizing this trade-off in the sense of predictive accuracy.
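A small numerical sketch of the effect described above (illustrative code, not taken from the paper): with a BDeu-style Dirichlet prior whose strength is set by an equivalent sample size (ESS), one can score a candidate edge X -> Y against the empty graph and watch how the preference shifts with the ESS. The counts below are arbitrary, weakly dependent illustrative data.

    # Sketch (not from the paper): BDeu-style log marginal likelihood of a
    # binary child Y, with or without a binary parent X, for several values
    # of the equivalent sample size (ESS) of the Dirichlet prior.
    import numpy as np
    from scipy.special import gammaln

    def family_score(counts: np.ndarray, ess: float) -> float:
        """Log marginal likelihood of one child variable under a BDeu prior.
        counts has shape (q, r): q parent configurations, r child states."""
        q, r = counts.shape
        a_ij = ess / q                 # prior mass per parent configuration
        a_ijk = ess / (q * r)          # prior mass per contingency cell
        n_ij = counts.sum(axis=1)
        score = np.sum(gammaln(a_ij) - gammaln(a_ij + n_ij))
        score += np.sum(gammaln(a_ijk + counts) - gammaln(a_ijk))
        return float(score)

    # Arbitrary, weakly dependent counts for binary X (rows) and Y (columns).
    n_xy = np.array([[210.0, 190.0],
                     [190.0, 210.0]])
    n_y = n_xy.sum(axis=0, keepdims=True)   # Y's counts when it has no parent

    for ess in (0.01, 0.1, 1.0, 10.0, 100.0, 1000.0):
        log_bf = family_score(n_xy, ess) - family_score(n_y, ess)
        print(f"ESS={ess:8.2f}  log Bayes factor (edge X->Y vs empty) = {log_bf:7.2f}")

With data like these, the log Bayes factor tends to grow with the ESS: a small ESS (weak prior) pushes the score toward the empty graph, while a very large ESS can tip it in favor of the edge even though the dependence is slight, mirroring the trade-off between regularizing the parameters and regularizing the structure.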
14

Ueber die Anwendung des Dirichletschen Prinzipes auf die Probleme der konformen Abbildung [On the application of the Dirichlet principle to the problems of conformal mapping]

Courant, Richard, January 1910 (has links)
Thesis (doctoral)--Georg-Augusts-Universität zu Göttingen, 1910. / Cover title. Vita. Includes bibliographical references.
15

Dirichlet series as a generalization of power series

Sanders, Hugh Alexander 08 1900 (has links)
No description available.
16

Two term class number formulae of Dirichlet type

Godin, Shawn, January 1900 (has links)
Thesis (M.Sc.) - Carleton University, 2002. / Includes bibliographical references (p. 226-228). Also available in electronic format on the Internet.
17

Über irreguläre Potenzreihen und Dirichletsche Reihen [On irregular power series and Dirichlet series]

Schnee, Walter, January 1908 (has links)
Thesis (doctoral)--Friedrich-Wilhelms-Universität zu Berlin, 1908. / Vita.
18

Conditions for the discreteness of the spectrum of singular elliptic operators

Hanerfeld, Harold. January 1963 (has links)
Thesis--University of California, Berkeley, 1963. / "UC-32 Mathematics and Computers" -t.p. "TID-4500 (19th Ed.)" -t.p. Includes bibliographical references (p. 45).
19

Regularity of differential forms minimizing degenerate elliptic functionals

Hamburger, Christoph. January 1989 (has links)
Thesis (doctoral)--Rheinische Friedrich-Wilhelms-Universität Bonn, 1989. / Includes bibliographical references.
20

Gluing spaces and analysis

Paulik, Gustav. January 2005 (has links)
Thesis (doctoral)--Rheinische Friedrich-Wilhelms-Universität Bonn, 2004. / Includes bibliographical references (p. 99-102).
