1

Finding the Maximizers of the Information Divergence from an Exponential Family / Das Auffinden der Maximierer der Informationsdivergenz von einer Exponentialfamilie

Rauh, Johannes 19 October 2011 (has links) (PDF)
The subject of this thesis is the maximization of the information divergence from an exponential family on a finite set, a problem first formulated by Nihat Ay. A special case is the maximization of the mutual information or the multiinformation between different parts of a composite system. My thesis contributes mainly to the mathematical aspects of the optimization problem. A reformulation is found that relates the maximization of the information divergence to the maximization of an entropic quantity defined on the normal space of the exponential family. This reformulation simplifies calculations in concrete cases and gives theoretical insight into the general problem. A second emphasis of the thesis is on examples that demonstrate how the theoretical results can be applied in particular cases. Third, my thesis contains first results on the characterization of exponential families with a small maximum value of the information divergence.
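For orientation, the central quantity of this record can be written down in standard notation (a sketch assuming the usual definitions; P ranges over probability distributions on the finite set X, and the closure of the exponential family E is taken so that the minimum is attained):

```latex
D(P \,\|\, \mathcal{E}) \;=\; \min_{Q \in \overline{\mathcal{E}}} D(P \,\|\, Q),
\qquad
D(P \,\|\, Q) \;=\; \sum_{x \in \mathcal{X}} P(x) \log \frac{P(x)}{Q(x)}.
```

When E is the independence model of a composite system, D(P||E) is the multiinformation of P (the mutual information in the two-part case), which is the special case mentioned in the abstract.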
2

Random Multigraphs : Complexity Measures, Probability Models and Statistical Inference

Shafie, Termeh January 2012 (has links)
This thesis is concerned with multigraphs and their complexity, which is defined and quantified by the distribution of edge multiplicities. Two random multigraph models are considered. The first model is random stub matching (RSM), where the edges are formed by randomly coupling pairs of stubs according to a fixed stub multiplicity sequence. The second model is obtained by independent edge assignments (IEA) according to a common probability distribution over the edge sites. Two different methods for obtaining an approximate IEA model from an RSM model are also presented. In Paper I, multigraphs are analyzed with respect to structure and complexity by using entropy and joint information. The main results include formulae for the numbers of graphs of different kinds and their complexity. The local and global structure of multigraphs under RSM is analyzed in Paper II. The distribution of multigraphs under RSM is shown to depend on a single complexity statistic. The distributions under RSM and IEA are used for calculations of moments and entropies, and for comparisons by information divergence. The main results include new formulae for local edge probabilities and a probability approximation for the simplicity of an RSM multigraph. In Paper III, statistical tests of a simple or composite IEA hypothesis are performed using goodness-of-fit measures. The results indicate that even for a very small number of edges, the null distributions of the test statistics under IEA are well approximated by their asymptotic χ²-distributions. Paper IV contains the multigraph algorithms used for numerical calculations in Papers I–III.
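To make the RSM mechanism concrete, here is a minimal sketch (not code from the papers; the Python implementation, the function names, and the particular entropy summary are assumptions): stubs are paired uniformly at random according to a fixed stub multiplicity sequence, and the resulting edge multiplicities over the edge sites are summarized by an entropy.

```python
import random
from collections import Counter
from math import log

def random_stub_matching(stub_counts, rng=random):
    """Form a multigraph by randomly pairing stubs.

    stub_counts[i] is the number of stubs at vertex i; the total must be even.
    Returns a Counter mapping unordered vertex pairs (edge sites) to multiplicities.
    """
    stubs = [v for v, k in enumerate(stub_counts) for _ in range(k)]
    assert len(stubs) % 2 == 0, "total number of stubs must be even"
    rng.shuffle(stubs)
    edges = Counter()
    for a, b in zip(stubs[::2], stubs[1::2]):
        edges[tuple(sorted((a, b)))] += 1  # loops allowed: a == b
    return edges

def multiplicity_entropy(edges):
    """Entropy (in nats) of the edge-multiplicity distribution over occupied edge sites."""
    m = sum(edges.values())
    return -sum((k / m) * log(k / m) for k in edges.values())

if __name__ == "__main__":
    edges = random_stub_matching([3, 3, 2, 2])   # 10 stubs -> 5 edges
    print(dict(edges))
    print("entropy of edge multiplicities:", multiplicity_entropy(edges))
```

Note that stub matching can produce loops and multiple edges, which is exactly the multigraph structure whose multiplicity distribution the complexity measures describe.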
3

Finding the Maximizers of the Information Divergence from an Exponential Family

Rauh, Johannes 09 January 2011 (has links)
The subject of this thesis is the maximization of the information divergence from an exponential family on a finite set, a problem first formulated by Nihat Ay. A special case is the maximization of the mutual information or the multiinformation between different parts of a composite system. My thesis contributes mainly to the mathematical aspects of the optimization problem. A reformulation is found that relates the maximization of the information divergence to the maximization of an entropic quantity defined on the normal space of the exponential family. This reformulation simplifies calculations in concrete cases and gives theoretical insight into the general problem. A second emphasis of the thesis is on examples that demonstrate how the theoretical results can be applied in particular cases. Third, my thesis contains first results on the characterization of exponential families with a small maximum value of the information divergence.

Contents:
1. Introduction
2. Exponential families
   2.1. Exponential families, the convex support and the moment map
   2.2. The closure of an exponential family
   2.3. Algebraic exponential families
   2.4. Hierarchical models
3. Maximizing the information divergence from an exponential family
   3.1. The directional derivatives of D(*|E)
   3.2. Projection points and kernel distributions
   3.3. The function DE
   3.4. The first order optimality conditions of DE
   3.5. The relation between D(*|E) and DE
   3.6. Computing the critical points
   3.7. Computing the projection points
4. Examples
   4.1. Low-dimensional exponential families
        4.1.1. Zero-dimensional exponential families
        4.1.2. One-dimensional exponential families
        4.1.3. One-dimensional exponential families on four states
        4.1.4. Other low-dimensional exponential families
   4.2. Partition models
   4.3. Exponential families with max D(*|E) = log(2)
   4.4. Binary i.i.d. models and binomial models
5. Applications and Outlook
   5.1. Principles of learning, complexity measures and constraints
   5.2. Optimally approximating exponential families
   5.3. Asymptotic behaviour of the empirical information divergence
A. Polytopes and oriented matroids
   A.1. Polytopes
   A.2. Oriented matroids
Bibliography
Index
Glossary of notations
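As a numerical illustration of the optimization problem in the simplest non-trivial case (the independence model of two binary variables, for which the divergence from the family equals the mutual information and the maximum value is the log(2) appearing in the title of Section 4.3), here is a small sketch; the use of Python/NumPy, the function name divergence_from_independence, and the random-search strategy are illustrative assumptions, not the thesis's method.

```python
import numpy as np

def divergence_from_independence(P):
    """D(P || E) for the independence model E of two binary variables.

    The information projection of P onto the independence model is the product
    of its marginals, so the divergence equals the mutual information of P.
    """
    P = np.asarray(P, dtype=float)
    Q = np.outer(P.sum(axis=1), P.sum(axis=0))   # product of marginals
    mask = P > 0
    return float(np.sum(P[mask] * np.log(P[mask] / Q[mask])))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    best = 0.0
    for _ in range(20000):                       # crude random search over the simplex
        P = rng.dirichlet(np.ones(4)).reshape(2, 2)
        best = max(best, divergence_from_independence(P))
    print("best divergence found by random search:", best)   # stays below log(2) ~ 0.6931
    P_star = np.array([[0.5, 0.0], [0.0, 0.5]])  # perfectly correlated bits
    print("divergence at the known maximizer:", divergence_from_independence(P_star))
```

The perfectly correlated distribution with mass 1/2 on (0,0) and (1,1) attains log(2) exactly, while the random search only approaches this value from below.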
4

Experiments with Support Vector Machines and Kernels

Kohram, Mojtaba 21 October 2013 (has links)
No description available.
