On Rényi Divergence Measures for Continuous Alphabet Sources
Gil, Manuel. 30 August 2011
The idea of 'probabilistic distances' (also called divergences), which in some sense assess how 'close' two probability distributions are to one another, has been widely employed in probability, statistics, information theory, and related fields. Of particular importance, due to their generality and applicability, are the Rényi divergence measures. While the closely related concept of the Rényi entropy of a probability distribution has been studied extensively, and closed-form expressions for the most common univariate and multivariate continuous distributions have been derived and compiled, the literature currently lacks the corresponding compilation for continuous Rényi divergences. The present thesis addresses this issue for analytically tractable cases. Closed-form expressions for Kullback-Leibler divergences are also derived and compiled, as the Kullback-Leibler divergence is the extension by continuity of the Rényi divergence as its order approaches one. Additionally, we establish a connection between the Rényi divergence and the variance of the log-likelihood ratio of two distributions; this extends the work of Song (2001) on the relation between the Rényi entropy and the log-likelihood function, and becomes practically useful in light of the Rényi divergence expressions we have derived. Lastly, we consider the Rényi divergence rate between two zero-mean stationary Gaussian processes.

Thesis (Master, Mathematics & Statistics) -- Queen's University, 2011-08-30
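As an illustration of the kind of closed-form expression such a compilation contains, the sketch below (in Python, with illustrative function names not taken from the thesis) evaluates the standard closed form of the Rényi divergence of order α between two univariate Gaussian densities and checks it against direct numerical integration of the defining integral. The formula used is the well-known one for N(μ₁, σ₁²) versus N(μ₂, σ₂²), valid whenever the mixed variance (1−α)σ₁² + ασ₂² is positive.

```python
import math

def renyi_gauss(mu1, s1, mu2, s2, a):
    """Closed-form Renyi divergence D_a(N(mu1, s1^2) || N(mu2, s2^2)).

    Standard expression for 0 < a, a != 1; finite only when the mixed
    variance s_a^2 = (1 - a) * s1^2 + a * s2^2 is positive.
    """
    sa2 = (1 - a) * s1**2 + a * s2**2
    if sa2 <= 0:
        return math.inf  # the divergence is infinite outside this range
    return (a * (mu1 - mu2)**2 / (2 * sa2)
            + math.log(sa2 / (s1**(2 * (1 - a)) * s2**(2 * a))) / (2 * (1 - a)))

def renyi_numeric(mu1, s1, mu2, s2, a, lo=-50.0, hi=50.0, n=200000):
    """Check: D_a = (1/(a-1)) * log( integral of p^a * q^(1-a) ),
    with the integral approximated by a midpoint Riemann sum."""
    def pdf(x, mu, s):
        return math.exp(-(x - mu)**2 / (2 * s**2)) / (s * math.sqrt(2 * math.pi))
    h = (hi - lo) / n
    total = sum(pdf(lo + (i + 0.5) * h, mu1, s1)**a
                * pdf(lo + (i + 0.5) * h, mu2, s2)**(1 - a)
                for i in range(n))
    return math.log(total * h) / (a - 1)
```

For example, `renyi_gauss(0.0, 1.0, 1.0, 2.0, 0.5)` and `renyi_numeric(0.0, 1.0, 1.0, 2.0, 0.5)` agree to several decimal places, and letting α approach 1 recovers the familiar Gaussian Kullback-Leibler divergence, consistent with the extension-by-continuity remark above.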