
On the discontinuity of the Shannon information measures and typical sequences. / CUHK electronic theses & dissertations collection

The Shannon information measures are well known to be continuous functions of the probability distribution for finite alphabets. In this thesis, however, we show that these measures are discontinuous with respect to almost all commonly used "distance" measures when the alphabet is countably infinite. Such "distance" measures include the Kullback-Leibler divergence and the variational distance. Specifically, we show that all the Shannon information measures are in fact discontinuous at all probability distributions.

In information theory, weak typicality and strong typicality are essential tools for proving coding theorems. Strong typicality, which is more powerful than weak typicality, applies to finite alphabets only, while weak typicality applies to both finite and countably infinite alphabets. We introduce a unified typicality for finite or countably infinite alphabets which is stronger than both weak typicality and strong typicality. With this unified typicality, the asymptotic equipartition property and the structural properties of strong typicality are preserved.

For two probability distributions with finite alphabets, bounds on the difference of their entropies as a function of their alphabet sizes and variational distance are obtained. These bounds, which are tighter than some existing results, show that entropy estimation by finite alphabet approximation may not work as expected. On the other hand, we show that there always exists a finite alphabet approximation that works for entropy estimation, provided that the alphabet used is sufficiently large. Necessary and sufficient conditions under which the entropy of a sequence of probability distributions converges are given in terms of four new information divergence measures, the square roots of two of which are metrics.

As entropy is also an important quantity in physics, we relate our results to physical processes by demonstrating that the discontinuity of entropy can occur in a Markov chain. We also discuss the possible implications of this phenomenon in thermodynamics and cosmology.

Ho Siu Wai.
"August 2006."
Adviser: Wai Ho Raymond Yeung.
Source: Dissertation Abstracts International, Volume: 68-03, Section: B, page: 1824.
Thesis (Ph.D.)--Chinese University of Hong Kong, 2006.
Includes bibliographical references (p. 121-123).
Electronic reproduction. Hong Kong : Chinese University of Hong Kong, [2012] System requirements: Adobe Acrobat Reader. Available via World Wide Web.
Electronic reproduction. [Ann Arbor, MI] : ProQuest Information and Learning, [200-] System requirements: Adobe Acrobat Reader. Available via World Wide Web.
Abstracts in English and Chinese.
School code: 1307.
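As a concrete illustration of the abstract's central claim, the following is a minimal sketch (not drawn from the thesis itself; the construction, names, and numbers are illustrative assumptions) of the standard way entropy fails to be continuous under variational distance on a countably infinite alphabet: perturb a point mass by moving a vanishing amount of probability onto a rapidly growing support.

```python
import math

# Hypothetical example, not the thesis's construction. Let P be the
# point mass at symbol 0, so H(P) = 0. Define Q_n by Q_n(0) = 1 - 1/n,
# with the remaining mass 1/n spread uniformly over M_n = 2**(n*n)
# fresh symbols.

def variational_distance(n):
    # d(P, Q_n) = sum_x |P(x) - Q_n(x)| = (1 - (1 - 1/n)) + 1/n = 2/n.
    return 2.0 / n

def entropy_qn_bits(n):
    # H(Q_n) = -(1 - 1/n) log2(1 - 1/n) + (1/n) log2(n * M_n), evaluated
    # in closed form so the 2**(n*n) support points are never enumerated.
    p0 = 1.0 - 1.0 / n
    log2_n_times_mn = math.log2(n) + n * n  # log2(n * 2**(n*n))
    return -p0 * math.log2(p0) + log2_n_times_mn / n

for n in (2, 10, 100, 1000):
    print(f"n={n:5d}  d(P,Q_n)={variational_distance(n):.4f}  "
          f"H(Q_n)={entropy_qn_bits(n):.2f} bits")

# d(P, Q_n) -> 0 while H(Q_n) -> infinity, even though H(P) = 0:
# entropy is discontinuous at P under variational distance, which
# cannot happen when the alphabet is finite.
```

The distance here uses the convention d(P, Q) = sum |P(x) - Q(x)|; under the half-sum (total variation) convention it equals 1/n and the conclusion is unchanged.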

Identifier oai:union.ndltd.org:cuhk.edu.hk/oai:cuhk-dr:cuhk_343919
Date January 2006
Contributors Ho, Siu Wai.; Chinese University of Hong Kong Graduate School. Division of Information Engineering.
Source Sets The Chinese University of Hong Kong
Language English, Chinese
Detected Language English
Type Text, theses
Format electronic resource, microform, microfiche, 1 online resource (x, 123 p. : ill.)
Rights Use of this resource is governed by the terms and conditions of the Creative Commons "Attribution-NonCommercial-NoDerivatives 4.0 International" License (http://creativecommons.org/licenses/by-nc-nd/4.0/)
