This dissertation studies several aspects of the interface between information theory and signal processing. New and existing results in information theory are examined from the perspective of signal processing, and, conversely, fundamental results in signal processing and statistics are studied from an information-theoretic viewpoint.
The first part of this dissertation focuses on establishing the equivalence between Stein's identity and De Bruijn's identity, and on providing two extensions of De Bruijn's identity. First, it is shown that Stein's identity is equivalent to De Bruijn's identity in additive noise channels under specific conditions. Second, for an additive noise channel model with arbitrary but fixed input and noise distributions, the first derivative of the differential entropy is expressed as a function of the posterior mean, and the second derivative of the differential entropy is expressed in terms of a function of the Fisher information. Several applications in fields such as statistical estimation theory, signal processing, and information theory are presented to support the usefulness of the results developed in Section 2.
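For reference, a minimal statement of the two identities involved, in their standard scalar forms (the exact regularity conditions assumed in the dissertation may differ): De Bruijn's identity ties the differential entropy of a Gaussian-perturbed variable to its Fisher information, while Stein's identity is an integration-by-parts characterization of the Gaussian law,
\[
\frac{\partial}{\partial t}\, h\!\left(X + \sqrt{t}\,Z\right) = \frac{1}{2}\, J\!\left(X + \sqrt{t}\,Z\right), \qquad Z \sim \mathcal{N}(0,1) \ \text{independent of } X,
\]
\[
\mathbb{E}\big[(W - \mu)\, g(W)\big] = \sigma^{2}\, \mathbb{E}\big[g'(W)\big], \qquad W \sim \mathcal{N}(\mu, \sigma^{2}),
\]
where \(h\) denotes differential entropy, \(J\) Fisher information, and \(g\) any sufficiently smooth function for which both expectations exist.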
The second part of this dissertation focuses on three contributions. First, a connection is drawn between a result proposed by Stoica and Babu and two recent information-theoretic results: the worst additive noise lemma and the isoperimetric inequality for entropies. Second, information-theoretic and estimation-theoretic justifications are presented for the fact that the Gaussian assumption leads to the largest Cramér-Rao lower bound (CRLB). Third, a slight extension of this result to the more general framework of correlated observations is shown.
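As a minimal illustration of the quantity at stake, in the standard scalar form (the dissertation also treats the vector-valued, correlated setting): for an unbiased estimator \(\hat{\theta}\) of a parameter \(\theta\) based on an observation \(Y\) with density \(f(y;\theta)\),
\[
\operatorname{Var}\big(\hat{\theta}\big) \;\ge\; \frac{1}{J(\theta)}, \qquad J(\theta) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial \theta} \ln f(Y;\theta)\right)^{2}\right].
\]
Among noise distributions of fixed variance, the Gaussian minimizes the Fisher information and therefore maximizes the right-hand side; this is the sense in which the Gaussian assumption yields the largest CRLB.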
The third part of this dissertation concentrates on deriving an alternative proof for an extremal entropy inequality (EEI) originally proposed by Liu and Viswanath. Compared with the proofs presented by Liu and Viswanath, the proposed alternative proof is simpler, more direct, and more information-theoretic. An additional application of the extremal inequality is also provided. Moreover, this section illustrates not only the usefulness of the EEI but also a novel method of approaching applications such as the capacity of the vector Gaussian broadcast channel, the lower bound on the achievable rate for distributed source coding with a single quadratic distortion constraint, and the secrecy capacity of the Gaussian wire-tap channel.
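For context, the extremal inequality in question can be stated in roughly the following form (the precise positive-definiteness and regularity conditions are as given by Liu and Viswanath): for \(\mu \ge 1\), a positive semidefinite matrix \(S\), and a Gaussian vector \(W\),
\[
\max_{p(x)\,:\,\operatorname{Cov}(X) \preceq S} \; \big[\, \mu\, h(X) - h(X + W) \,\big]
\]
is attained by a Gaussian \(X\). It is this Gaussian extremality that drives the converse arguments in the broadcast, source coding, and wire-tap applications listed above.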
Finally, a novel and unifying variational approach for proving fundamental information-theoretic inequalities is proposed. Fundamental results such as the maximization of differential entropy, the minimization of Fisher information (the Cramér-Rao inequality), the worst additive noise lemma, the entropy power inequality (EPI), and the EEI are interpreted as functional problems and proved within the framework of the calculus of variations. Several extensions and applications of the proposed results are briefly mentioned.
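As one concrete instance of the inequalities treated in this framework, the entropy power inequality states that for independent random vectors \(X\) and \(Y\) in \(\mathbb{R}^{n}\),
\[
N(X + Y) \;\ge\; N(X) + N(Y), \qquad N(X) = \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)},
\]
with equality when \(X\) and \(Y\) are Gaussian with proportional covariance matrices. In the variational approach, such statements are recast as functional optimization problems whose stationarity conditions single out the Gaussian extremals.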
Identifier: oai:union.ndltd.org:tamu.edu/oai:repository.tamu.edu:1969.1/148120
Date: 14 March 2013
Creators: Park, Sang Woo
Contributors: Serpedin, Erchin; Qaraqe, Khalid
Source Sets: Texas A&M University
Detected Language: English
Type: Thesis, text
Format: application/pdf