1.

Gaussian Robust Sequential and Predictive Coding

Song, Lin
Video coding schemes based on sequential or predictive coding models are vulnerable to the loss of encoded frames at the decoder. Motivated by this observation, in this thesis we propose two new coding models: robust sequential coding and robust predictive coding. For the Gauss-Markov source with the mean squared error distortion measure, we characterize certain supporting hyperplanes of the rate regions of these two coding problems. The proof proceeds in three steps: 1) each supporting hyperplane of the rate region of Gaussian robust sequential coding is shown to admit a max-min lower bound; 2) the corresponding min-max upper bound is shown to be achievable by a robust predictive coding scheme; 3) a saddle-point analysis shows that the max-min lower bound coincides with the min-max upper bound. Furthermore, it is shown that the proposed robust predictive coding scheme can be implemented using a successive quantization system. Theoretical and experimental results indicate that this scheme has a desirable "self-recovery" property. Our investigation also reveals an information-theoretic minimax theorem and the associated extremal inequalities.

Doctor of Philosophy (PhD)
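As an illustration of the "self-recovery" property mentioned in the abstract: in a DPCM-style predictive coder for a first-order Gauss-Markov source with |rho| < 1, any encoder-decoder state mismatch caused by a lost frame decays geometrically, since both sides apply the same prediction coefficient thereafter. The sketch below is a minimal toy model; the source parameters, the scalar quantizer standing in for the successive quantization system, and the loss pattern are illustrative assumptions, not details taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

rho, sigma, T = 0.95, 1.0, 200           # AR(1) coefficient, innovation std, number of frames
x = np.zeros(T)                           # Gauss-Markov source: x[t] = rho*x[t-1] + z[t]
for t in range(1, T):
    x[t] = rho * x[t - 1] + sigma * rng.standard_normal()

def quantize(v, step=0.1):
    """Scalar uniform quantizer (toy stand-in for the successive quantization system)."""
    return step * np.round(v / step)

# Encoder and decoder run mirrored predictive loops; the decoder loses frame 100.
enc_state, dec_state = 0.0, 0.0
err = np.zeros(T)                         # decoder reconstruction error per frame
for t in range(T):
    residual = quantize(x[t] - rho * enc_state)   # encode the prediction residual
    enc_state = rho * enc_state + residual        # encoder tracks its own reconstruction
    if t == 100:
        residual = 0.0                            # frame lost: decoder receives nothing
    dec_state = rho * dec_state + residual        # decoder reconstruction
    err[t] = abs(x[t] - dec_state)

# The extra error introduced at t = 100 decays like rho**(t - 100): self-recovery.
print(err[99], err[100], err[110], err[130])
```

Because |rho| < 1, the state mismatch shrinks by a factor of rho each frame after the loss, so the decoder error returns to the quantization-noise floor without retransmission.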
2.

Aspects of Interface between Information Theory and Signal Processing with Applications to Wireless Communications

Park, Sang Woo, 14 March 2013
This dissertation studies several aspects of the interface between information theory and signal processing. Several new and existing results in information theory are examined from the perspective of signal processing, and, conversely, some fundamental results in signal processing and statistics are studied from the information-theoretic viewpoint.

The first part of this dissertation focuses on the equivalence between Stein's identity and De Bruijn's identity, and provides two extensions of De Bruijn's identity. First, it is shown that Stein's identity is equivalent to De Bruijn's identity in additive noise channels under specific conditions. Second, for arbitrary but fixed input and noise distributions in an additive noise channel model, the first derivative of the differential entropy is expressed as a function of the posterior mean, and the second derivative is expressed in terms of a function of the Fisher information. Several applications across a number of fields, such as statistical estimation theory, signal processing, and information theory, are presented to support the usefulness of the results developed in Section 2.

The second part of this dissertation makes three contributions. First, a connection is drawn between a result proposed by Stoica and Babu and two recent information-theoretic results: the worst additive noise lemma and the isoperimetric inequality for entropies. Second, information-theoretic and estimation-theoretic justifications are presented for the fact that the Gaussian assumption leads to the largest Cramer-Rao lower bound (CRLB). Third, this result is slightly extended to the more general framework of correlated observations.

The third part of this dissertation derives an alternative proof of an extremal entropy inequality (EEI) originally proposed by Liu and Viswanath. Compared with the proofs presented by Liu and Viswanath, the proposed alternative proof is simpler, more direct, and more information-theoretic. An additional application of the extremal inequality is also provided. Moreover, this part illustrates not only the usefulness of the EEI but also a novel way to approach applications such as the capacity of the vector Gaussian broadcast channel, a lower bound on the achievable rate for distributed source coding with a single quadratic distortion constraint, and the secrecy capacity of the Gaussian wire-tap channel.

Finally, a novel, unifying variational approach for proving fundamental information-theoretic inequalities is proposed. Fundamental results such as the maximization of differential entropy, the minimization of Fisher information (the Cramer-Rao inequality), the worst additive noise lemma, the entropy power inequality (EPI), and the EEI are interpreted as functional problems and proved within the framework of the calculus of variations. Several extensions and applications of the proposed results are briefly mentioned.
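For reference, here is a sketch of the two identities the first part connects, in their standard textbook forms; the exact statements and regularity conditions used in the dissertation may differ. De Bruijn's identity: for $Y_t = X + \sqrt{t}\,Z$ with $Z \sim \mathcal{N}(0,1)$ independent of $X$,

\[
\frac{d}{dt}\, h(Y_t) \;=\; \frac{1}{2}\, J(Y_t),
\qquad
J(Y) \;=\; \mathbb{E}\!\left[\left(\frac{\partial}{\partial y}\, \log f_Y(Y)\right)^{\!2}\right],
\]

where $h$ denotes differential entropy and $J$ denotes Fisher information. Stein's identity (Gaussian case): for $Z \sim \mathcal{N}(\mu, \sigma^2)$ and any differentiable $g$ with $\mathbb{E}\,|g'(Z)| < \infty$,

\[
\mathbb{E}\!\left[(Z - \mu)\, g(Z)\right] \;=\; \sigma^2\, \mathbb{E}\!\left[g'(Z)\right].
\]

Both identities trade a derivative inside an expectation for a Gaussian smoothing term, which suggests the structural connection the dissertation exploits to establish their equivalence in additive noise channels.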
