Canonical Correlation and the Calculation of Information Measures for Infinite-Dimensional Distributions
Huffmann, Jonathan, 26 March 2021
This thesis investigates the extension of the well-known canonical correlation analysis to random elements on abstract real measurable Hilbert spaces. One focus is the application of this extension to the calculation of information-theoretic quantities on finite time intervals. Analytical approaches are derived for the calculation of the mutual information and the information density between Gaussian distributed random elements on arbitrary real measurable Hilbert spaces.
With respect to mutual information, the results obtained are comparable to [4] and [1] (Baker, 1970, 1978). They can also be seen as a generalization of earlier findings in [20] (Gelfand and Yaglom, 1958). In addition, some of the derived equations for calculating the information density, its characteristic function and its n-th central moments extend results from [45] and [44] (Pinsker, 1963, 1964).
Furthermore, explicit examples are elaborated for the calculation of the mutual information, the characteristic function of the information density, and the n-th central moments of the information density for the important special case of an additive Gaussian channel with a Gaussian distributed input signal with rational spectral density, both for white Gaussian noise and for Gaussian noise with rational spectral density. These results extend the corresponding concrete examples for the calculation of the mutual information from [20] (Gelfand and Yaglom, 1958) as well as [28] and [29] (Huang and Johnson, 1963, 1962).

Kurzfassung
Abstract
Notations
Abbreviations
1 Introduction
1.1 Software Used
2 Mathematical Background
2.1 Basic Notions of Measure and Probability Theory
2.1.1 Characteristic Functions
2.2 Stochastic Processes
2.2.1 The Consistency Theorem of Daniell and Kolmogorov
2.2.2 Second Order Random Processes
2.3 Some Properties of Fourier Transforms
2.4 Some Basic Inequalities
2.5 Some Fundamentals in Functional Analysis
2.5.1 Hilbert Spaces
2.5.2 Linear Operators on Hilbert Spaces
2.5.3 The Fréchet-Riesz Representation Theorem
2.5.4 Adjoint and Compact Operators
2.5.5 The Spectral Theorem for Compact Operators
3 Mutual Information and Information Density
3.1 Mutual Information
3.2 Information Density
4 Probability Measures on Hilbert Spaces
4.1 Measurable Hilbert Spaces
4.2 The Characteristic Functional
4.3 Mean Value and Covariance Operator
4.4 Gaussian Probability Measures on Hilbert Spaces
4.5 The Product of Two Measurable Hilbert Spaces
4.5.1 The Product Measure
4.5.2 Cross-Covariance Operator
5 Canonical Correlation Analysis on Hilbert Spaces
5.1 The Hellinger Distance and the Theorem of Kakutani
5.2 Canonical Correlation Analysis on Hilbert Spaces
5.3 The Theorem of Hájek and Feldman
6 Mutual Information and Information Density Between Gaussian Measures
6.1 A General Formula for Mutual Information and Information Density for Gaussian Random Elements
6.2 Hadamard’s Factorization Theorem
6.3 Closed Form Expressions for Mutual Information and Related Quantities
6.4 The Discrete-Time Case
6.5 The Continuous-Time Case
6.6 Approximation Error
7 Additive Gaussian Channels
7.1 Abstract Channel Model and General Definitions
7.2 Explicit Expressions for Mutual Information and Related Quantities
7.2.1 Gaussian Random Elements as Input to an Additive Gaussian Channel
8 Continuous-Time Gaussian Channels
8.1 White Gaussian Channels
8.1.1 Two Simple Examples
8.1.2 Gaussian Input with Rational Spectral Density
8.1.3 A Method of Youla, Kadota and Slepian
8.2 Noise and Input Signal with Rational Spectral Density
8.2.1 Again a Method by Slepian and Kadota
Bibliography
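The thesis develops these results for random elements on abstract, possibly infinite-dimensional Hilbert spaces; the underlying mechanism can be illustrated in finite dimensions. The classical Gelfand-Yaglom-type formula expresses the mutual information between jointly Gaussian random vectors through their canonical correlations ρ_k as I(X;Y) = -1/2 Σ_k ln(1 - ρ_k²). The sketch below is not taken from the thesis; it is a minimal finite-dimensional illustration, assuming strictly positive definite covariance blocks, and all function names are chosen here for the example.

```python
import numpy as np

def inv_sqrt(S):
    """Inverse matrix square root of a symmetric positive definite matrix."""
    w, V = np.linalg.eigh(S)          # assumes all eigenvalues are > 0
    return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

def canonical_correlations(Sxx, Syy, Sxy):
    """Canonical correlations: singular values of Sxx^{-1/2} Sxy Syy^{-1/2}."""
    M = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return np.linalg.svd(M, compute_uv=False)

def mutual_information(Sxx, Syy, Sxy):
    """Mutual information (in nats) between jointly Gaussian vectors,
    via the canonical-correlation formula I = -1/2 * sum(log(1 - rho_k^2))."""
    rho = canonical_correlations(Sxx, Syy, Sxy)
    return -0.5 * np.sum(np.log1p(-(rho ** 2)))

# Scalar sanity check: correlation 0.5 gives I = -0.5 * ln(1 - 0.25).
I = mutual_information(np.array([[1.0]]), np.array([[1.0]]), np.array([[0.5]]))
```

In finite dimensions this agrees with the log-determinant expression 0.5 log(det Sxx · det Syy / det S) for the joint covariance S. In the infinite-dimensional setting of the thesis, the finite sum over canonical correlations becomes a series, and the questions of its convergence and of equivalence versus singularity of the measures (the Hájek-Feldman dichotomy treated in Chapter 5) are exactly what the abstract framework has to control.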
The sounds of safety: stress and danger in music perception
Schäfer, Thomas; Huron, David; Shanahan, Daniel; Sedlmeier, Peter, 27 August 2015
As with any sensory input, music might be expected to incorporate the processing of information about the safety of the environment. Little research has been done on how such processing has evolved and how different kinds of sounds may affect the experience of certain environments. In this article, we investigate whether music, as a form of auditory information, can trigger the experience of safety. We hypothesized that (1) there should be an optimal, subjectively preferred degree of information density of musical sounds, at which safety-related information can be processed optimally; (2) any deviation from this optimum, that is, both higher and lower levels of information density, should elicit experiences of higher stress and danger; and (3) in general, sonic scenarios with music should reduce experiences of stress and danger more than other scenarios. In Experiment 1, the information density of short music-like rhythmic stimuli was manipulated via their tempo. In an initial session, listeners adjusted the tempo of the stimuli to what they deemed appropriate. In an ensuing session, the same listeners judged their experienced stress and danger in response to the same stimuli as well as to tempo-modified variants of them. The results are consistent with the existence of an optimal information density for a given rhythm; the preferred tempo decreased as rhythms became more complex. The hypothesis that any deviation from the optimum would lead to experiences of higher stress and danger was only partly supported by the data. In Experiment 2, listeners were asked to indicate their experience of stress and danger in response to different sonic scenarios: music, natural sounds, and silence. As expected, the music scenarios were associated with the lowest stress and danger, whereas both natural sounds and silence resulted in higher stress and danger. Overall, the results largely support the hypothesis that music carries safety-related information about the environment.