A formula for the average fade time of the intensity of a spherical optical wave travelling through atmospheric turbulence is developed. The model assumes isotropic, homogeneous turbulence statistics and a lognormal distribution for the channel. The analysis rests on the fact that the logarithm of the irradiance is normally distributed, and it uses the work of S. O. Rice, who derived such an expression for a zero-mean Gaussian process. The analysis employs the covariance function together with the Taylor frozen-turbulence hypothesis, which yields an expression for the autocorrelation function.
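The approach the abstract describes can be sketched numerically. In Rice's framework, a zero-mean Gaussian process X(t) (here, the log-irradiance) with variance σ² and autocorrelation R(τ) crosses downward through a level u at the rate ν = (1/2π)·√(−R''(0)/R(0))·exp(−u²/2σ²), and the mean fade duration is P(X < u)/ν. The sketch below is illustrative, not the thesis's derivation: the Gaussian-shaped covariance, the wind speed, and all parameter values are assumptions introduced here, with the Taylor hypothesis entering only as R(τ) = B(vτ).

```python
import math

def avg_fade_time(u, sigma, r2_at_zero):
    """Mean fade duration below level u for a zero-mean Gaussian process.

    sigma       -- standard deviation of the process (so R(0) = sigma**2)
    r2_at_zero  -- R''(0), the second derivative of the autocorrelation
                   at zero lag (negative for a differentiable process)

    Rice's downward-crossing rate:
        nu = (1 / 2*pi) * sqrt(-R''(0) / R(0)) * exp(-u**2 / (2*sigma**2))
    Mean fade duration = P(X < u) / nu.
    """
    nu = (1.0 / (2.0 * math.pi)) * (math.sqrt(-r2_at_zero) / sigma) \
         * math.exp(-u * u / (2.0 * sigma * sigma))
    p_below = 0.5 * (1.0 + math.erf(u / (sigma * math.sqrt(2.0))))
    return p_below / nu

# Illustrative covariance model (an assumption, not from the thesis):
# B(rho) = sigma**2 * exp(-rho**2 / rho0**2).  Under the Taylor
# frozen-turbulence hypothesis with transverse wind speed v,
# R(tau) = B(v * tau), so R''(0) = -2 * sigma**2 * v**2 / rho0**2.
sigma = 0.3          # log-irradiance std dev (assumed)
v = 5.0              # wind speed, m/s (assumed)
rho0 = 0.05          # covariance length scale, m (assumed)
r2 = -2.0 * sigma**2 * v**2 / rho0**2

fade = avg_fade_time(-2.0 * sigma, sigma, r2)  # fade 2 sigma below the mean
print(f"average fade time: {fade * 1e3:.3f} ms")
```

Note the sanity check built into the formula: lowering the threshold u makes fades rarer and shorter, since both P(X < u) and the time spent below u shrink faster than the crossing rate.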
Identifier | oai:union.ndltd.org:ucf.edu/oai:stars.library.ucf.edu:rtd-1499 |
Date | 01 April 1980 |
Creators | Locke, Lorraine M. |
Publisher | University of Central Florida |
Source Sets | University of Central Florida |
Language | English |
Detected Language | English |
Type | text |
Format | application/pdf |
Source | Retrospective Theses and Dissertations |
Rights | Public Domain |