A formula for the average fade time of the intensity of a plane optical wave traveling through atmospheric turbulence is developed. The channel is modeled with isotropic, homogeneous statistics and a lognormal distribution for the irradiance. The analysis rests on the fact that the logarithm of the irradiance is normally distributed and draws on the work of S. O. Rice, who derived such an expression for a zero-mean Gaussian process. The covariance function, combined with the Taylor frozen-turbulence hypothesis, yields the required autocorrelation function.
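A minimal sketch of the kind of expression the abstract describes, assuming Rice's standard level-crossing result for a zero-mean Gaussian process and purely illustrative symbols (u for the log-irradiance fade threshold, sigma^2 and R(tau) for the variance and autocorrelation of the log-irradiance, v for the transverse wind speed in the Taylor hypothesis); the thesis' actual notation and normalization may differ:

% Rice's expected rate of crossings of level u for a zero-mean Gaussian
% process X(t) with variance \sigma^2 and autocorrelation R(\tau)
% (illustrative form, not necessarily the thesis' notation):
\[
  \nu(u) \;=\; \frac{1}{2\pi}\,
  \sqrt{\frac{-R''(0)}{R(0)}}\;
  \exp\!\left(-\frac{u^{2}}{2\sigma^{2}}\right)
\]
% With X = \ln I Gaussian (lognormal channel), a fade of the intensity below
% a threshold I_T corresponds to X < u = \ln I_T, and the mean fade duration
% is the fraction of time spent below the level divided by the crossing rate:
\[
  \langle t_{\mathrm{fade}} \rangle \;=\;
  \frac{\Pr\{X < u\}}{\nu(u)} \;=\;
  \frac{\tfrac{1}{2}\!\left[1 + \operatorname{erf}\!\left(\dfrac{u}{\sigma\sqrt{2}}\right)\right]}{\nu(u)}
\]
% Under the Taylor frozen-turbulence hypothesis, the temporal autocorrelation
% follows from the spatial covariance B_X of the log-irradiance carried past
% the receiver at wind speed v:  R(\tau) = B_X(v\,\tau).

The key quantities are therefore the variance and the curvature of the autocorrelation at the origin, both of which follow from the covariance function once the Taylor hypothesis converts spatial statistics into temporal ones.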
Identifier | oai:union.ndltd.org:ucf.edu/oai:stars.library.ucf.edu:rtd-1506
Date | 01 January 1980
Creators | O'Hara, John F. |
Publisher | STARS |
Source Sets | University of Central Florida |
Language | English |
Detected Language | English |
Type | text |
Format | application/pdf |
Source | Retrospective Theses and Dissertations |
Rights | Public Domain |