
Statistical Fading of a Plane Optical Wave in Atmospheric Turbulence

A formula is developed for the average fade time of the intensity of a plane optical wave propagating through atmospheric turbulence. The model assumes isotropic, homogeneous turbulence statistics and a lognormal distribution for the channel. The analysis rests on the fact that the logarithm of the irradiance is normally distributed, and it draws on the work of S. O. Rice, who derived such an expression for a zero-mean Gaussian process. The covariance function and the Taylor frozen-turbulence hypothesis are used to obtain an expression for the autocorrelation function.
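For context, the chain of reasoning the abstract describes can be summarized with the standard form of Rice's level-crossing result. The following is a minimal sketch under the usual assumptions (a stationary, zero-mean Gaussian log-irradiance process $X(t)$ with covariance $B(\tau)$, fade threshold $u$, transverse wind speed $v$, and standard normal CDF $\Phi$); these symbols are illustrative choices, and the thesis's exact expressions may differ.

```latex
% Rice's result for a stationary, zero-mean Gaussian process X(t)
% with covariance B(tau): expected rate of downward crossings of
% a threshold u (the rate at which fades begin).
\nu(u) = \frac{1}{2\pi}
         \sqrt{\frac{-B''(0)}{B(0)}}\,
         \exp\!\left(-\frac{u^{2}}{2\,B(0)}\right)

% Average fade duration below u: the fraction of time spent below
% the threshold divided by the rate of entering fades.
\langle t_{u} \rangle
  = \frac{\Pr\{X(t) < u\}}{\nu(u)}
  = \frac{\Phi\!\left(u / \sqrt{B(0)}\right)}{\nu(u)}

% Taylor's frozen-turbulence hypothesis converts the spatial
% covariance B_s of log-irradiance into a temporal one via the
% transverse wind speed v.
B(\tau) = B_{s}(v\tau)
```

The key design point in this approach is that lognormal fading of the irradiance reduces, after taking the logarithm, to a Gaussian process, so Rice's Gaussian-process machinery applies directly.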

Identifier: oai:union.ndltd.org:ucf.edu/oai:stars.library.ucf.edu:rtd-1506
Date: 01 January 1980
Creators: O'Hara, John F.
Publisher: STARS
Source Sets: University of Central Florida
Language: English
Detected Language: English
Type: text
Format: application/pdf
Source: Retrospective Theses and Dissertations
Rights: Public Domain
