Statistical Fading of a Spherical Optical Wave in Atmospheric Turbulence

A formula for the average fade time of the intensity of a spherical optical wave traveling through atmospheric turbulence is developed. The channel is modeled with isotropic, homogeneous turbulence statistics and a lognormal distribution for the irradiance. The analysis rests on the fact that the logarithm of the irradiance is normally distributed, and it applies the work of S. O. Rice, who derived such an expression for a zero-mean Gaussian process. The covariance function, combined with the Taylor frozen-turbulence hypothesis, yields the required autocorrelation function.
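
The thesis text is not reproduced in this record, but the quantities named above have a standard form. For a stationary zero-mean Gaussian process (here the centered log-irradiance) with autocorrelation function R(\tau), Rice's result gives the expected rate of crossings of a level u, and the average fade time below that level follows by dividing the probability of being below the level by the crossing rate. The sketch below uses illustrative notation (\nu, u, R, \Phi), not necessarily the thesis's own:

    \nu(u) = \frac{1}{2\pi} \sqrt{\frac{-R''(0)}{R(0)}} \, \exp\!\left(-\frac{u^2}{2 R(0)}\right),
    \qquad
    \langle t \rangle = \frac{\Pr\{\chi < u\}}{\nu(u)} = \frac{\Phi\!\big(u / \sqrt{R(0)}\big)}{\nu(u)} .

Under the Taylor frozen-turbulence hypothesis, the spatial covariance of the log-irradiance is carried past the receiver by the transverse wind, so the temporal autocorrelation is obtained as R(\tau) = B_\chi(v\tau), with v the transverse wind speed; this supplies the R(0) and R''(0) needed in the crossing-rate expression.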

Identifier: oai:union.ndltd.org:ucf.edu/oai:stars.library.ucf.edu:rtd-1499
Date: 01 April 1980
Creators: Locke, Lorraine M.
Publisher: University of Central Florida
Source Sets: University of Central Florida
Language: English
Detected Language: English
Type: text
Format: application/pdf
Source: Retrospective Theses and Dissertations
Rights: Public Domain
