As the size of an adaptive antenna array grows, the system can resist interference signals of increasing bandwidth. This results from the higher transmit pattern gain, which raises the target's return power, and from the larger number of degrees of freedom. However, once the interference signal decorrelates completely from one channel to the next, increasing the array size ceases to improve detection capability. Tapped delay-line processing to improve correlation between channels has been studied for smaller arrays with single-element antennas, but previous analyses have not considered larger systems that are partitioned into subarrays.
This thesis quantifies the effect that subarrays have on performance, as measured by the interference bandwidth that can be tolerated, and explains how tapped delay-line processing can maintain the ability to detect targets in an environment with high-bandwidth interference. The analysis begins by deriving equations to estimate the half-power bandwidth of an array with no taps. We then find that a single delay with optimal spacing is sufficient to completely restore performance if the interference angle is known exactly. In practice, however, the tap spacing will never be optimal because this angle will not be known exactly, so further consideration is given to this non-ideal case, and possible solutions for arbitrary interference scenarios are presented. Simulations indicate that systems with multiple taps are more tolerant of increasing interference bandwidth and unknown directions of arrival. Finally, the tradeoffs between ideal and practical configurations are explained, and suggestions are given for the design of real-world systems.
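The bandwidth-driven degradation described above can be illustrated with a minimal simulation: a flat-spectrum interferer decorrelates across the elements of a uniform linear array, which a sinc envelope on the inter-element delays captures, and the optimal output SINR falls as the fractional bandwidth grows. This is a hypothetical sketch, not the thesis's own model; the array geometry, angles, and interference-to-noise ratio are all assumed values chosen for illustration.

```python
import numpy as np

def ula_interference_sinr(n_elem=8, frac_bw_list=(0.0, 0.05, 0.1),
                          theta_s=0.0, theta_j=np.deg2rad(30.0),
                          inr_db=40.0, spacing=0.5):
    """Optimal SINR of an adaptive ULA versus interference fractional bandwidth.

    Illustrative model (all parameters are assumptions): a flat-spectrum
    interferer at angle theta_j decorrelates across elements, so its spatial
    covariance carries a sinc envelope on the inter-element delays.
    """
    pos = np.arange(n_elem) * spacing                  # positions in wavelengths
    s = np.exp(2j * np.pi * pos * np.sin(theta_s))     # desired-signal steering vector
    # pairwise inter-element delays toward the interferer, in carrier cycles
    tau = np.subtract.outer(pos, pos) * np.sin(theta_j)
    inr = 10.0 ** (inr_db / 10.0)
    sinr_db = {}
    for b in frac_bw_list:
        # flat-spectrum interference: sinc(b * tau) decorrelation envelope
        R_j = inr * np.sinc(b * tau) * np.exp(2j * np.pi * tau)
        R = R_j + np.eye(n_elem)                       # add unit-power noise
        w = np.linalg.solve(R, s)                      # un-normalized MVDR weights
        sinr_db[b] = 10.0 * np.log10(np.real(s.conj() @ w))  # SINR = s^H R^{-1} s
    return sinr_db
```

At zero bandwidth the interferer is perfectly nulled and the SINR approaches the full array gain; as the fractional bandwidth increases, the interference spreads over more spatial degrees of freedom and the SINR drops, which is the decorrelation effect the half-power-bandwidth analysis quantifies.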
Identifier | oai:union.ndltd.org:GATECH/oai:smartech.gatech.edu:1853/14468 |
Date | 09 April 2007 |
Creators | Wortham, Cody |
Publisher | Georgia Institute of Technology |
Source Sets | Georgia Tech Electronic Thesis and Dissertation Archive |
Detected Language | English |
Type | Thesis |