Approved for public release; distribution is unlimited.

The role of Unmanned Aerial Vehicles (UAVs) in modern military operations is expected to expand in the 21st century, including increased deployment of UAVs from Navy ships at sea. Autonomous operation of UAVs from ships at sea requires the UAV to land on a moving ship using only passive sensors installed in the UAV. This thesis investigates the feasibility of using passive vision sensors installed in the UAV to estimate the UAV's position relative to the moving platform. A navigation algorithm based on photogrammetry and perspective estimation is presented for numerically determining the relative position and orientation of an aircraft with respect to a ship that possesses three visually significant points with known separation distances. Original image processing algorithms that reliably locate visually significant features in monochrome images are developed. Monochrome video imagery, collected during flight testing with an infrared video camera mounted in the nose of a UAV on actual landing approaches, is presented. The navigation and image processing algorithms are combined to reduce the flight test images into vehicle position estimates. These position estimates are compared to truth data to demonstrate the feasibility of passive, vision-based sensors for aircraft navigation. Conclusions are drawn, and recommendations for further study are presented.
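The abstract does not detail the thesis's actual algorithm, but the core idea it names (numerically recovering relative position and orientation from the perspective projection of three reference points with known separation) can be illustrated with a minimal sketch. The ship point coordinates, focal length, initial guess, and function names below are illustrative assumptions, not values from the thesis; the sketch solves a generic perspective pose problem by nonlinear least squares.

```python
# Minimal sketch (assumed setup, not the thesis's algorithm): estimate the
# camera (UAV) pose relative to three ship-fixed reference points of known
# separation, given their measured image coordinates, using a pinhole
# perspective model and nonlinear least squares.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

# Three reference points on the ship, in ship-fixed coordinates (meters); illustrative values.
SHIP_POINTS = np.array([[0.0, 0.0, 0.0],
                        [10.0, 0.0, 0.0],
                        [0.0, 6.0, 0.0]])

FOCAL_PX = 800.0  # assumed camera focal length, in pixels


def project(points_cam, f=FOCAL_PX):
    """Pinhole projection of camera-frame points onto the image plane (pixels)."""
    return f * points_cam[:, :2] / points_cam[:, 2:3]


def residuals(pose, image_pts):
    """Reprojection error for pose = [roll, pitch, yaw, tx, ty, tz]."""
    rot = Rotation.from_euler("xyz", pose[:3])
    points_cam = rot.apply(SHIP_POINTS) + pose[3:]
    return (project(points_cam) - image_pts).ravel()


def estimate_pose(image_pts, x0=None):
    """Numerically solve for the camera pose that best explains the image points."""
    if x0 is None:
        x0 = np.array([0.0, 0.0, 0.0, 0.0, 0.0, 50.0])  # rough initial guess
    return least_squares(residuals, x0, args=(image_pts,)).x


if __name__ == "__main__":
    # Synthesize image measurements from a known "true" pose, then recover it.
    true_pose = np.array([0.05, -0.10, 0.02, -3.0, 1.5, 60.0])
    rot = Rotation.from_euler("xyz", true_pose[:3])
    observed = project(rot.apply(SHIP_POINTS) + true_pose[3:])
    print("estimated pose:", estimate_pose(observed))
```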
Identifier | oai:union.ndltd.org:nps.edu/oai:calhoun.nps.edu:10945/7829
Date | 09 1900 |
Creators | Ghyzel, Paul A. |
Contributors | Kaminer, Isaac I., Yakimenko, Oleg A. |
Publisher | Monterey, California. Naval Postgraduate School |
Source Sets | Naval Postgraduate School |
Language | en_US |
Detected Language | English |
Type | Thesis |