
Vision-Based Precision Landings of a Tailsitter UAV

We present a method of performing precision landings of a vertical take-off and landing (VTOL) unmanned air vehicle (UAV) using an onboard vision sensor and information about the aircraft's orientation and altitude above ground level (AGL). A method for calculating the 3-dimensional location of the UAV relative to a ground target of interest is presented, along with a navigational controller that positions the UAV above the target. A method is also presented to prevent the UAV from moving in a way that would cause the ground target to leave the field of view of the UAV's onboard camera. These methods are tested in simulation and in hardware, and the resulting data are shown. Hardware flight testing yielded an average position estimation error of 22 centimeters. The method presented is capable of performing precision landings of VTOL UAVs with submeter accuracy.
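To illustrate the kind of computation the abstract describes, the sketch below estimates the UAV's horizontal offset from a ground target by back-projecting the target's pixel location through the camera, rotating the resulting ray into the local frame using the aircraft's attitude, and scaling it by the AGL altitude under a flat-ground assumption. This is a standard geometric construction, not the thesis's exact formulation; the function name, frames, and parameters are illustrative assumptions.

```python
# Minimal sketch (assumptions labeled): relative target position from a pixel
# measurement, camera intrinsics, aircraft attitude, and AGL altitude,
# assuming a flat ground plane. Not the thesis's exact method.
import numpy as np

def target_offset_ned(pixel, K, R_body_to_ned, R_cam_to_body, agl):
    """Return the (north, east) offset in meters from the UAV to the target.

    pixel          -- (u, v) image coordinates of the target centroid
    K              -- 3x3 camera intrinsic matrix
    R_body_to_ned  -- rotation from body frame to local NED frame (from attitude)
    R_cam_to_body  -- fixed rotation from camera frame to body frame (mounting)
    agl            -- altitude above ground level, meters
    """
    # Back-project the pixel into a unit-depth ray in the camera frame.
    ray_cam = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    # Rotate the ray into the local NED frame using the mounting and attitude rotations.
    ray_ned = R_body_to_ned @ (R_cam_to_body @ ray_cam)
    # Scale the ray so its down component equals the AGL altitude (flat-ground
    # assumption); the scaled ray is then the offset from UAV to target.
    scale = agl / ray_ned[2]
    offset = scale * ray_ned
    return offset[0], offset[1]
```

A navigational controller like the one described could then drive these north/east offsets toward zero before descending, keeping the target near the image center so it stays within the camera's field of view.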

Identifier: oai:union.ndltd.org:BGMYU2/oai:scholarsarchive.byu.edu:etd-2960
Date: 24 November 2009
Creators: Millet, Paul Travis
Publisher: BYU ScholarsArchive
Source Sets: Brigham Young University
Detected Language: English
Type: text
Format: application/pdf
Source: Theses and Dissertations
Rights: http://lib.byu.edu/about/copyright/
