Unmanned air vehicles (UAVs) are attracting increased attention as their envelope of suitable tasks expands to include activities such as perimeter tracking, search and rescue assistance, surveillance, and reconnaissance. The simplified goal of many of these tasks is to image an object for tracking or information-gathering purposes. The ability to determine the inertial location of a visible, ground-based object without requiring a priori knowledge of its exact location would therefore prove beneficial. This thesis discusses a method of localizing a ground-based object when it is imaged from a fixed-wing UAV. Using the target's pixel location in an image, together with measurements of UAV position, attitude, and camera pose angles, the target is localized in world coordinates. This thesis also presents a study of possible error sources and of the localization sensitivity to each source. From this study, an accuracy within 15.5 m of the actual target location is expected. Several filtering methods are also presented that enable effective noise reduction. Finally, filtered hardware results are presented that verify these expectations by localizing a target from a fixed-wing UAV, using on-board vision, to within 10.9 m.
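The geometry the abstract describes can be sketched in a heavily simplified form: form a line-of-sight vector from the target's pixel offset and the camera focal length, express it in the inertial frame, and intersect it with the ground plane. This is only an illustrative sketch, not the thesis's actual algorithm: it assumes a flat-earth ground plane, a camera frame already rotated into NED coordinates (the full method composes UAV attitude and gimbal rotations), and the function names are hypothetical.

```python
import math

def los_camera_frame(px, py, f):
    # Unit line-of-sight vector in the camera frame, built from the
    # target's pixel offsets (px, py) relative to the image center and
    # the focal length f, all in pixels. The camera z-axis is taken to
    # point along the optical axis.
    n = math.sqrt(px * px + py * py + f * f)
    return (px / n, py / n, f / n)

def localize_flat_earth(uav_ned, los_ned):
    # Intersect the inertial line-of-sight ray with a flat ground plane
    # (down coordinate = 0 in NED). uav_ned is the UAV position
    # (north, east, down), with the down component negative above
    # ground; los_ned is a unit line-of-sight vector already expressed
    # in the NED frame (the real method rotates it there using UAV
    # attitude and camera gimbal angles).
    pn, pe, pd = uav_ned
    ln, le, ld = los_ned
    if ld <= 0:
        raise ValueError("line of sight does not intersect the ground")
    s = -pd / ld  # distance along the ray to the ground plane
    return (pn + s * ln, pe + s * le, 0.0)
```

For example, a UAV at 100 m altitude with a line of sight of (0.6, 0.0, 0.8) in NED reaches the ground 75 m north of the point below it, since s = 100 / 0.8 = 125 and 0.6 × 125 = 75. Each pose or pixel measurement feeding this computation carries error, which is the motivation for the sensitivity study and filtering the abstract mentions.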
Identifier | oai:union.ndltd.org:BGMYU2/oai:scholarsarchive.byu.edu:etd-1557 |
Date | 07 July 2005 |
Creators | Redding, Joshua D. |
Publisher | BYU ScholarsArchive |
Source Sets | Brigham Young University |
Detected Language | English |
Type | text |
Format | application/pdf |
Source | Theses and Dissertations |
Rights | http://lib.byu.edu/about/copyright/ |