Unmanned Aerial Vehicles (UAVs) are increasingly becoming economical platforms for carrying a variety of sensors. Building flight plans that place sensors properly, both temporally and spatially, is difficult. The goal of sensor-driven planning is to automatically generate flight plans from desired sensor placements and temporal constraints. We propose a simple taxonomy of UAV-enabled sensors, identify a set of generic sensor tasks, and argue that many real-world tasks can be represented within the taxonomy. We present a hierarchical sensor-driven flight planning system capable of generating 2D flights that satisfy desired sensor placements and complex timing and dependency constraints. The system makes use of several well-known planning algorithms and includes a user interface. We conducted a user study whose results support the claims that sensor-driven planning is usable by non-experts, that non-experts find it easier than traditional waypoint-based planning, and that it produces better flights than waypoint-based planning.
Identifier | oai:union.ndltd.org:BGMYU2/oai:scholarsarchive.byu.edu:etd-4797 |
Date | 23 September 2013 |
Creators | Clark, Spencer James |
Publisher | BYU ScholarsArchive |
Source Sets | Brigham Young University |
Detected Language | English |
Type | text |
Format | application/pdf |
Source | Theses and Dissertations |
Rights | http://lib.byu.edu/about/copyright/ |