This report covers an investigation of the methods and algorithms required to plan and perform semi-autonomous photo missions on Apple iPad devices using data exported from Google Earth. Flight time was to be minimized, taking wind velocity and aircraft performance into account. Google Earth was used both to define which photos to take and to define the allowable mission area for the aircraft. A benchmark mission was created containing 30 photo operations in a 250 by 500 m area with several no-fly areas. The report demonstrates that photos taken in Google Earth can be reproduced in reality with good visual resemblance. High-quality paths between all possible pairs of photo operations in the benchmark mission could be found in seconds using the Theta* algorithm on a 3D grid representation with six-edge connectivity (up, down, north, south, east, west). Smoothing the path in a post-processing step was shown to further increase path quality at very low computational cost. An optimal route between the operations in the benchmark mission, using the paths found by Theta*, could be found in less than half a minute using a Branch-and-Bound algorithm. However, it was also found that terminating the algorithm prematurely after five seconds yielded a route close enough to optimal that running the algorithm to completion was not warranted.
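The following is a minimal sketch, not the thesis implementation, of the route-ordering step described above: a depth-first Branch-and-Bound search over a precomputed matrix of pairwise flight times (e.g. obtained from the Theta* paths), with an optional time limit to allow the premature termination discussed in the abstract. The function name, parameters, and cost-matrix layout are illustrative assumptions.

import time

def branch_and_bound_route(costs, start=0, time_limit=None):
    """Return (order, cost) for a route from `start` visiting every operation once.

    `costs[i][j]` is assumed to be the flight time of the precomputed path
    from operation i to operation j. `time_limit` (seconds) allows premature
    termination, returning the best route found so far.
    """
    n = len(costs)
    best_order, best_cost = None, float("inf")
    deadline = None if time_limit is None else time.monotonic() + time_limit

    def search(current, visited, order, cost_so_far):
        nonlocal best_order, best_cost
        if deadline is not None and time.monotonic() > deadline:
            return                      # out of time: keep the incumbent route
        if cost_so_far >= best_cost:
            return                      # bound: this branch cannot beat the incumbent
        if len(order) == n:
            best_order, best_cost = list(order), cost_so_far
            return
        # Branch on the remaining operations, trying the cheapest extensions first.
        for nxt in sorted(set(range(n)) - visited, key=lambda j: costs[current][j]):
            order.append(nxt)
            search(nxt, visited | {nxt}, order, cost_so_far + costs[current][nxt])
            order.pop()

    search(start, {start}, [start], 0.0)
    return best_order, best_cost

Calling branch_and_bound_route(costs) with no time limit explores the full search tree and returns an optimal visiting order, while branch_and_bound_route(costs, time_limit=5.0) mirrors the five-second cutoff evaluated in the report by returning the best route found within that budget.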
Identifier | oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:miun-31473 |
Date | January 2017 |
Creators | Nilsson, Per Johan Fredrik |
Publisher | Mittuniversitetet, Avdelningen för data- och systemvetenskap |
Source Sets | DiVA Archive at Uppsala University |
Language | English |
Detected Language | English |
Type | Student thesis, info:eu-repo/semantics/bachelorThesis, text |
Format | application/pdf |
Rights | info:eu-repo/semantics/openAccess |