The thesis proposes a 3D navigation and planning system for an autonomous, remotely controlled quadcopter (drone). The system relies on the drone's on-board sensor data together with data extracted from the video camera image stream, with no prior knowledge of its surroundings and without any navigation signal (GPS). The camera data are transformed into a sparse point-cloud representation, from which an occupancy map of the surrounding area with adaptive cell size is built. A planner constructs trajectory plans in this map while respecting the detected obstacles, and the planned trajectory is executed by a simple drone controller. The proposed system also includes a simulator that enables virtual execution of the whole pipeline. The thesis combines originally independent and mutually incompatible subsystems into a single, compactly working system. Its functionality is demonstrated on a few simple scenarios, one of which is the return of the drone to its starting location.
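The record itself contains no code, but the pipeline the abstract describes (sparse point cloud, occupancy map with adaptive cell size, obstacle-respecting trajectory plan) can be illustrated with a minimal sketch. The sketch below is not taken from the thesis: the names `OctreeMap` and `plan`, the parameters `MIN_CELL` and `WORLD_SIZE`, and the choice of an octree plus a breadth-first lattice search are assumptions made here to show one plausible shape of such a system.

```python
# Illustrative sketch only; all names and parameters are assumptions,
# not the thesis's actual implementation.
from collections import deque

MIN_CELL = 0.5      # finest cell edge length in metres (assumed value)
WORLD_SIZE = 32.0   # edge length of the cubic mapped volume (assumed value)


class OctreeMap:
    """Occupancy map with adaptive cell size: cells subdivide only near points."""

    def __init__(self, origin=(0.0, 0.0, 0.0), size=WORLD_SIZE):
        self.origin = origin
        self.size = size
        self.children = None   # eight sub-cells once this cell is subdivided
        self.occupied = False  # meaningful only for finest (leaf) cells

    def insert(self, p):
        """Mark the finest cell containing point p of the sparse cloud as occupied."""
        if self.size <= MIN_CELL:
            self.occupied = True
            return
        if self.children is None:
            half = self.size / 2.0
            self.children = [
                OctreeMap((self.origin[0] + dx * half,
                           self.origin[1] + dy * half,
                           self.origin[2] + dz * half), half)
                for dx in (0, 1) for dy in (0, 1) for dz in (0, 1)
            ]
        self._child_containing(p).insert(p)

    def _child_containing(self, p):
        half = self.size / 2.0
        idx = (4 * (p[0] >= self.origin[0] + half)
               + 2 * (p[1] >= self.origin[1] + half)
               + (p[2] >= self.origin[2] + half))
        return self.children[idx]

    def is_occupied(self, p):
        """Descend to the deepest existing cell containing p and report its state."""
        node = self
        while node.children is not None:
            node = node._child_containing(p)
        return node.occupied


def plan(omap, start, goal, step=MIN_CELL):
    """Breadth-first search over a uniform lattice, skipping occupied cells."""
    def to_cell(p):
        return tuple(round(c / step) for c in p)

    def to_point(cell):
        return tuple(i * step for i in cell)

    start_c, goal_c = to_cell(start), to_cell(goal)
    frontier = deque([start_c])
    parent = {start_c: None}
    moves = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while frontier:
        cell = frontier.popleft()
        if cell == goal_c:               # reconstruct the waypoint sequence
            path = []
            while cell is not None:
                path.append(to_point(cell))
                cell = parent[cell]
            return path[::-1]
        for dx, dy, dz in moves:
            nxt = (cell[0] + dx, cell[1] + dy, cell[2] + dz)
            if nxt not in parent and not omap.is_occupied(to_point(nxt)):
                parent[nxt] = cell
                frontier.append(nxt)
    return None                           # no collision-free path found


if __name__ == "__main__":
    omap = OctreeMap()
    # A thin "wall" of sparse points blocking the direct route.
    for y in range(20):
        for z in range(10):
            omap.insert((10.0, y * 0.5, z * 0.5))
    path = plan(omap, start=(2.0, 5.0, 2.0), goal=(20.0, 5.0, 2.0))
    print("waypoints:", len(path) if path else "no path")
```

In this reading, "adaptive cell size" simply means that cells are subdivided down to the finest resolution only where the sparse cloud actually contains points; elsewhere large unsubdivided cells stand in for free space, keeping the map compact.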
Identifier | oai:union.ndltd.org:nusl.cz/oai:invenio.nusl.cz:365189 |
Date | January 2017 |
Creators | Harasim, Jiří |
Contributors | Barták, Roman, Obdržálek, David |
Source Sets | Czech ETDs |
Language | English |
Detected Language | English |
Type | info:eu-repo/semantics/masterThesis |
Rights | info:eu-repo/semantics/restrictedAccess |