
Robot Localization Obtained by Using Inertial Measurements, Computer Vision, and Wireless Ranging

Robots have long been used to complete tasks that are too difficult, dangerous, or distant to be accomplished by humans. In many cases, these robots are highly specialized platforms, often expensive and capable of completing every task related to a mission's objective. An alternative approach is to use multiple platforms, each capable of fewer tasks and thus significantly less complex and less costly. With advancements in embedded computing and wireless communications, multiple such platforms have been shown to work together to accomplish mission objectives. In the extreme, collections of very simple robots have demonstrated emergent behavior akin to that seen in nature (e.g., bee colonies), motivating the moniker "swarm robotics": a group of robots working collaboratively to accomplish a task. Robotic swarms offer the potential to solve complex tasks more efficiently than a single robot by adding robustness and flexibility to the system.
This work investigates localization in heterogeneous, autonomous robotic swarms to improve their ability to carry out exploratory missions in unknown terrain. Collaboratively, these robots can, for example, sense and map an environment while simultaneously evolving a communication network. For this application, among many others, an accurate estimate of each robot's pose (i.e., position and orientation) is required; the act of determining this pose is known as localization. Some low-cost robots can provide location estimates using inertial measurements (i.e., odometry); however, this method alone is insufficient due to cumulative sensing errors. Image tracking and wireless localization methods are implemented in this work to increase the accuracy of localization estimates. These methods complement each other: image tracking yields higher accuracy than wireless localization but requires a line of sight (LOS) to the target, whereas wireless localization can operate under LOS or non-LOS conditions but suffers in multipath environments. Together, these methods can improve localization under all sight conditions. The specific contributions of this work are: (1) a concept of "shared sensing," in which extremely simple and inexpensive robots with unreliable localization estimates are used in a heterogeneous swarm in a way that increases localization accuracy for the simple agents while simultaneously extending the sensing capabilities of the more complex robots; (2) a description, evaluation, and discussion of various means of estimating a robot's pose; (3) a method for increasing the reliability of RSSI measurements for wireless ranging and localization by averaging RSSI measurements over both time and space; and (4) a process for developing an in-field model to be used for estimating the location of a robot by leveraging the existing wireless communication system.
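The ideas behind contributions (3) and (4) can be illustrated with a minimal sketch: RSSI samples are averaged over time (successive readings) and over space (readings taken from several positions or receivers), and the averaged value is inverted through a log-distance path-loss model to obtain a range estimate. The class, function, and parameter names below (RssiRanger, rssi_at_1m, path_loss_exponent) and the specific model form and values are assumptions for illustration only, not the thesis's actual in-field model.

```python
from collections import deque


class RssiRanger:
    """Sketch of RSSI-based ranging using a log-distance path-loss model.

    Parameters are placeholder values; in practice they would be fitted
    from in-field calibration measurements.
    """

    def __init__(self, window=10, rssi_at_1m=-45.0, path_loss_exponent=2.5):
        self.samples = deque(maxlen=window)  # sliding window for temporal averaging
        self.rssi_at_1m = rssi_at_1m         # assumed RSSI at 1 m reference distance (dBm)
        self.n = path_loss_exponent          # assumed path-loss exponent

    def add_sample(self, rssi_dbm):
        """Accumulate RSSI readings over time (temporal averaging)."""
        self.samples.append(rssi_dbm)

    def averaged_rssi(self):
        """Mean RSSI over the current window (assumes at least one sample)."""
        return sum(self.samples) / len(self.samples)

    def estimated_range_m(self):
        """Invert RSSI(d) = RSSI(1 m) - 10 * n * log10(d) to solve for d."""
        rssi = self.averaged_rssi()
        return 10 ** ((self.rssi_at_1m - rssi) / (10.0 * self.n))


def spatially_averaged_rssi(rangers):
    """Average RSSI across several receivers or measurement positions
    (spatial averaging), e.g., readings gathered as a robot moves."""
    return sum(r.averaged_rssi() for r in rangers) / len(rangers)


if __name__ == "__main__":
    # Hypothetical usage: feed periodic RSSI readings, then query a range estimate.
    ranger = RssiRanger()
    for rssi in (-61.0, -63.5, -59.8, -62.2):
        ranger.add_sample(rssi)
    print(f"estimated range: {ranger.estimated_range_m():.1f} m")
```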

Identifier: oai:union.ndltd.org:uvm.edu/oai:scholarworks.uvm.edu:graddis-1393
Date: 01 January 2015
Creators: Baker, William
Publisher: ScholarWorks @ UVM
Source Sets: University of Vermont
Language: English
Detected Language: English
Type: text
Format: application/pdf
Source: Graduate College Dissertations and Theses
