
Graph-based world-model for robotic manipulation

There has been a significant push in robotics research toward robot autonomy, yet full autonomy remains impractical for all but the most clearly defined tasks in the most structured environments. As tasks become less defined and environments become cluttered and less controlled, there is still a benefit to implementing semi-autonomous behaviors in which aspects of a task are completed autonomously, reducing the burden on the human operator. A key component of a robot control system that supports this functionality is a robust world model that acts as a repository of environmental information.
The research community has produced many world-modeling solutions in support of autonomous vehicle navigation. These solutions focus primarily on preventing collisions with the environment. Modeling schemes designed for collision prevention are of limited use to robotic manipulators, which must make contact with the environment as a matter of course.
This thesis presents a world-modeling scheme that abstracts the model of the environment into a graph structure. This abstraction separates the concept of entities in the environment from their relationships to the environment. The result is an intuitive world model that supports not only collision detection but also motion planning and grasping. The graph-based world model can be searched by semantic type and tag values, allowing any number of agents to simultaneously use and update the model without causing failures elsewhere in the system. These capabilities are demonstrated on two different automated hot-cell glovebox systems and on a mobile manipulation system for use in remote contamination testing.
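To make the abstraction concrete, the core idea can be sketched as a small graph data structure: entities are nodes carrying a semantic type and arbitrary tags, and relationships are edges between them, with queries by type or tag value. This is a hypothetical illustration in the spirit of the abstract; the class and method names below are not the thesis's actual API.

```python
# Illustrative sketch (not the thesis's implementation): entities are
# nodes with a semantic type and key/value tags; relationships are
# (subject, predicate, object) edges; the model is queried by type/tags.
from dataclasses import dataclass, field


@dataclass
class Entity:
    name: str
    semantic_type: str                      # e.g. "container", "surface"
    tags: dict = field(default_factory=dict)


class WorldModel:
    def __init__(self):
        self.entities = {}                  # name -> Entity
        self.relations = []                 # (subject, predicate, object)

    def add_entity(self, entity):
        self.entities[entity.name] = entity

    def relate(self, subject, predicate, obj):
        # Record a relationship edge, e.g. ("beaker_1", "on", "table").
        self.relations.append((subject, predicate, obj))

    def find(self, semantic_type=None, **tags):
        """Search entities by semantic type and/or tag values."""
        results = []
        for e in self.entities.values():
            if semantic_type and e.semantic_type != semantic_type:
                continue
            if all(e.tags.get(k) == v for k, v in tags.items()):
                results.append(e)
        return results


wm = WorldModel()
wm.add_entity(Entity("beaker_1", "container", {"graspable": True}))
wm.add_entity(Entity("table", "surface"))
wm.relate("beaker_1", "on", "table")
print([e.name for e in wm.find("container", graspable=True)])  # prints ['beaker_1']
```

Because queries go through type and tag lookups rather than direct references, one agent (e.g. a grasp planner) can search for graspable containers while another updates poses or relations, which is the decoupling the abstract attributes to the graph-based design.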

Identifier: oai:union.ndltd.org:UTEXAS/oai:repositories.lib.utexas.edu:2152/ETD-UT-2010-08-1734
Date: 23 December 2010
Creators: O'Neil, Brian Erick, 1978-
Source Sets: University of Texas
Language: English
Detected Language: English
Type: thesis
Format: application/pdf
