
Real-Time Gesture-Based Posture Control of a Manipulator

Reaching a target quickly and accurately with a multi-joint robotic arm while avoiding fixed and moving obstacles can be a daunting (and sometimes impossible) task for a user operating a remote control. Existing solutions are often hard to use and difficult to scale across user body types and robotic arm configurations. In this work, we propose a vision-based gesture recognition approach that naturally controls the overall posture of a robotic arm from human hand gestures, combined with an inverse kinematic exploration approach based on the FABRIK algorithm. Three methods are investigated to intuitively control a robotic arm's posture in real time using depth data collected by a Kinect sensor. Each posture control method scales to different users and is compatible with most existing robotic arm configurations.

In the first method, the position of the user's right index fingertip is mapped to the robot to compute inverse kinematics, and the resulting solutions are displayed in a graphical interface. Using this interface and the left hand, the user can intuitively browse and select a desired robotic arm posture. In the second method, the position and direction of the user's right index finger determine, respectively, the end-effector position and the position of an attraction point; the latter enables control of the robotic arm posture. In the third method, the user's right index finger is again mapped to compute inverse kinematics on the robot. Using a static gesture with the same hand, the right index finger can be turned into a virtual pen that traces the shape of the desired robotic arm posture, with the trace visualized in real time in a graphical interface. A search combining inverse kinematic exploration with the Dynamic Time Warping algorithm then selects the closest matching feasible posture. For the last two methods, different search strategies are proposed to optimize speed and the coverage of the inverse kinematic exploration. Combining Greedy Best-First search with an efficient selection of input postures based on the FABRIK algorithm's characteristics, these optimizations allow smoother and more accurate posture control of the robotic arm.

The performance of these real-time natural human control approaches is evaluated for precision and speed against static (i.e., fixed) and dynamic (i.e., moving) obstacles in a simulated experiment. The vision-based gesture recognition system was also adapted to operate the AL5D robotic arm for further evaluation in a real-world environment. The results showed that the first and third methods were better suited for obstacle avoidance in static environments that do not require continuous posture changes. The second method gave excellent results in the dynamic environment experiment and completed a challenging pick-and-place task in a difficult real-world environment with static constraints.
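As a rough illustration of the inverse kinematic exploration underlying these methods, the sketch below shows a minimal FABRIK (Forward And Backward Reaching Inverse Kinematics) solver in Python. The function name, array layout, and numeric example are illustrative assumptions, not the implementation used in the thesis; the thesis additionally layers posture search and gesture input on top of the basic solver.

```python
import numpy as np

def fabrik(joints, lengths, target, tol=1e-3, max_iter=100):
    """Minimal FABRIK sketch (assumed interface, not the thesis code).

    joints  : (n, 3) array of joint positions; joints[0] is the fixed base.
    lengths : (n-1,) segment lengths between consecutive joints.
    target  : (3,) desired end-effector position.
    """
    joints = np.asarray(joints, dtype=float).copy()
    target = np.asarray(target, dtype=float)
    base = joints[0].copy()

    # Target out of reach: stretch the chain straight toward it and stop.
    if np.linalg.norm(target - base) > lengths.sum():
        for i in range(len(lengths)):
            d = np.linalg.norm(target - joints[i])
            joints[i + 1] = joints[i] + (target - joints[i]) * lengths[i] / d
        return joints

    for _ in range(max_iter):
        # Backward pass: place the end effector on the target,
        # then pull each preceding joint back along the chain.
        joints[-1] = target
        for i in range(len(joints) - 2, -1, -1):
            d = np.linalg.norm(joints[i + 1] - joints[i])
            joints[i] = joints[i + 1] + (joints[i] - joints[i + 1]) * lengths[i] / d

        # Forward pass: re-anchor the base, then push joints outward again.
        joints[0] = base
        for i in range(len(joints) - 1):
            d = np.linalg.norm(joints[i + 1] - joints[i])
            joints[i + 1] = joints[i] + (joints[i + 1] - joints[i]) * lengths[i] / d

        if np.linalg.norm(joints[-1] - target) < tol:
            break
    return joints


if __name__ == "__main__":
    # Hypothetical 3-link arm reaching a fingertip-derived target.
    joints = np.array([[0., 0., 0.], [1., 0., 0.], [2., 0., 0.], [3., 0., 0.]])
    lengths = np.array([1., 1., 1.])
    print(fabrik(joints, lengths, np.array([1.5, 1.5, 0.5])))
```

Because each FABRIK run starts from an initial chain configuration, feeding the solver different input postures yields different valid solutions for the same target, which is the property the posture-exploration and search strategies described above exploit.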

Identifier: oai:union.ndltd.org:uottawa.ca/oai:ruor.uottawa.ca:10393/40096
Date: 20 January 2020
Creators: Plouffe, Guillaume
Contributors: Payeur, Pierre; Cretu, Ana-Maria
Publisher: Université d'Ottawa / University of Ottawa
Source Sets: Université d'Ottawa
Language: English
Detected Language: English
Type: Thesis
Format: application/pdf
