
Robot-based haptic perception and telepresence for the visually impaired

With advances in medicine and welfare systems, the average human life span is increasing, creating a growing market for elderly care and assistive technology. Alongside assistive devices that build on traditional aids such as voice readers, electronic wheelchairs, and prosthetic limbs, robotic platforms are among the most suitable means of providing multi-purpose assistance in daily life. This research focuses on transferring environmental perception to a human user through interactive multi-modal feedback and an assistive robotic platform. A novel framework for haptic telepresence is presented to address this problem, drawing on state-of-the-art methodologies from computer vision, haptics, and robotics.

The objective of this research is to design a framework that 1) integrates visual perception from heterogeneous vision sensors, 2) enables real-time, interactive haptic representation of the real world through a mobile manipulation robot and a haptic interface, and 3) fuses multiple sensory modalities from the robotic platform into interactive feedback for the human user. Specifically, a set of multi-disciplinary algorithms, including stereo-vision processing, three-dimensional (3D) map building, and virtual-proxy based haptic volume representation, is integrated into a unified framework to accomplish this goal. The application area of this work is focused on, but not limited to, assisting people with visual impairment by providing multi-modal feedback about the environment through a robotic platform.
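The details of these algorithms are developed in the dissertation itself; as a rough, self-contained illustration of the virtual-proxy haptic rendering idea mentioned above (not taken from the thesis; the function name, stiffness value, and planar-contact simplification are assumptions), the following Python sketch constrains a proxy point to a locally planar surface while the haptic device tip penetrates it, and renders the contact as a spring force between the two.

```python
import numpy as np

def proxy_contact_force(device_pos, surface_point, surface_normal, stiffness=500.0):
    """Virtual-proxy (god-object) style rendering: the proxy stays on the
    object's surface while the device tip may penetrate it; a virtual
    spring between the two produces the feedback force."""
    n = surface_normal / np.linalg.norm(surface_normal)
    # Penetration depth of the device tip below the local surface plane.
    depth = np.dot(surface_point - device_pos, n)
    if depth <= 0.0:
        return np.zeros(3)                       # no contact, no force
    proxy_pos = device_pos + depth * n           # proxy constrained to the surface
    return stiffness * (proxy_pos - device_pos)  # spring pushes the tip back out

# Example: device tip 2 mm below a horizontal plane at z = 0.
force = proxy_contact_force(
    device_pos=np.array([0.0, 0.0, -0.002]),
    surface_point=np.array([0.0, 0.0, 0.0]),
    surface_normal=np.array([0.0, 0.0, 1.0]),
)
print(force)  # -> [0. 0. 1.] N with a 500 N/m spring
```

In a full system of the kind described here, the surface point and normal would come from the 3D map built from the robot's vision sensors rather than from a fixed plane.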

Identifier: oai:union.ndltd.org:GATECH/oai:smartech.gatech.edu:1853/44848
Date: 28 June 2012
Creators: Park, Chung Hyuk
Publisher: Georgia Institute of Technology
Source Sets: Georgia Tech Electronic Thesis and Dissertation Archive
Detected Language: English
Type: Dissertation
