
Control of Articulated Robot Arm by Eye Tracking

Eye tracking has achieved comprehensive results in the field of human-computer interaction. Using the human eyes as an alternative to the hands is an innovative approach from a human-computer interaction perspective. Many applications for autonomous robot control have already been developed, but we developed two different interfaces to control an articulated robot manually. The first of these interfaces is controlled by mouse and the second by eye tracking. The main focus of our thesis is to assist people with motor disabilities by using their eyes as an input device instead of a mouse. The eye gaze tracking technique is used to send commands that perform different tasks. The interfaces are divided into active and inactive regions. Dwell time is a well-known technique used to execute commands through eye gaze instead of a mouse. When a user gazes at an active region for a specific dwell time, the corresponding command is executed and the robot performs a specific task. When inactive regions are gazed at, no command is executed and no function is performed. The difference between the time needed to perform a task with the mouse and with eye tracking was shown to be 40 ms, the mouse being faster. However, a mouse cannot be used by people with motor disabilities, so in this case the eye tracker has a decisive advantage.

Keywords: Eye tracking, Interface, Articulated robot
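The dwell-time selection mechanism described in the abstract can be sketched as follows. This is a minimal illustration, not the thesis authors' implementation: the region name, coordinates, and the 1.0-second dwell threshold are assumptions chosen for the example.

```python
from dataclasses import dataclass


@dataclass
class Region:
    """A rectangular active region on the interface."""
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1


class DwellSelector:
    """Fires a command when gaze stays inside one active region for at
    least `dwell_time` seconds; gaze over inactive areas (no region)
    resets the timer and triggers nothing, matching the behaviour the
    abstract describes."""

    def __init__(self, regions, dwell_time: float = 1.0):
        self.regions = regions
        self.dwell_time = dwell_time
        self._current = None      # region currently being fixated, if any
        self._entered_at = None   # timestamp when that fixation began

    def update(self, x: float, y: float, timestamp: float):
        """Feed one gaze sample; return the region name on activation, else None."""
        hit = next((r for r in self.regions if r.contains(x, y)), None)
        if hit is not self._current:
            # Gaze moved to a different region (or to an inactive area):
            # restart the dwell timer.
            self._current = hit
            self._entered_at = timestamp
            return None
        if hit is not None and timestamp - self._entered_at >= self.dwell_time:
            # Dwell threshold reached: fire once, then re-arm so the same
            # command repeats only after another full dwell.
            self._entered_at = timestamp
            return hit.name
        return None
```

A hypothetical active region named "move_left" would then be selected only after the gaze has rested inside it for the full dwell time, while samples falling outside any region execute nothing.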

Identifier: oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:bth-3096
Date: January 2010
Creators: Shahzad, Muhammad Imran; Mehmood, Saqib
Publisher: Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation
Source Sets: DiVA Archive at Upsalla University
Language: English
Detected Language: English
Type: Student thesis, info:eu-repo/semantics/bachelorThesis, text
Format: application/pdf
Rights: info:eu-repo/semantics/openAccess