
An Evaluation of Gaze and EEG-Based Control of a Mobile Robot

Context: Patients with diseases such as locked-in syndrome or motor neuron disease are paralyzed and require special care. To reduce the cost of their care, systems need to be designed that minimize human involvement and allow affected people to perform their daily activities independently. To assess the feasibility and robustness of combinations of input modalities, navigation of a mobile robot (Spinosaurus) is controlled by a combination of eye gaze tracking and other input modalities.

Objectives: Our aim is to control the robot using EEG brain signals and eye gaze tracking simultaneously. Different combinations of input modalities are used to control robot and turret movement, in order to determine which mapping of control technique to control command is most effective.

Methods: The method includes developing the interface and control software. An experiment involving 15 participants was conducted to evaluate control of the mobile robot using a combination of an eye tracker and other input modalities. Subjects were required to drive the mobile robot from a starting point to a goal along a pre-defined path. At the end of the experiment, a sense-of-presence questionnaire was distributed among the participants to collect their feedback. Finally, a qualitative pilot study was performed to find out how a low-cost commercial EEG headset, the Emotiv EPOC™, can be used for motion control of a mobile robot.

Results: Our results showed that the Mouse/Keyboard combination was the most effective for controlling robot motion and the turret-mounted camera, respectively. In the experimental evaluation, the Keyboard/Eye Tracker combination improved performance by 9%. 86% of participants found that the turret-mounted camera was useful and provided great assistance in robot navigation. Our qualitative pilot study of the Emotiv EPOC™ demonstrated different ways to train the headset for different actions.

Conclusions: In this study, we concluded that different combinations of control techniques can be used to control devices such as a mobile robot or a powered wheelchair. Gaze-based control was found to be comparable with the use of a mouse and keyboard; EEG-based control required substantial training time and was difficult to train. Our pilot study suggested that using facial expressions was an efficient and effective way to train the Emotiv EPOC™.
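To illustrate the kind of mapping from control technique to control command that the abstract describes, the sketch below shows one plausible way gaze input could drive robot motion while keyboard keys operate a turret-mounted camera. This is a minimal, hypothetical Python example; the class names, dead-zone threshold, and key bindings are assumptions for illustration and are not taken from the thesis's actual control software.

```python
# Hypothetical sketch of mapping gaze and keyboard input to robot/turret
# commands. All names and thresholds are illustrative assumptions, not the
# thesis authors' implementation.
from dataclasses import dataclass
from enum import Enum, auto


class Command(Enum):
    FORWARD = auto()
    BACKWARD = auto()
    TURN_LEFT = auto()
    TURN_RIGHT = auto()
    STOP = auto()


@dataclass
class GazeSample:
    x: float  # normalized horizontal gaze position, 0.0 (left) .. 1.0 (right)
    y: float  # normalized vertical gaze position, 0.0 (top) .. 1.0 (bottom)


def gaze_to_motion(sample: GazeSample, dead_zone: float = 0.15) -> Command:
    """Map a gaze point on the control screen to a motion command.

    Looking near the screen centre stops the robot; looking towards an
    edge drives or turns it in that direction.
    """
    dx, dy = sample.x - 0.5, sample.y - 0.5
    if abs(dx) < dead_zone and abs(dy) < dead_zone:
        return Command.STOP
    if abs(dx) > abs(dy):
        return Command.TURN_RIGHT if dx > 0 else Command.TURN_LEFT
    return Command.BACKWARD if dy > 0 else Command.FORWARD


# Assumed key bindings for the turret-mounted camera.
KEY_TO_TURRET = {
    "a": "pan_left",
    "d": "pan_right",
    "w": "tilt_up",
    "s": "tilt_down",
}


if __name__ == "__main__":
    # Example: gaze towards the right edge of the screen turns the robot
    # right, while a keyboard key pans the turret-mounted camera.
    print(gaze_to_motion(GazeSample(x=0.9, y=0.5)))  # Command.TURN_RIGHT
    print(KEY_TO_TURRET.get("a"))                    # pan_left
```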

Identifier oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:bth-4625
Date January 2011
Creators Khan, Mubasher Hassan; Laique, Tayyab
Publisher Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation
Source Sets DiVA Archive at Upsalla University
Language English
Detected Language English
Type Student thesis, info:eu-repo/semantics/bachelorThesis, text
Format application/pdf
Rights info:eu-repo/semantics/openAccess
