The recent advent of consumer-grade Brain-Computer Interfaces (BCIs) provides a revolutionary new and accessible way to control computers. BCIs translate cognitive electroencephalography (EEG) signals into computer or robotic commands using specially built headsets. Capable of enhancing traditional interfaces that require interaction with a keyboard, mouse, or touchscreen, BCI systems present tremendous opportunities to benefit many fields; users with restricted movement stand to benefit the most from these interfaces. In this thesis, we present a new way to interface a consumer-grade BCI solution to a mobile robot. A Red-Green-Blue-Depth (RGBD) camera is used to enhance the robot's navigation, with cognitive thoughts serving as commands. We introduce an interface offering three robot-control methods: 1) a fully manual mode, where each cognitive signal is interpreted directly as a command; 2) a control-flow manual mode, which reduces the likelihood of false-positive commands; and 3) an automatic mode assisted by a remote RGBD camera. We study the application of this work by navigating the mobile robot on a planar surface under each control method while measuring the accuracy and usability of the system. Finally, we assess the newly designed interface's role in the design of future generations of BCI solutions.
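To illustrate the difference between the fully manual mode and the control-flow manual mode, the following is a minimal sketch in Python. It is not the thesis implementation: the event labels, command names, and the one-second hold threshold are assumptions made for the example. The idea is that the control-flow gate issues a command only once the same classified cognitive event has persisted for a short hold period, discarding brief misclassifications that would otherwise become false-positive commands.

    import time

    # Assumed mapping from classified cognitive EEG events to robot commands.
    EVENT_TO_COMMAND = {
        "push": "forward",
        "pull": "backward",
        "left": "turn_left",
        "right": "turn_right",
    }

    class ControlFlowGate:
        """Emit a command only after the same cognitive event has been
        observed continuously for hold_s seconds (control-flow manual mode)."""

        def __init__(self, hold_s=1.0):
            self.hold_s = hold_s
            self._pending = None   # event currently being held
            self._since = None     # time the pending event was first seen

        def update(self, event):
            """Feed one classified EEG event; return a robot command or None."""
            if event not in EVENT_TO_COMMAND:
                # Unknown or neutral event: reset the gate.
                self._pending, self._since = None, None
                return None
            now = time.monotonic()
            if event != self._pending:
                # New candidate event: start the hold timer.
                self._pending, self._since = event, now
                return None
            if now - self._since >= self.hold_s:
                # Event persisted long enough: emit the command once.
                self._pending, self._since = None, None
                return EVENT_TO_COMMAND[event]
            return None

    # The fully manual mode (mode 1) would simply be:
    #     command = EVENT_TO_COMMAND.get(event)
    # whereas the gate above sketches the false-positive-reducing mode 2.

In this sketch, the hold threshold trades responsiveness against robustness: a longer hold rejects more spurious classifications at the cost of slower command issuance.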
Identifier | oai:union.ndltd.org:uottawa.ca/oai:ruor.uottawa.ca:10393/22896
Date | January 2012
Creators | Jacques, Maxime |
Contributors | Petriu, Emil |
Publisher | Université d'Ottawa / University of Ottawa |
Source Sets | Université d’Ottawa |
Language | English |
Detected Language | English |
Type | Thesis |