Collaborative Human and Computer Controls of Smart Machines

Human-Machine Interaction (HMI) refers to a mechanism that supports direct interaction between humans and machines, with the objective of combining machine intelligence and autonomy. The demand to advance this field of intelligent controls is continuously growing. A Brain-Computer Interface (BCI) is one type of HMI that uses signals from the human brain to enable direct communication between a human subject and a machine. This technology is widely explored across fields as a means of controlling external devices with brain signals.

This thesis is driven by two key observations. The first is the limited number of Degrees of Freedom (DoF) that existing BCI controls can command in an external device, which makes it necessary to assess controllability when choosing a control instrument. The second is the mismatch between the decision spaces of human and machine when both attempt to control an external device. To close these gaps, an additional functional module is needed that can translate the commands issued by a human into the high-frequency control commands understood by machines. Neither aspect has been investigated thoroughly in the literature.

This study focuses on training, detecting, and using human intents to control intelligent machines. Brain signals recorded as Electroencephalography (EEG) are used to extract and classify human intents. The Emotiv Epoc X headset is selected for pattern training and recognition based on its controllability and features relative to other instruments. A functional module is then developed to bridge the frequency gap between human intents and the motion commands of the machine. A TinkerKit Braccio robotic arm is then used to demonstrate the feasibility of the developed module by controlling the arm entirely with human intents.

Multiple experiments were conducted on the prototyped system to verify the feasibility of the proposed model. The accuracy of sending each command, and hence of extracting each intent, exceeded 75%. The model was further tested by controlling the robot along pre-defined paths specified through a purpose-built Graphical User Interface (GUI). The accuracy of each of these experiments exceeded 90%, validating the feasibility of the proposed control model.
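To make the frequency-bridging idea concrete, the following is a minimal sketch, not the thesis's actual module: it illustrates how a low-frequency classified intent (e.g., one classification per second) could be expanded into a higher-frequency stream of incremental joint commands for a robotic arm. All rates, joint mappings, and function names here are illustrative assumptions, not Emotiv or Braccio APIs.

# Minimal sketch of a frequency-bridging module (hypothetical names/rates).
import time

INTENT_RATE_HZ = 1      # assumed rate at which EEG intents are classified
COMMAND_RATE_HZ = 20    # assumed rate at which the arm expects commands
STEP_DEG = 2            # assumed joint increment per command

# Map each discrete intent to a (joint index, direction) pair.
INTENT_TO_MOTION = {
    "left":  (0, -1),   # base joint, negative direction
    "right": (0, +1),
    "up":    (1, +1),   # shoulder joint
    "down":  (1, -1),
}

def bridge(intent, send_joint_command):
    """Translate one classified intent into a burst of motion commands."""
    if intent not in INTENT_TO_MOTION:
        return  # ignore unrecognized or neutral intents
    joint, direction = INTENT_TO_MOTION[intent]
    steps = COMMAND_RATE_HZ // INTENT_RATE_HZ  # commands per intent window
    for _ in range(steps):
        send_joint_command(joint, direction * STEP_DEG)
        time.sleep(1.0 / COMMAND_RATE_HZ)

if __name__ == "__main__":
    # Stand-in for a serial link to the arm; print instead of moving hardware.
    bridge("left", lambda j, d: print(f"joint {j}: {d:+d} deg"))

In this sketch each one-second intent window yields twenty small joint increments, so the arm receives a smooth command stream even though the human-side decision rate is low.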

DOI: 10.25394/pgs.24747216.v1
Identifier: oai:union.ndltd.org:purdue.edu/oai:figshare.com:article/24747216
Date: 07 December 2023
Creators: Hussein Bilal (17565258)
Source Sets: Purdue University
Detected Language: English
Type: Text, Thesis
Rights: CC BY 4.0
Relation: https://figshare.com/articles/thesis/_b_Collaborative_Human_and_Computer_Controls_of_Smart_Machines_b_/24747216