
BRAIN-COMPUTER INTERFACE FOR SUPERVISORY CONTROLS OF UNMANNED AERIAL VEHICLES

This research explored a solution to the high accident rate in remotely operated Unmanned Aerial Vehicles (UAVs) in complex environments; it presented a new Brain-Computer Interface (BCI) enabled supervisory control system that fuses human and machine intelligence seamlessly. The study was motivated by the critical need to enhance the safety and reliability of UAV operations, where accidents often stem from human error during manual control. Existing BCIs face the challenge of trading off fully manual control by a human operator against fully automated control by a computer. This study addressed that challenge with the proposed supervisory control system, which optimizes human-machine collaboration while prioritizing safety, adaptability, and precision of operation.

The research work included designing, training, and testing the BCI and the BCI-enabled control system. The system was customized to control a UAV, monitoring the user's motion intents and cognitive states to implement hybrid human and machine control. The DJI Tello drone was used as the intelligent machine to illustrate the application of the proposed control system and to evaluate its effectiveness through two case studies. The first case study trained a subject and assessed the confidence level of the BCI in capturing and classifying the subject's motion intents. The second case study illustrated the application of the BCI in controlling the drone to fulfill its missions.

The proposed supervisory control system advanced cognitive state monitoring by leveraging the power of a machine learning (ML) model. Compared with conventional methods, this model was innovative in that it captured complicated patterns within raw EEG data and made decisions using an ensemble learning strategy based on XGBoost. A key innovation was capturing the user's intents and interpreting them into control commands using the EmotivBCI app. Despite the headset's predefined set of detectable features, the system could train the user's mind to generate control commands for all six degrees of freedom of the quadcopter by creatively combining and extending mental commands, particularly for yaw rotation. This strategic manipulation of commands showcased the system's flexibility in accommodating the intricate control requirements of an automated machine.

Another innovation of the proposed system was its real-time adaptability. The supervisory control system continuously monitored the user's cognitive state, allowing instantaneous adjustments in response to changing conditions. This ensured that the control system was responsive to the user's intent and adept at prioritizing safety through the arbitration mechanism when necessary.
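
The abstract describes an ensemble learning strategy with XGBoost for classifying motion intents from EEG data. The following is a minimal sketch of that kind of pipeline, not the thesis code: the feature layout (14 channels × 5 band powers), the seven classes (six motion intents plus a neutral state), the hyperparameters, and the synthetic placeholder data are all assumptions for illustration.

```python
# Sketch: multi-class XGBoost classifier for windowed EEG feature vectors.
# All data and hyperparameters below are illustrative placeholders.
import numpy as np
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder data: 600 windows x 70 features (e.g. 14 channels x 5 band powers),
# labeled with 7 classes (six motion intents plus a neutral/rest state).
X = rng.normal(size=(600, 70))
y = rng.integers(0, 7, size=600)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

clf = XGBClassifier(
    n_estimators=200,        # number of boosted trees in the ensemble
    max_depth=4,
    learning_rate=0.1,
    objective="multi:softprob",
    eval_metric="mlogloss",
)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

With real EEG features in place of the synthetic arrays, the same `fit`/`score` flow would give the per-subject confidence assessment described in the first case study.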
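The abstract also describes mapping decoded mental commands to drone controls and arbitrating them against the user's cognitive state. The sketch below illustrates one way such a supervisory layer could look; it is not the thesis implementation. The djitellopy library, the command labels, the movement distances and angles, and the attention/confidence thresholds are assumptions.

```python
# Sketch: forward decoded mental commands to a DJI Tello only when a simple
# arbitration check on classifier confidence and attention passes.
from djitellopy import Tello

# Hypothetical label-to-action table covering the six degrees of freedom;
# yaw is handled by repurposing two extra mental commands, in the spirit of
# the combined/extended commands described in the abstract.
ACTIONS = {
    "push":       lambda t: t.move_forward(30),   # cm
    "pull":       lambda t: t.move_back(30),
    "left":       lambda t: t.move_left(30),
    "right":      lambda t: t.move_right(30),
    "lift":       lambda t: t.move_up(30),
    "drop":       lambda t: t.move_down(30),
    "rotate_cw":  lambda t: t.rotate_clockwise(45),          # degrees
    "rotate_ccw": lambda t: t.rotate_counter_clockwise(45),
}

ATTENTION_THRESHOLD = 0.6   # assumed cut-off for the cognitive-state monitor
CONFIDENCE_THRESHOLD = 0.5  # assumed cut-off for the intent classifier


def supervise(drone: Tello, command: str, confidence: float, attention: float) -> bool:
    """Execute a decoded mental command only if the arbitration check passes.

    Returns True if the command was forwarded to the drone, False if the
    supervisory layer withheld it (low classifier confidence or low attention).
    """
    if attention < ATTENTION_THRESHOLD or confidence < CONFIDENCE_THRESHOLD:
        # Safety takes priority: withhold a doubtful command and let the
        # drone hold its hover instead of acting on it.
        return False
    action = ACTIONS.get(command)
    if action is None:
        return False
    action(drone)
    return True


if __name__ == "__main__":
    drone = Tello()
    drone.connect()
    drone.takeoff()
    # Example: a decoded "push" intent with high confidence and attention.
    supervise(drone, "push", confidence=0.9, attention=0.8)
    drone.land()
```

The design choice here is that the supervisory layer sits between decoding and actuation, so safety overrides never depend on the user issuing a correct command first.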

  1. 10.25394/pgs.25219850.v1
Identifier: oai:union.ndltd.org:purdue.edu/oai:figshare.com:article/25219850
Date: 15 February 2024
Creators: Abdelrahman Osama Gad (17965229)
Source Sets: Purdue University
Detected Language: English
Type: Text, Thesis
Rights: CC BY 4.0
Relation: https://figshare.com/articles/thesis/BRAIN-COMPUTER_INTERFACE_FOR_SUPERVISORY_CONTROLS_OF_UNMANNED_AERIAL_VEHICLES/25219850