TY - GEN
T1 - A Brain-Computer Interface Drone Control System Based on Human Cognitive Assessment
AU - Lu, Pei Shin
AU - Ko, Li Wei
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - This paper presents the design of a brain-computer interface (BCI) system that receives and analyzes electroencephalography (EEG) signals to determine commands for drone control. To achieve this, we used an 8-channel wearable EEG headset (Fp1, Fp2, Fz, C3, C4, Pz, O1, O2) to establish a personal database. It is primarily used to classify the intention of left or right eye movements, allowing the subject to control the drone's direction. The model was applied during real-time testing. The data underwent preprocessing using a bandpass filter. The system recorded the subject's relaxed EEG data for 10 seconds to establish a baseline, which was subsequently used to determine the related drone commands. After the drone took off automatically, we recorded and analyzed the brainwave state every 3 seconds. We then transmitted the corresponding action commands to the DJI Tello drone to control its movements. Currently, we have designed seven different action commands based on EEG data for drone movements: forward, backward, upward, downward, leftward, rightward, and landing. The accuracy of the control commands is approximately 70%. The system aims to make drone operation accessible to anyone. It is expected that this system can be applied in fields such as neurorehabilitation, exoskeleton manipulation, and the defense industry in the future.
AB - This paper presents the design of a brain-computer interface (BCI) system that receives and analyzes electroencephalography (EEG) signals to determine commands for drone control. To achieve this, we used an 8-channel wearable EEG headset (Fp1, Fp2, Fz, C3, C4, Pz, O1, O2) to establish a personal database. It is primarily used to classify the intention of left or right eye movements, allowing the subject to control the drone's direction. The model was applied during real-time testing. The data underwent preprocessing using a bandpass filter. The system recorded the subject's relaxed EEG data for 10 seconds to establish a baseline, which was subsequently used to determine the related drone commands. After the drone took off automatically, we recorded and analyzed the brainwave state every 3 seconds. We then transmitted the corresponding action commands to the DJI Tello drone to control its movements. Currently, we have designed seven different action commands based on EEG data for drone movements: forward, backward, upward, downward, leftward, rightward, and landing. The accuracy of the control commands is approximately 70%. The system aims to make drone operation accessible to anyone. It is expected that this system can be applied in fields such as neurorehabilitation, exoskeleton manipulation, and the defense industry in the future.
KW - Brain-computer interface
KW - Drone control
KW - Electroencephalography
KW - Machine Learning
UR - http://www.scopus.com/inward/record.url?scp=85202442405&partnerID=8YFLogxK
U2 - 10.1109/ICSSE61472.2024.10608858
DO - 10.1109/ICSSE61472.2024.10608858
M3 - Conference contribution
AN - SCOPUS:85202442405
T3 - 2024 International Conference on System Science and Engineering, ICSSE 2024
BT - 2024 International Conference on System Science and Engineering, ICSSE 2024
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2024 International Conference on System Science and Engineering, ICSSE 2024
Y2 - 26 June 2024 through 28 June 2024
ER -