TY - JOUR
T1 - RNN Classification of Spectral EEG/EMG Data Associated with Facial Movements for Drone Control
AU - Adebile, Akintomide
AU - Mehrzed, Shaen
AU - Hicks, Destinee
AU - Hale, Joshua
AU - Wiltz, Darryl
AU - Alba-Flores, Rocio
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - This research involves developing a drone control system that functions by relating EEG and EMG signals from the forehead to different facial movements using recurrent neural networks (RNNs) such as long short-term memory (LSTM) and gated recurrent unit (GRU) networks. Because current drone control methods are largely limited to handheld devices, operators are actively engaged while flying and cannot perform any passive control. Passive control of drones would prove advantageous in various applications, as drone operators could focus on additional tasks. The advantages of the chosen methods and those of some alternative system designs are discussed. For this research, EEG signals were acquired at three frontal cortex locations (Fp1, Fpz, and Fp2) using electrodes from an OpenBCI headband and observed for patterns in Fast Fourier Transform (FFT) frequency-amplitude distributions. Five different facial expressions were repeated while recording EEG signals in the 0-60 Hz frequency range, with two reference electrodes placed on the earlobes. EMG noise received during EEG measurements was not filtered away but was observed to be minimal. A dataset was first created for the actions performed, then categorized using mean absolute error (MAE), a statistical error-deviation analysis, and finally classified with both LSTM and GRU neural networks by relating FFT amplitudes to the actions. On average, the LSTM network had a classification accuracy of 78.6%, and the GRU network had a classification accuracy of 81.8%.
AB - This research involves developing a drone control system that functions by relating EEG and EMG signals from the forehead to different facial movements using recurrent neural networks (RNNs) such as long short-term memory (LSTM) and gated recurrent unit (GRU) networks. Because current drone control methods are largely limited to handheld devices, operators are actively engaged while flying and cannot perform any passive control. Passive control of drones would prove advantageous in various applications, as drone operators could focus on additional tasks. The advantages of the chosen methods and those of some alternative system designs are discussed. For this research, EEG signals were acquired at three frontal cortex locations (Fp1, Fpz, and Fp2) using electrodes from an OpenBCI headband and observed for patterns in Fast Fourier Transform (FFT) frequency-amplitude distributions. Five different facial expressions were repeated while recording EEG signals in the 0-60 Hz frequency range, with two reference electrodes placed on the earlobes. EMG noise received during EEG measurements was not filtered away but was observed to be minimal. A dataset was first created for the actions performed, then categorized using mean absolute error (MAE), a statistical error-deviation analysis, and finally classified with both LSTM and GRU neural networks by relating FFT amplitudes to the actions. On average, the LSTM network had a classification accuracy of 78.6%, and the GRU network had a classification accuracy of 81.8%.
KW - Drone Control
KW - EEG
KW - GRU
KW - LSTM
KW - Machine Learning
KW - RNN
UR - http://www.scopus.com/inward/record.url?scp=85179890020&partnerID=8YFLogxK
U2 - 10.1109/SoutheastCon51012.2023.10115185
DO - 10.1109/SoutheastCon51012.2023.10115185
M3 - Conference article
AN - SCOPUS:85179890020
SN - 1091-0050
JO - Conference Proceedings - IEEE SOUTHEASTCON
JF - Conference Proceedings - IEEE SOUTHEASTCON
T2 - 2023 IEEE SoutheastCon, SoutheastCon 2023
Y2 - 1 April 2023 through 16 April 2023
ER -