Abstract
This research involves developing a drone control system that relates EEG and EMG signals from the forehead to different facial movements using recurrent neural networks (RNNs) such as long short-term memory (LSTM) and gated recurrent unit (GRU) networks. Because current drone control methods are largely limited to handheld devices, operators are actively engaged while flying and cannot perform any passive control. Passive control of drones would prove advantageous in various applications, as drone operators could focus on additional tasks. The advantages of the chosen methods and of some alternative system designs are discussed. For this research, EEG signals were acquired at three frontal cortex locations (Fp1, Fpz, and Fp2) using electrodes from an OpenBCI headband and observed for patterns in Fast Fourier Transform (FFT) frequency-amplitude distributions. Five different facial expressions were repeated while recording EEG signals in the 0–60 Hz range, with two reference electrodes placed on the earlobes. EMG noise received during the EEG measurements was not filtered out but was observed to be minimal. A dataset was first created for the actions performed, then categorized using mean absolute error (MAE) and a statistical error deviation analysis, and finally classified with both LSTM and GRU neural networks by relating FFT amplitudes to the actions. On average, the LSTM network achieved a classification accuracy of 78.6%, and the GRU network achieved a classification accuracy of 81.8%.
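The processing pipeline described above (windowed forehead EEG → FFT amplitude features → recurrent classifier) can be sketched as follows. This is an illustrative example only, not the authors' implementation: the sampling rate, window length, feature layout, and layer sizes are assumptions, and PyTorch is used simply as a convenient stand-in for the RNN.

```python
# Illustrative sketch (not the paper's code): convert a window of 3-channel
# forehead EEG into 0-60 Hz FFT amplitude features and classify it with a GRU.
# Sampling rate, window length, and model sizes are assumed for demonstration.
import numpy as np
import torch
import torch.nn as nn

FS = 250                  # assumed OpenBCI sampling rate (Hz)
WINDOW = FS               # assumed 1-second analysis window
N_CHANNELS = 3            # Fp1, Fpz, Fp2
N_CLASSES = 5             # five facial expressions

def fft_amplitudes(window, fs=FS, fmax=60.0):
    """Per-channel FFT amplitude spectrum, limited to 0-60 Hz."""
    freqs = np.fft.rfftfreq(window.shape[-1], d=1.0 / fs)
    amps = np.abs(np.fft.rfft(window, axis=-1))
    return amps[..., freqs <= fmax]           # shape: (channels, bins)

class GRUClassifier(nn.Module):
    def __init__(self, n_bins, hidden=64, n_classes=N_CLASSES):
        super().__init__()
        # Each "time step" fed to the GRU is one channel's spectrum (assumption).
        self.gru = nn.GRU(input_size=n_bins, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                     # x: (batch, channels, bins)
        _, h = self.gru(x)                    # h: (1, batch, hidden)
        return self.head(h.squeeze(0))        # logits: (batch, n_classes)

# Usage example with random data standing in for recorded EEG.
eeg_window = np.random.randn(N_CHANNELS, WINDOW)
features = fft_amplitudes(eeg_window)                                  # (3, bins)
model = GRUClassifier(n_bins=features.shape[-1])
logits = model(torch.tensor(features, dtype=torch.float32).unsqueeze(0))
print(logits.shape)                                                    # torch.Size([1, 5])
```

The LSTM variant compared in the paper can be approximated by swapping `nn.GRU` for `nn.LSTM`; note that `nn.LSTM` returns both a hidden state and a cell state, so the forward pass would unpack `_, (h, c) = self.lstm(x)` instead.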
Original language | English |
---|---|
Journal | Conference Proceedings - IEEE SOUTHEASTCON |
DOIs | |
State | Published - 2023 |
Event | 2023 IEEE SoutheastCon (SoutheastCon 2023), Orlando, United States. Duration: Apr 1 2023 → Apr 16 2023 |
Scopus Subject Areas
- Computer Networks and Communications
- Software
- Electrical and Electronic Engineering
- Control and Systems Engineering
- Signal Processing
Keywords
- Drone Control
- EEG
- GRU
- LSTM
- Machine Learning
- RNN