TY - GEN
T1 - Gesture Recognition Using an EEG Sensor and an ANN Classifier for Control of a Robotic Manipulator
AU - Alba-Flores, Rocio
AU - Rios, Fernando
AU - Triplett, Stephanie
AU - Casas, Antonio
N1 - Publisher Copyright:
© 2019, Springer Nature Switzerland AG.
PY - 2019
Y1 - 2019
N2 - In recent years, electroencephalography (EEG) has gained popularity in the field of brain-computer interfaces (BCI). Current applications of BCI include the control of prosthetics and robotic systems. In this project, the goal is to acquire and record EEG signals generated by human subjects performing specific facial gestures, and to use them to control a robotic hand. Six facial gestures were selected for this project: smile, raise eyebrows, look right, look left, hard blink, and blink. Once the signals were collected, a classification system based on an artificial neural network (ANN) was designed. The classification system was able to recognize and differentiate each gesture with an accuracy of 98% for signals from a single person, and 75% for signals from multiple persons. The EEG signals were acquired using an Emotiv EPOC headset that has 14 sensors. This headset was selected mainly because of its portability, its affordable cost compared to similar products on the market, and its ease of placement on the subject’s head. The ultimate purpose of this research is to use the classification system output to send control signals to a robotic hand that has been designed and built in our research lab. In this paper, the data collection, data conditioning, design, testing, and results of the classification system are provided in detail.
AB - In recent years, electroencephalography (EEG) has gained popularity in the field of brain-computer interfaces (BCI). Current applications of BCI include the control of prosthetics and robotic systems. In this project, the goal is to acquire and record EEG signals generated by human subjects performing specific facial gestures, and to use them to control a robotic hand. Six facial gestures were selected for this project: smile, raise eyebrows, look right, look left, hard blink, and blink. Once the signals were collected, a classification system based on an artificial neural network (ANN) was designed. The classification system was able to recognize and differentiate each gesture with an accuracy of 98% for signals from a single person, and 75% for signals from multiple persons. The EEG signals were acquired using an Emotiv EPOC headset that has 14 sensors. This headset was selected mainly because of its portability, its affordable cost compared to similar products on the market, and its ease of placement on the subject’s head. The ultimate purpose of this research is to use the classification system output to send control signals to a robotic hand that has been designed and built in our research lab. In this paper, the data collection, data conditioning, design, testing, and results of the classification system are provided in detail.
KW - Brain computer interface (BCI)
KW - Electroencephalography (EEG)
KW - Face gesture detection
KW - Neural network classifier
KW - Robotic arm
UR - http://www.scopus.com/inward/record.url?scp=85069485761&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-22868-2_81
DO - 10.1007/978-3-030-22868-2_81
M3 - Conference article
AN - SCOPUS:85069485761
SN - 9783030228675
T3 - Advances in Intelligent Systems and Computing
SP - 1181
EP - 1186
BT - Intelligent Computing - Proceedings of the 2019 Computing Conference
A2 - Arai, Kohei
A2 - Bhatia, Rahul
A2 - Kapoor, Supriya
PB - Springer Verlag
T2 - Computing Conference, 2019
Y2 - 16 July 2019 through 17 July 2019
ER -