Hand Prosthesis Control using Electromyographic Signal Trained Neural Network for Live Gesture Classification

Zackary Minshew, Dr Rocio Alba-Flores

Research output: Contribution to book or proceeding › Conference article › peer-review

Abstract

This work addresses the use of a user-friendly app that combines the surface electromyography (sEMG) sensors of the Myo Armband by Thalmic Labs with a pattern recognition neural network to control a robotic hand. The Myo Armband consists of eight electrodes placed on the forearm of a subject who performs wrist and finger motions. The ANN was trained to recognize five hand gestures, namely Fist, Rest, Spread, Wave In, and Wave Out. The output of the ANN is then used to control a dexterous 6-DOF robotic hand that was 3D printed in ABS plastic. The robotic hand was able to perform in real time the same hand gestures that the subject performed, and the pattern recognition system classified the hand motions with an accuracy of 81.4%. Presented in this document is a simple project design that demonstrates how current affordable technologies could be used in applications such as human-robot interfaces, rehabilitation, and solutions for amputees.
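The pipeline described above (8-channel sEMG window → feature extraction → neural network → one of five gestures) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the mean-absolute-value (MAV) feature, the hidden-layer size, and the random (untrained) weights are all assumptions made for demonstration.

```python
import numpy as np

GESTURES = ["Fist", "Rest", "Spread", "Wave In", "Wave Out"]

def mav_features(window):
    """Mean absolute value per channel, a common sEMG feature (assumed here).
    window: (n_samples, 8) array of raw sEMG from the 8 Myo electrodes."""
    return np.abs(window).mean(axis=0)  # -> shape (8,)

rng = np.random.default_rng(0)
# Hypothetical weights for a one-hidden-layer network; in practice these
# would be learned from labeled gesture recordings.
W1 = rng.normal(scale=0.1, size=(8, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.1, size=(16, 5)); b2 = np.zeros(5)

def classify(window):
    """Forward pass: sEMG window -> gesture label and class probabilities."""
    x = mav_features(window)
    h = np.tanh(x @ W1 + b1)                         # hidden layer
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max()); p /= p.sum()  # softmax over 5 gestures
    return GESTURES[int(p.argmax())], p

# One synthetic 200-sample, 8-channel sEMG window standing in for live data.
window = rng.normal(size=(200, 8))
gesture, probs = classify(window)
print(gesture, probs.round(3))
```

In a live system, the predicted gesture index would then be mapped to a preset finger/wrist pose command for the 6-DOF robotic hand.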

Original language: English
Title of host publication: 2019 IEEE SoutheastCon, SoutheastCon 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728101378
DOIs
State: Published - Apr 2019
Event: 2019 IEEE SoutheastCon, SoutheastCon 2019 - Huntsville, United States
Duration: Apr 11 2019 - Apr 14 2019

Publication series

Name: Conference Proceedings - IEEE SOUTHEASTCON
Volume: 2019-April
ISSN (Print): 1091-0050
ISSN (Electronic): 1558-058X

Conference

Conference: 2019 IEEE SoutheastCon, SoutheastCon 2019
Country/Territory: United States
City: Huntsville
Period: 04/11/19 - 04/14/19

Keywords

  • Artificial Neural Network
  • Myo armband
  • Pattern Recognition
  • prosthetic control
  • sEMG
