TY - GEN
T1 - Gesture recognition for control in human-robot interactions
AU - Reid, Chris
AU - Samanta, Biswanath
N1 - Publisher Copyright:
Copyright © 2014 by ASME.
PY - 2014
Y1 - 2014
N2 - In co-robotics applications, robots must be able to take inputs from human partners in different forms, including both static and sequential hand gestures, during dynamic interactions to be effective as socially assistive agents. This paper presents the development of a gesture recognition algorithm for robot control. The algorithm focuses on skin-color detection using monocular vision from a moving robot base, where the inherent instability negates the effectiveness of methods such as background subtraction. The algorithm is implemented in the open-source robotics software framework Robot Operating System (ROS). The video feed from the camera is converted into several color spaces, including RGB and YCbCr. Pixels appearing as skin in the raw video feed are randomly sampled, and their properties in each color space are recorded. An infinite cylinder is constructed around the best-fit line in each color space, and all points lying within both cylinders are accepted as skin tones. Gesture recognition features are extracted from the filtered image and can be used for planning the motion of the robot. The procedure is illustrated using the on-board camera of an unmanned aerial vehicle (UAV).
AB - In co-robotics applications, robots must be able to take inputs from human partners in different forms, including both static and sequential hand gestures, during dynamic interactions to be effective as socially assistive agents. This paper presents the development of a gesture recognition algorithm for robot control. The algorithm focuses on skin-color detection using monocular vision from a moving robot base, where the inherent instability negates the effectiveness of methods such as background subtraction. The algorithm is implemented in the open-source robotics software framework Robot Operating System (ROS). The video feed from the camera is converted into several color spaces, including RGB and YCbCr. Pixels appearing as skin in the raw video feed are randomly sampled, and their properties in each color space are recorded. An infinite cylinder is constructed around the best-fit line in each color space, and all points lying within both cylinders are accepted as skin tones. Gesture recognition features are extracted from the filtered image and can be used for planning the motion of the robot. The procedure is illustrated using the on-board camera of an unmanned aerial vehicle (UAV).
UR - http://www.scopus.com/inward/record.url?scp=84926429955&partnerID=8YFLogxK
U2 - 10.1115/IMECE2014-38504
DO - 10.1115/IMECE2014-38504
M3 - Conference article
AN - SCOPUS:84926429955
T3 - ASME International Mechanical Engineering Congress and Exposition, Proceedings (IMECE)
BT - Dynamics, Vibration, and Control
PB - American Society of Mechanical Engineers (ASME)
T2 - ASME 2014 International Mechanical Engineering Congress and Exposition, IMECE 2014
Y2 - 14 November 2014 through 20 November 2014
ER -