TY - GEN
T1 - Mechanical design and control calibration for an interactive animatronic system
AU - Burns, Brian
AU - Samanta, Biswanath
N1 - Publisher Copyright:
Copyright © 2015 by ASME.
PY - 2015
Y1 - 2015
N2 - Animatronic figures provide key show effects in the entertainment and theme park industry by simulating life-like animations and sounds. There is a need for interactive, autonomous animatronic systems to create engaging and compelling experiences for the guests. The animatronic figures must identify the guests and recognize their status in dynamic interactions for enhanced acceptance and effectiveness as socially interactive agents, in the general framework of human-robot interactions. The design and implementation of an interactive, autonomous animatronic system in the form of a tabletop dragon and the comparison of guest responses in its passive and interactive modes are presented in this work. The purpose of this research is to create a platform that may be used to validate autonomous, interactive behaviors in animatronics, utilizing both quantitative and qualitative analysis methods of guest response. The dragon's capabilities include a four-degrees-of-freedom head, moving wings, tail, jaw, blinking eyes and sound effects. Human identification, using a depth camera (Carmine from PrimeSense), an open-source middleware (NITE from OpenNI), Java-based Processing and an Arduino microcontroller, has been implemented in the system in order to track a guest or guests within the field of view of the camera. The details of design and fabrication of the dragon model, algorithm development for interactive autonomous behavior using a vision system, the experimental setup and implementation results under different conditions are presented.
AB - Animatronic figures provide key show effects in the entertainment and theme park industry by simulating life-like animations and sounds. There is a need for interactive, autonomous animatronic systems to create engaging and compelling experiences for the guests. The animatronic figures must identify the guests and recognize their status in dynamic interactions for enhanced acceptance and effectiveness as socially interactive agents, in the general framework of human-robot interactions. The design and implementation of an interactive, autonomous animatronic system in the form of a tabletop dragon and the comparison of guest responses in its passive and interactive modes are presented in this work. The purpose of this research is to create a platform that may be used to validate autonomous, interactive behaviors in animatronics, utilizing both quantitative and qualitative analysis methods of guest response. The dragon's capabilities include a four-degrees-of-freedom head, moving wings, tail, jaw, blinking eyes and sound effects. Human identification, using a depth camera (Carmine from PrimeSense), an open-source middleware (NITE from OpenNI), Java-based Processing and an Arduino microcontroller, has been implemented in the system in order to track a guest or guests within the field of view of the camera. The details of design and fabrication of the dragon model, algorithm development for interactive autonomous behavior using a vision system, the experimental setup and implementation results under different conditions are presented.
UR - http://www.scopus.com/inward/record.url?scp=84982899200&partnerID=8YFLogxK
U2 - 10.1115/IMECE2015-52477
DO - 10.1115/IMECE2015-52477
M3 - Conference article
AN - SCOPUS:84982899200
T3 - ASME International Mechanical Engineering Congress and Exposition, Proceedings (IMECE)
BT - Dynamics, Vibration, and Control
PB - American Society of Mechanical Engineers (ASME)
T2 - ASME 2015 International Mechanical Engineering Congress and Exposition, IMECE 2015
Y2 - 13 November 2015 through 19 November 2015
ER -