Using arm and hand gestures to command robots during stealth operations
10 May 2012
Command of support robots by the warfighter requires intuitive interfaces that quickly communicate high degree-of-freedom
(DOF) information while leaving the hands unencumbered. Stealth operations rule out voice commands and
vision-based gesture interpretation, as such operations often take place silently at night or in other low-visibility
conditions. To use bio-signal inputs to set navigation and manipulation goals for the robot (say, simply by
pointing), we developed a system based on an electromyography (EMG) "BioSleeve", a high-density sensor array for
robust, practical signal collection from forearm muscles. The EMG sensor array data is fused with inertial measurement
unit (IMU) data. This paper describes the BioSleeve system and presents initial results of decoding robot commands
from the EMG and IMU data using a BioSleeve prototype with up to sixteen bipolar surface EMG sensors. The
BioSleeve is demonstrated on the recognition of static hand positions (e.g., palm facing front, fingers upward) and of
dynamic gestures (e.g., a hand wave). In preliminary experiments, over 90% correct recognition was achieved on five static
and nine dynamic gestures. We use the BioSleeve to control a team of five LANdroid robots in individual and
group/squad behaviors. We define a gesture composition mechanism that allows the specification of complex robot
behaviors with only a small vocabulary of gestures/commands, and we illustrate it with a set of complex orders.
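To illustrate the idea of gesture composition described above, here is a minimal sketch of how a small gesture vocabulary can be composed pairwise into a larger set of squad-level commands. All gesture names, rules, and command strings below are hypothetical illustrations, not the paper's actual vocabulary or grammar.

```python
# Hypothetical gesture-composition sketch: adjacent pairs of recognized
# gestures are matched against composition rules, so a small vocabulary
# yields a combinatorially larger set of robot commands.

GESTURES = {"point", "halt", "wave", "fist", "two_fingers"}

# Composition rules: (selector gesture, action gesture) -> robot command.
RULES = {
    ("two_fingers", "point"): "robots 1-2: move to pointed location",
    ("fist", "halt"): "all robots: stop and hold position",
    ("wave", "point"): "squad: follow leader to pointed location",
}

def compose(sequence):
    """Translate a sequence of recognized gestures into robot commands.

    Each adjacent pair of gestures in the sequence is looked up in RULES;
    unmatched pairs are ignored, unknown gestures raise an error.
    """
    commands = []
    for first, second in zip(sequence, sequence[1:]):
        if first not in GESTURES or second not in GESTURES:
            raise ValueError(f"unrecognized gesture: {first!r}/{second!r}")
        cmd = RULES.get((first, second))
        if cmd:
            commands.append(cmd)
    return commands
```

In this toy scheme, a vocabulary of N gestures can address up to N² composed commands, which is the motivation for composition: complex behaviors from a small, reliably decodable gesture set.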
Adrian Stoica, Chris Assad, Michael Wolf, Ki Sung You, Marco Pavone, Terry Huntsberger, Yumi Iwashita, "Using arm and hand gestures to command robots during stealth operations," Proc. SPIE 8407, Multisensor, Multisource Information Fusion: Architectures, Algorithms, and Applications 2012, 84070G (10 May 2012); https://doi.org/10.1117/12.923690