Paper
17 May 2013
Visual and tactile interfaces for bi-directional human robot communication
Daniel Barber, Stephanie Lackey, Lauren Reinerman-Jones, Irwin Hudson
Abstract
Seamless integration of unmanned systems and Soldiers in the operational environment requires robust communication capabilities. Multi-Modal Communication (MMC) facilitates achieving this goal because redundancy across auditory, visual, and tactile modalities provides levels of communication superior to single-mode interaction. Visual signaling using arm and hand gestures is a natural method of communication between people. Visual signals standardized within the U.S. Army Field Manual and in use by Soldiers provide a foundation for developing gestures for human-to-robot communication. Emerging technologies using Inertial Measurement Units (IMUs) enable classification of arm and hand gestures for communication with a robot without the line-of-sight required by computer vision techniques. These devices improve the robustness of gesture interpretation in noisy environments and are capable of classifying signals relevant to operational tasks. Closing the communication loop between Soldiers and robots requires that robots be able to return equivalent messages. Existing visual signals from robots to humans typically require highly anthropomorphic features not present on military vehicles. Tactile displays tap into an otherwise unused modality for robot-to-human communication. Typically used for hands-free navigation and cueing, existing tactile display technologies are applied here to deliver equivalents of visual signals from the U.S. Army Field Manual. This paper describes ongoing research to collaboratively develop tactile communication methods with Soldiers and to measure the classification accuracy of visual signal interfaces, and it provides an integration example including two robotic platforms.
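To illustrate the kind of pipeline the abstract describes, the following is a minimal sketch (not the authors' implementation) of classifying an arm-and-hand signal from IMU orientation samples and mapping the resulting message onto a belt-mounted tactile display. The gesture names, pose thresholds, tactor layout, and pulse patterns are illustrative assumptions only, not the classifications or tactile patterns used in the study.

from dataclasses import dataclass
from typing import List

@dataclass
class ImuSample:
    """One fused-orientation sample from a forearm-mounted IMU (degrees)."""
    pitch: float
    roll: float

def classify_gesture(samples: List[ImuSample]) -> str:
    """Toy template matching on average forearm pose (assumed thresholds)."""
    avg_pitch = sum(s.pitch for s in samples) / len(samples)
    avg_roll = sum(s.roll for s in samples) / len(samples)
    if avg_pitch > 60:        # arm raised overhead -> "HALT" (assumed mapping)
        return "HALT"
    if abs(avg_roll) > 45:    # arm swept outward -> "MOVE_OUT" (assumed mapping)
        return "MOVE_OUT"
    return "UNKNOWN"

# Hypothetical tactile equivalents: each message becomes a sequence of
# (tactor_index, duration_ms) pulses on an eight-tactor belt display.
TACTILE_PATTERNS = {
    "HALT": [(0, 500), (4, 500)],                # front then back tactor
    "MOVE_OUT": [(0, 200), (1, 200), (2, 200)],  # pulse sweeps around the belt
}

def render_tactile(message: str) -> None:
    """Stand-in for driving real tactors; here the pulse schedule is printed."""
    for tactor, duration_ms in TACTILE_PATTERNS.get(message, []):
        print(f"pulse tactor {tactor} for {duration_ms} ms")

if __name__ == "__main__":
    # Simulated IMU stream for an arm held overhead (HALT-like pose).
    stream = [ImuSample(pitch=70.0 + i, roll=5.0) for i in range(10)]
    message = classify_gesture(stream)
    print("classified:", message)
    render_tactile(message)

In practice the classification step would use a trained model over full motion trajectories rather than pose thresholds, but the message-to-tactile-pattern mapping shown here reflects the paper's idea of delivering equivalents of Field Manual visual signals through a tactile display.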
© (2013) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Daniel Barber, Stephanie Lackey, Lauren Reinerman-Jones, and Irwin Hudson "Visual and tactile interfaces for bi-directional human robot communication", Proc. SPIE 8741, Unmanned Systems Technology XV, 87410U (17 May 2013); https://doi.org/10.1117/12.2015956
CITATIONS
Cited by 12 scholarly publications.
KEYWORDS
Visualization
Gyroscopes
Gesture recognition
Robotics
Sensors
Telecommunications
Human-machine interfaces
RELATED CONTENT
Natural interaction for unmanned systems, Proceedings of SPIE (May 22 2015)
Modular telerobot control system for accident response, Proceedings of SPIE (August 26 1999)
Integration of robotic resources into FORCEnet, Proceedings of SPIE (May 12 2006)
Shared control in bilateral telerobotic systems, Proceedings of SPIE (December 21 1995)
Hybrid systems for telepresence: experimental platform design, Proceedings of SPIE (February 05 2002)
Multi-robot operator control unit, Proceedings of SPIE (May 12 2006)
State of the art in nuclear telerobotics focus on..., Proceedings of SPIE (December 21 1995)