Paper
Cooperative Integration Of Vision And Touch
1 March 1990
Peter K. Allen
Proceedings Volume 1198, Sensor Fusion II: Human and Machine Strategies; (1990) https://doi.org/10.1117/12.969990
Event: 1989 Symposium on Visual Communications, Image Processing, and Intelligent Robotics Systems, 1989, Philadelphia, PA, United States
Abstract
Vision and touch have proved to be powerful sensing modalities in humans. In order to build robots capable of complex behavior, analogues of human vision and taction need to be created. In addition, strategies for intelligent use of these sensors in tasks such as object recognition need to be developed. Two overriding principles dictate a good strategy for cooperative use of these sensors: 1) the sensors should complement each other in the kind and quality of data they report, and 2) each sensor system should be used in the most robust manner possible. We demonstrate this with a contour following algorithm that recovers the shape of surfaces of revolution from sparse tactile sensor data. The absolute location in depth of an object can be found more accurately through touch than through vision, but the global properties of where to actively explore with the hand are better found through vision.
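To make the cooperative strategy concrete, the following is a minimal sketch, not the paper's implementation: it assumes vision supplies a coarse estimate of the object's axis of revolution, and that sparse tactile contacts gathered by contour following are then projected onto that axis to recover a radius profile. All function and variable names (e.g. radius_profile) are illustrative.

```python
# Hypothetical sketch of the vision-guided tactile strategy described in the
# abstract: vision gives a coarse axis of revolution; sparse tactile contacts
# are projected onto it to recover the surface's radius profile r(z).
import numpy as np

def radius_profile(contacts, axis_point, axis_dir):
    """Project sparse 3-D contact points onto an estimated axis of revolution.

    contacts   : (N, 3) array of tactile contact positions
    axis_point : (3,) a point on the estimated axis (e.g. from vision)
    axis_dir   : (3,) direction of the estimated axis
    Returns height-along-axis and radius arrays, sorted by height.
    """
    axis_dir = axis_dir / np.linalg.norm(axis_dir)
    rel = contacts - axis_point                # vectors from axis point to contacts
    z = rel @ axis_dir                         # height of each contact along the axis
    radial = rel - np.outer(z, axis_dir)       # component perpendicular to the axis
    r = np.linalg.norm(radial, axis=1)         # distance from axis = local radius
    order = np.argsort(z)
    return z[order], r[order]

if __name__ == "__main__":
    # Synthetic stand-in for sparse tactile data: contacts on a cylinder of
    # radius 2 about the z-axis, traced by a contour-following motion.
    theta = np.linspace(0, np.pi, 8)
    heights = np.linspace(0.0, 1.0, 8)
    pts = np.stack([2 * np.cos(theta), 2 * np.sin(theta), heights], axis=1)
    z, r = radius_profile(pts, np.zeros(3), np.array([0.0, 0.0, 1.0]))
    print(np.round(r, 3))  # ~2.0 everywhere: the recovered radius profile
```

In this sketch, the accuracy of the recovered radii comes from the tactile contacts, while the vision-derived axis only needs to be roughly correct, mirroring the division of labor the abstract describes.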
© 1990 Society of Photo-Optical Instrumentation Engineers (SPIE).
Peter K. Allen "Cooperative Integration Of Vision And Touch", Proc. SPIE 1198, Sensor Fusion II: Human and Machine Strategies, (1 March 1990); https://doi.org/10.1117/12.969990
KEYWORDS
Sensors, Image processing, Robotic systems, Robotics, Sensor fusion, Cameras, Visualization