Paper
5 March 2007
Tangible mixed reality desktop for digital media management
Stefan Winkler, Hang Yu, ZhiYing Zhou
Proceedings Volume 6490, Stereoscopic Displays and Virtual Reality Systems XIV; 64901S (2007) https://doi.org/10.1117/12.703906
Event: Electronic Imaging 2007, San Jose, CA, United States
Abstract
This paper presents a tangible mixed reality desktop that supports gesture-oriented interactions in 3D space. The system is based on computer vision techniques for hand and finger detection, without requiring any devices to be attached to the user. The system consists of a pair of stereo cameras pointed at a planar surface that serves as the workbench. Using stereo triangulation, the 3D locations and directions of the user's fingers are detected and tracked in the space on and above the surface. Based on our 3D finger tracking technique, we design a few simple multi-finger gestural interactions for digital media management. The system provides a convenient and user-friendly way of manipulating virtual objects in 3D space and supports seamless interaction with physical objects.
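The core step described in the abstract is recovering each fingertip's 3D position from a calibrated stereo camera pair by triangulation. The sketch below is illustrative only, not the paper's implementation: it assumes OpenCV's triangulatePoints, hypothetical calibration values (shared intrinsics K, a 10 cm baseline), and hypothetical fingertip detections in the two views.

```python
# Illustrative sketch only -- not the authors' implementation. It shows the
# kind of stereo triangulation the abstract describes, using OpenCV with
# hypothetical calibration values and hypothetical fingertip detections.
import numpy as np
import cv2

# Hypothetical shared intrinsics for both cameras (focal length in pixels,
# principal point at the center of a 640x480 image).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# 3x4 projection matrices P = K [R | t]; the right camera is offset by a
# 10 cm baseline along x. In practice these would come from stereo
# calibration of the camera pair mounted above the desk surface.
P_left  = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_right = K @ np.hstack([np.eye(3), np.array([[-0.10], [0.0], [0.0]])])

# Hypothetical fingertip pixel coordinates detected in the left and right
# images (2xN arrays; here a single fingertip, N = 1).
tip_left  = np.array([[320.0], [240.0]])
tip_right = np.array([[160.0], [240.0]])

# Triangulate: returns 4xN homogeneous points; divide by the last row to
# obtain Euclidean 3D coordinates in the left-camera frame.
points_h = cv2.triangulatePoints(P_left, P_right, tip_left, tip_right)
fingertip_3d = (points_h[:3] / points_h[3]).ravel()
print("Estimated fingertip position (m):", fingertip_3d)
```

The abstract also mentions tracking finger directions; with the same machinery, triangulating a second point along the finger and taking the difference of the two 3D points would yield a direction vector, though the paper's actual method may differ.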
© (2007) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Stefan Winkler, Hang Yu, and ZhiYing Zhou "Tangible mixed reality desktop for digital media management", Proc. SPIE 6490, Stereoscopic Displays and Virtual Reality Systems XIV, 64901S (5 March 2007); https://doi.org/10.1117/12.703906
CITATIONS
Cited by 7 scholarly publications and 2 patents.
KEYWORDS
Cameras
Stereoscopic cameras
Imaging systems
Detection and tracking algorithms
Human-machine interfaces
Calibration
Image segmentation
