Unencumbered interaction in display environments with extended working volume
Ulrich Hafner, Andreas Simon, Mario Doulis
3 May 2000
Abstract
This paper gives an overview of wireless input systems and corresponding interaction techniques. We investigate the use of wireless input systems for immersive interaction in a large-scale virtual environment displayed on a 6.4 m × 2 m stereoscopic projection system. The system is used to present a complete car body at full scale, allowing users to walk up to 6 m in front of the virtual object. The working volume needed for immersive interaction in this scenario is much larger than that typically realized with an HMD or a CAVE, making the use of cable-bound devices problematic. Interactions realized in this environment include:
- Accurate head tracking to allow high-quality, undistorted stereoscopic rendering at natural scale.
- Head tracking for navigation as intuitive interaction by walking around.
- Positioning the car on the ground.
- Menu selection for different preselected models, assemblies, or environments.
- Control of the virtual lighting situation.
For head tracking, a cluster of commercially available optical trackers with a single passive reflective target is used. For manual interaction, wireless button devices with inertial sensors for acceleration and spin have been built. The application of these devices and the combination of input channels in different interactions is presented.
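The abstract mentions combining input channels: absolute but low-rate optical tracking and fast but drifting inertial (gyroscope) measurements. One common way such channels are fused is a complementary filter; the sketch below is a hypothetical illustration of that general idea, not the paper's actual method, and all names, rates, and bias values are invented for the example.

```python
# Hypothetical sketch: fusing an absolute optical yaw measurement with a
# high-rate gyro rate via a complementary filter. Not from the paper;
# rates, bias, and the blend factor `alpha` are illustrative assumptions.

def fuse_yaw(optical_yaw, gyro_rate, dt, prev_yaw, alpha=0.98):
    """Blend dead-reckoned gyro yaw (fast, drifts) with optical yaw (absolute)."""
    integrated = prev_yaw + gyro_rate * dt  # gyro integration, accumulates bias
    return alpha * integrated + (1.0 - alpha) * optical_yaw

# Simulation: true rotation of 10 deg/s, gyro with a 0.5 deg/s bias.
# For simplicity the optical measurement is assumed available every step.
dt = 0.01          # 100 Hz update
true_yaw = 0.0
yaw_est = 0.0
for _ in range(1000):              # 10 seconds
    true_yaw += 10.0 * dt
    gyro_rate = 10.0 + 0.5         # biased rate measurement
    yaw_est = fuse_yaw(true_yaw, gyro_rate, dt, yaw_est)

# Pure integration of the biased gyro would drift by 5 degrees over 10 s;
# the optical correction keeps the fused estimate close to the true yaw.
print(true_yaw, yaw_est)
```

With the optical correction, the gyro bias contributes only a small bounded offset instead of growing without limit, which is why such hybrid setups can stay wireless and still avoid drift.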
© (2000) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Ulrich Hafner, Andreas Simon, and Mario Doulis "Unencumbered interaction in display environments with extended working volume", Proc. SPIE 3957, Stereoscopic Displays and Virtual Reality Systems VII, (3 May 2000); https://doi.org/10.1117/12.384475
CITATIONS: Cited by 1 scholarly publication.
KEYWORDS: Virtual reality, Sensors, Human-machine interfaces, Optical tracking, Head, Gyroscopes, Target recognition