Stereoscopic augmented reality (AR) displays with a fixed focus plane suffer from visual discomfort due to the vergence-accommodation conflict (VAC). In this study, we propose a biocular dual-focal-plane AR system. Two separate liquid crystal displays (LCDs) are placed at slightly different distances from a Fresnel relay lens, such that the virtual images of the LCDs appear at 25 cm and 50 cm from the user. Both LCDs are viewed in full by both eyes, so the rendered images are not per-eye parallax images. While the system is limited to two depths, it provides correct focus cues and natural blur effects at two distinct depths. This allows the user to distinguish virtual information even when virtual objects overlap and partially occlude each other along the axial direction. The displays are driven by a single computation unit, and the objects in the virtual scene are distributed over the LCDs according to their depths. The field of view is 60 × 36 degrees and the eye-box is larger than 100 mm, which is large enough for comfortable two-eye viewing.
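The depth-based distribution of scene objects over the two LCDs can be sketched as a nearest-plane assignment. This is a minimal illustration, not the authors' actual rendering pipeline; the function name and the per-object depth input are assumptions, while the 25 cm and 50 cm plane distances come from the described system.

```python
# Hypothetical sketch: assign each virtual object to the focal plane
# nearest to its intended depth (25 cm and 50 cm, as in the system).
FOCAL_PLANES_CM = (25.0, 50.0)

def assign_to_plane(object_depth_cm):
    """Return the index of the focal plane closest to the object depth."""
    return min(range(len(FOCAL_PLANES_CM)),
               key=lambda i: abs(FOCAL_PLANES_CM[i] - object_depth_cm))
```

An object at 30 cm would be rendered on the 25 cm LCD, while one at 45 cm would go to the 50 cm LCD; objects near the midpoint (37.5 cm) fall to whichever plane is marginally closer.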
We propose a method for computing realistic computer-generated holograms (CGHs) of three-dimensional (3D) objects, where we benefit from well-established graphics processing units (GPUs) and computer graphics techniques to handle occlusion, shading, and parallax effects. The graphics renderer provides a 2D perspective image including occlusion and shading effects. We also extract the depth map data of the scene. The intensity values and 3D positions of object points are extracted by combining the rendered intensity image and the depth map (Z-buffer) image. We divide the depth range into several planes and quantize the depth value of each 3D image point to the nearest plane. In the CGH computation part, we perform the appropriate Fresnel transformations of these planar objects and sum them to create the hologram corresponding to the particular viewpoint. We then repeat the entire procedure for all viewpoints to cover the hologram area. The experimental results show that the technique is capable of producing high-quality reconstructions at high speed.
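The plane-quantization and Fresnel-summation steps described above can be sketched in NumPy as follows. This is an illustrative sketch under assumed conventions (transfer-function Fresnel propagation, a random initial phase per slice, and hypothetical function names and parameter defaults), not the paper's GPU implementation.

```python
import numpy as np

def fresnel_transfer(field, wavelength, dz, dx):
    """Propagate a complex field by distance dz using the Fresnel
    transfer function applied in the spatial-frequency domain."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    H = np.exp(-1j * np.pi * wavelength * dz * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

def plane_based_cgh(intensity, depth, plane_depths,
                    wavelength=532e-9, dx=8e-6):
    """Quantize each rendered pixel to the nearest depth plane,
    Fresnel-propagate each planar slice to the hologram plane,
    and sum the complex fields."""
    hologram = np.zeros(intensity.shape, dtype=complex)
    # index of the nearest quantization plane for every pixel
    idx = np.argmin(np.abs(depth[..., None] - plane_depths), axis=-1)
    for k, z in enumerate(plane_depths):
        slice_amp = np.sqrt(intensity) * (idx == k)
        if slice_amp.any():
            # random initial phase spreads light over the hologram area
            phase = np.exp(1j * 2 * np.pi * np.random.rand(*slice_amp.shape))
            hologram += fresnel_transfer(slice_amp * phase, wavelength, z, dx)
    return hologram
```

Repeating this computation for each viewpoint, with the rendered intensity and Z-buffer of that viewpoint, and tiling the results would cover the full hologram area as described.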
In this talk, we present the various types of 3D displays, head-mounted projection displays and wearable displays developed in our group using MEMS scanners, compact RGB laser light sources, and spatial light modulators.
In near-to-eye displays based on scanning laser projectors, retro-reflectors appear to be convenient image relay components, since they can ideally be placed at any location on the scanned beam path. In the case of practical retro-reflectors, however, such as corner cube retro-reflectors (CCRs), the relayed image suffers a loss in quality and resolution due to the positional shift in the retro-reflected rays and diffraction effects. We perform a wave optics simulation to analyze the image relay performance of a CCR. Our model assumes that the scanned spot of the projector is imaged by the CCR into an array of spots, which superpose and interfere to yield the effective scan spot seen by an eye looking at the CCR. The results indicate that the CCR produces a significantly broadened spot size. Experimental results verify the simulation model in terms of achievable resolution and image quality.
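The core of the described model, superposing laterally shifted replicas of the scanned spot to obtain the effective spot intensity, can be sketched as below. This is a simplified illustration under stated assumptions (coherent, equal-amplitude replicas with no per-facet phase, shifts on a pixel grid); the function name and parameters are hypothetical, not the paper's simulation code.

```python
import numpy as np

def effective_spot(spot_field, shifts, dx):
    """Coherently superpose laterally shifted copies of a scanned-spot
    field (modeling the CCR's replica spots) and return the intensity.

    spot_field : 2D complex or real array, sampled on a grid of pitch dx
    shifts     : list of (sx, sy) lateral shifts in meters
    """
    total = np.zeros(spot_field.shape, dtype=complex)
    for (sx, sy) in shifts:
        # shift each replica onto the grid (nearest-pixel approximation)
        total += np.roll(np.roll(spot_field, int(round(sx / dx)), axis=1),
                         int(round(sy / dx)), axis=0)
    return np.abs(total)**2
```

Comparing the width of the resulting intensity against that of a single unshifted spot reproduces, qualitatively, the broadening the simulation and experiments report.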