Head-mounted displays (HMDs) require precise measurement of the virtual image distance for user comfort, but this measurement is challenging because the distance varies dynamically. This paper addresses the difficulty with a prototype based on a variable-focus liquid lens and a calculation model for the virtual image distance. We built an experimental platform to validate the method and introduced an optimization algorithm that finds the focal length yielding maximum image sharpness. Results showed a distance error of about 5 cm, confirming that the method accurately measures virtual image distance in HMDs, with potential applications in virtual and augmented reality.
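The abstract only summarizes the sharpness-optimization and calculation steps. As a minimal, hedged sketch of the idea: sweep candidate focal lengths, keep the one that maximizes a focus measure, and convert it to a distance with a simplified Gaussian (thin-lens) relation. The variance-of-Laplacian metric, the `capture_image(f)` hook, and the fixed sensor distance are illustrative assumptions, not the paper's exact model.

```python
# Hypothetical sketch: find the liquid-lens focal length giving the sharpest
# image, then infer the virtual image distance from the thin-lens equation.
import numpy as np
import cv2


def sharpness(gray: np.ndarray) -> float:
    """Variance of the Laplacian as a simple focus measure (grayscale uint8 frame)."""
    return cv2.Laplacian(gray, cv2.CV_64F).var()


def best_focal_length(capture_image, focal_lengths_mm):
    """Sweep candidate focal lengths and keep the sharpest frame.
    capture_image(f) is an assumed hook that drives the liquid lens to f
    and returns a grayscale frame."""
    scores = [sharpness(capture_image(f)) for f in focal_lengths_mm]
    return focal_lengths_mm[int(np.argmax(scores))]


def virtual_image_distance_mm(f_mm: float, sensor_distance_mm: float) -> float:
    """Simplified Gaussian lens relation 1/f = 1/d_object + 1/d_image,
    solved for the object (virtual image) distance."""
    return 1.0 / (1.0 / f_mm - 1.0 / sensor_distance_mm)
```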
A large aperture in an optical system captures high-resolution images but yields a shallow depth of field. To overcome this trade-off, we propose a method for improving microscopy imaging systems by using a variable-focus liquid lens to achieve 3D focus scanning. Specifically, the liquid lens changed the focal length of the imaging system, and a sequence of 12 images was captured in different focal planes. The image scale was adjusted according to the change in focal length, and the image phase was corrected with a phase-only correction method. The in-focus pixels were then extracted using the Laplacian operator. Finally, an all-in-focus sharp image was generated and a depth map was obtained. To accelerate processing, the Fast Fourier Transform computation used during phase correction was optimized, and we additionally propose a parallelized version of the original processing pipeline.
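As a rough illustration of the fusion step only (the scale adjustment and FFT-based phase-only correction are assumed to have been applied already, and the focus measure here is a simple per-pixel Laplacian energy rather than the paper's exact operator), a minimal focus-stacking sketch might look like this:

```python
# Minimal focus-stacking sketch: per-pixel sharpness from the Laplacian selects
# the best-focused frame, giving an all-in-focus composite and a depth-index map.
import numpy as np
import cv2


def focus_stack(frames):
    """frames: list of grayscale images of identical size (already rescaled and
    phase-aligned). Returns (all_in_focus, depth_index_map)."""
    stack = np.stack(frames).astype(np.float64)             # (N, H, W)
    # Focus measure: absolute Laplacian response, lightly blurred so the
    # per-pixel selection is locally consistent.
    energy = np.stack([
        cv2.GaussianBlur(np.abs(cv2.Laplacian(f, cv2.CV_64F)), (9, 9), 0)
        for f in stack
    ])
    depth = np.argmax(energy, axis=0)                        # best-focused frame index
    all_in_focus = np.take_along_axis(stack, depth[None], axis=0)[0]
    return all_in_focus.astype(np.uint8), depth
```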
Dynamic projection mapping onto moving objects has attracted much attention in recent years. However, conventional approaches face several issues: the targets are limited to planar objects, the permissible target speed is low, and the depth of field is narrow. Based on high-speed liquid-lens optics, we present an adaptive three-dimensional projection display method that keeps the projected image in focus on the target. The system detects the location of the non-planar object and computes the correspondingly mapped projection content, so that a stable, "printed"-looking projection is maintained on a moving non-planar object.
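As a toy illustration of the focus-tracking idea only (the detection and content-mapping steps are not reproduced, and the panel distance is an assumed constant), the required focal length of the projection optics can be derived from the measured target depth via the thin-lens equation:

```python
# Illustrative sketch, not the paper's implementation: given the measured depth
# of the moving target, set the liquid-lens focal length from the thin-lens
# relation 1/f = 1/d_panel + 1/d_target so the projected image stays sharp.
def required_focal_length_mm(target_depth_mm: float, d_panel_mm: float = 30.0) -> float:
    return 1.0 / (1.0 / d_panel_mm + 1.0 / target_depth_mm)


# Example: a target 1.2 m away with an assumed 30 mm panel distance needs f ≈ 29.3 mm.
print(required_focal_length_mm(1200.0))
```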
It is challenging for conventional monocular-camera, single-light-source eye tracking methods to achieve high-speed eye tracking: human gaze consists of fast, small-amplitude eye movements, so eye tracking requires a high sampling frequency. In this work, an eye tracking method was proposed to overcome this limitation. A dual-ring infrared lighting source was designed to produce bright and dark pupils at high speed, with the rings triggered synchronously on the even and odd camera frames to capture bright-pupil and dark-pupil images. The pupillary corneal reflex was calculated from the center coordinates of the Purkinje spot and the pupil, and a mapping function was established between the pupillary corneal reflex and gaze points; the gaze coordinates were then computed from this mapping function. The detection time per frame was under five milliseconds, achieving high-speed tracking of human gaze.
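A hedged sketch of the described pipeline is given below: bright/dark pupil differencing, glint (Purkinje spot) detection, and a polynomial mapping from the pupillary corneal reflex (PCR) vector to gaze coordinates. The thresholds and the second-order polynomial form are illustrative assumptions, not the paper's exact method.

```python
# Sketch of a bright/dark-pupil gaze pipeline (assumed details, see lead-in).
import numpy as np
import cv2


def pupil_center(bright: np.ndarray, dark: np.ndarray):
    """Bright/dark pupil differencing: the pupil dominates the difference image."""
    diff = cv2.subtract(bright, dark)                         # uint8 grayscale frames
    _, mask = cv2.threshold(diff, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    m = cv2.moments(mask)
    return m["m10"] / m["m00"], m["m01"] / m["m00"]


def glint_center(dark: np.ndarray):
    """Treat the Purkinje spot as the brightest blob in the dark-pupil frame."""
    _, mask = cv2.threshold(dark, 240, 255, cv2.THRESH_BINARY)
    m = cv2.moments(mask)
    return m["m10"] / m["m00"], m["m01"] / m["m00"]


def fit_gaze_mapping(pcr_vectors: np.ndarray, gaze_points: np.ndarray) -> np.ndarray:
    """Least-squares fit of a 2nd-order polynomial mapping PCR -> screen (x, y).
    pcr_vectors: (pupil - glint) offsets from calibration; gaze_points: known targets."""
    x, y = pcr_vectors[:, 0], pcr_vectors[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, gaze_points, rcond=None)
    return coeffs                                             # shape (6, 2)


def gaze_from_pcr(pcr, coeffs: np.ndarray) -> np.ndarray:
    x, y = pcr
    return np.array([1.0, x, y, x * y, x**2, y**2]) @ coeffs  # (gaze_x, gaze_y)
```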