KEYWORDS: 3D displays, 3D image processing, Integral imaging, Imaging arrays, Displays, 3D image reconstruction, Tablets, Image processing, Image quality, Distortion
In this paper, we present a technique to generate an elemental image array matched to the display device for three-dimensional (3D) integral imaging. Experimental results show that our technique can accurately match different display formats and improve the displayed results.
In this paper, we overview a high-resolution three-dimensional (3D) holographic display using 2D images captured in an integral imaging system and a dense ray-resampling technique. Holograms are generated from rays resampled from the 2D images. This method can improve the display resolution because each object is captured in focus and the light-ray information is interpolated and resampled with high density on a ray-sampling plane located near the object. Numerical experimental results for different scenes show that the presented holographic display technique can reconstruct multiple objects at different depths with higher resolution than conventional integral-imaging-based holographic displays.
KEYWORDS: Integral imaging, 3D image processing, Imaging systems, Error analysis, Sensors, 3D modeling, Statistical analysis, Statistical modeling, 3D image reconstruction, 3D displays
In this paper, we propose a Bayesian framework to infer the depths of object surfaces in a 3D integral imaging system. In such a system, the depth of Lambertian surfaces can be estimated from the statistics of the spectral radiation pattern. However, the estimated depth may contain errors due to system uncertainties. To better infer the depth information, we use a Bayesian framework and a Markov Random Field (MRF) model, exploiting the statistical information of object intensities and the assumption that object surfaces are smooth. The proposed method combines the Bayesian framework with the characteristics of 3D integral imaging systems to infer the depths. Simulation and experimental results illustrate the performance of the proposed method.
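As a minimal illustration of the underlying idea — inferring depth from the statistics of intensities observed across elemental images — one common realization picks, per pixel, the candidate depth at which the back-projected elemental images agree best (minimum variance), since a Lambertian surface radiates the same intensity toward every sensor. The sketch below is a simplified 1D-parallax version with hypothetical parameter names, not the authors' Bayesian/MRF implementation:

```python
import numpy as np

def estimate_depth(elemental_images, pitches, focal, depths):
    """Per-pixel depth by minimum intensity variance across
    elemental images back-projected to each candidate depth.

    elemental_images : list of 2D arrays (same shape)
    pitches          : horizontal sensor offsets, one per image
    focal            : focal length (same units as pitches)
    depths           : iterable of candidate depths
    """
    h, w = elemental_images[0].shape
    best_var = np.full((h, w), np.inf)
    depth_map = np.zeros((h, w))
    for z in depths:
        shifted = []
        for img, px in zip(elemental_images, pitches):
            # disparity (in pixels) of a point at depth z seen from offset px
            shift = int(round(px * focal / z))
            shifted.append(np.roll(img, shift, axis=1))
        var = np.stack(shifted).var(axis=0)
        mask = var < best_var
        depth_map[mask] = z
        best_var[mask] = var[mask]
    return depth_map
```

At the true depth the shifted elemental images align, so the per-pixel variance collapses; wrong depths leave residual disparity and higher variance.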
In integral holography, the quality of the reconstructed 3D image is affected by positional errors of the lenses in the micro-lens array. We analyzed the spatial distortion effects in the reconstructed 3D integral Fourier holographic image that are caused by misarrangement of the elemental lenses in the micro-lens array. An intermediate projection-view generation method is then used to eliminate these spatial distortion effects in the reconstruction. This method provides a way to compensate for lens-array manufacturing errors in practical integral holographic imaging.
KEYWORDS: Integral imaging, Cameras, Imaging systems, 3D image processing, 3D acquisition, Compressed sensing, Sensors, Digital holography, Optical filters, Stereoscopy
In this keynote address paper, we present an overview of our previously published work on using compressive sensing in multi-dimensional imaging. We examine a variety of multi-dimensional imaging approaches and applications, including 3D multimodal imaging integrated with polarimetric and multispectral imaging, integral imaging, and digital holography.
KEYWORDS: Polarimetry, 3D image processing, Polarization, 3D image reconstruction, Integral imaging, Imaging systems, Image sensors, Sensors, 3D metrology, Computing systems
In this paper, we overview a 3D polarimetric imaging system based on integral imaging techniques under natural illumination conditions. To obtain polarimetric information of objects, the Stokes polarization parameters are first measured and then used to calculate the degree of polarization of the objects. Based on the degree-of-polarization information of each 2D image, a modified computational reconstruction method is presented to perform 3D polarimetric image reconstruction. The system may be used to detect or classify objects with distinct polarization signatures in 3D space. Experimental results also show that the proposed system may mitigate the effect of occlusion in 3D reconstruction.
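The Stokes-to-degree-of-polarization step can be sketched as follows, assuming a common four-angle linear polarimetry setup (intensity images behind a polarizer at 0°, 45°, 90°, and 135°; the function name and the restriction to linear polarization are assumptions, since the abstract does not specify the measurement scheme):

```python
import numpy as np

def degree_of_polarization(i0, i45, i90, i135):
    """Per-pixel degree of (linear) polarization from four
    polarizer-angle intensity images."""
    s0 = i0 + i90               # total intensity
    s1 = i0 - i90               # horizontal vs. vertical preference
    s2 = i45 - i135             # +45 deg vs. -45 deg preference
    dop = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)
    return np.clip(dop, 0.0, 1.0)
```

Fully polarized light yields a value near 1, unpolarized light near 0; thresholding this map per elemental image is one way to isolate objects with distinct polarization signatures before 3D reconstruction.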
An axially distributed sensing system is a 3D sensing and imaging architecture in which the sensors are distributed along the optical axis. In such a system, prior knowledge of the exact sensor positions has been required for 3D volume image reconstruction. In this paper, we overview a method for estimating unknown sensor positions and present an axially distributed sensing system that operates with unknown sensor positions. Experiments illustrate the feasibility of the proposed system and show that it may improve the visual quality of 3D reconstructed images.
In this paper, we overview tracking methods for occluded objects in 3D integral imaging. Two methods, based on the Sum of Absolute Differences (SAD) algorithm and on a Bayesian framework, respectively, are presented. For the SAD-based method, we compute the SAD between pixels of consecutive frames of a moving object for 3D tracking. For the Bayesian method, posterior probabilities of the reconstructed scene background and the 3D objects are calculated by modeling their pixel intensities as Gaussian and Gamma distributions, respectively, and by assuming appropriate prior distributions for the estimated parameters. Multi-object tracking is achieved by maximizing the geodesic distance between the log-likelihoods of the background and the objects. Experimental results demonstrate 3D tracking of occluded objects.
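The SAD-based step can be illustrated with a minimal window-matching sketch (the function names, the exhaustive search window, and the 2D-only search are illustrative assumptions; the paper applies SAD to reconstructed 3D frames):

```python
import numpy as np

def sad(a, b):
    """Sum of absolute differences between two equally sized windows."""
    return np.abs(a.astype(float) - b.astype(float)).sum()

def track_sad(prev_frame, next_frame, box, search=5):
    """Locate the object window box = (y, x, h, w) in the next frame
    by minimizing SAD over a +/- search pixel neighborhood."""
    y, x, h, w = box
    template = prev_frame[y:y+h, x:x+w]
    best, best_dyx = np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = next_frame[y+dy:y+dy+h, x+dx:x+dx+w]
            if cand.shape != template.shape:
                continue  # candidate window fell off the frame edge
            s = sad(template, cand)
            if s < best:
                best, best_dyx = s, (dy, dx)
    dy, dx = best_dyx
    return (y + dy, x + dx, h, w)
```

Running this per reconstructed depth plane is one way the 2D match extends to 3D tracking of partially occluded objects.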
KEYWORDS: Sensors, Integral imaging, Cameras, 3D image processing, Image sensors, 3D image reconstruction, Calibration, 3D metrology, 3D modeling, Reconstruction algorithms
Integral imaging is a 3D sensing and imaging technique. Conventional 3D integral imaging systems require that all sensor positions in the image capture stage be known. However, in certain image pickup geometries, such as sensors on moving platforms and/or randomly distributed sensors, it may be difficult to obtain accurate measurements of the sensor positions. In this paper, we present a 3D integral imaging method with unknown sensor positions. In the proposed method, all sensors are randomly distributed on a plane with parallel optical axes. Moreover, only the relative position of one pair of sensors is needed, while all other sensor positions remain unknown. We combine image correspondence extraction, the camera perspective model, two-view geometry, and computational integral imaging 3D reconstruction to estimate the unknown sensor positions and reconstruct 3D images. Experiments conducted both in the lab and outdoors show the feasibility of the proposed method. Furthermore, they indicate that the quality of images reconstructed using the proposed sensor-position estimation algorithm can be improved compared with those reconstructed using physical measurements of the sensor positions.
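Once the sensor positions are known (measured or estimated), the computational 3D reconstruction step is commonly realized as shift-and-sum back-projection: each elemental image is shifted in proportion to its sensor offset and the chosen depth, and the overlapping images are averaged. A minimal sketch, with hypothetical parameter names and integer-pixel shifts (the paper's exact reconstruction may differ):

```python
import numpy as np

def reconstruct_plane(images, positions, focal, pixel_size, z):
    """Shift-and-sum reconstruction of the depth plane at distance z.

    images     : list of 2D elemental images (same shape)
    positions  : (sx, sy) sensor offsets, one per image
    focal      : focal length (same units as positions)
    pixel_size : sensor pixel pitch (same units as positions)
    z          : reconstruction depth
    """
    acc = np.zeros_like(images[0], dtype=float)
    for img, (sx, sy) in zip(images, positions):
        # disparity (in pixels) of a point at depth z seen from (sx, sy)
        dx = int(round(sx * focal / (z * pixel_size)))
        dy = int(round(sy * focal / (z * pixel_size)))
        acc += np.roll(img, (dy, dx), axis=(0, 1))
    return acc / len(images)
```

Objects at depth z add up coherently and appear in focus, while objects at other depths (including occluders) are smeared across the average, which is also why this reconstruction mitigates partial occlusion.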