Open Access
Extended depth-of-field in integral-imaging pickup process based on amplitude-modulated sensor arrays
23 July 2015
Abstract
We implement a depth-of-field (DOF) extending pickup experiment of integral imaging based on amplitude-modulated sensor arrays (SAs). By applying the amplitude-modulating technique to the SA in the optical pickup process, we can modulate the light intensity distribution in the imaging space; the central maximum of the Airy pattern becomes narrower and the DOF is enlarged. Experimental results from the optical pickup process and the computational reconstruction process demonstrate the effectiveness of the DOF-extending method. We show that the DOF-extending pickup method is most suitable for enhancing the DOF of three-dimensional scenes with small depth ranges.

1. Introduction

Integral imaging is a three-dimensional (3-D) sensing and display technique first proposed by Lippmann in 1908 [1]. Unlike stereoscopic 3-D display or holography [2-6], integral imaging can provide full-parallax, continuous-viewing 3-D images without special glasses or coherent light [7-11]. An integral-imaging system uses a microlens array (MLA) to pick up and reconstruct lifelike 3-D images. One of the main problems in integral imaging is its limited depth-of-field (DOF), and many useful methods have been proposed to improve it [12-16]. One approach is to reduce the numerical aperture of the microlenses; however, such a reduction also reduces the lateral resolution of the elemental images [17].

It has long been known that obstructing the center of the aperture of an optical system (i.e., using an annular aperture) makes the central maximum of the Airy pattern narrower and increases the DOF [18,19]. Martínez-Corral et al. [20] proposed an amplitude-modulating method and presented a simulation experiment showing that the DOF of an integral-imaging pickup system is significantly enhanced by simply placing an opaque circular mask behind each microlens. To our knowledge, however, no optical integral-imaging pickup experiment has so far been presented to verify this method.

In this paper, we analyze the light intensity distribution in the amplitude-modulating pickup system and implement an optical pickup experiment using an amplitude-modulated sensor array (SA) to generate the DOF-enhanced elemental image array (EIA). The obtained EIA is then used for computational reconstruction to produce the 3-D images with extended DOF.

2. Intensity Distribution in Integral-Imaging Pickup Process

We assume that the integral-imaging system is linear and shift invariant and that it is illuminated by a monochromatic light source with wavelength λ.

Figure 1 shows the schematic of the DOF-extending method in the integral-imaging pickup process: an opaque mask with diameter q is placed in front of each sensor to obstruct the central part of that sensor. The 3-D object point O(x0, y0) is located off the reference plane at depth z0 and produces a blurred image on the complementary metal-oxide-semiconductor (CMOS) sensor. The central sensor is denoted the (0th, 0th) sensor. The pitch of the SA is p and the focal length of the SA is f. Distances l and g are related by the Gaussian lens law 1/l + 1/g − 1/f = 0.
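As a quick sanity check of the Gaussian lens law, the image distance g can be computed from the object distance l and focal length f quoted later in Sec. 4 (this short sketch and its function name are ours, not the authors'):

```python
def image_distance(l_mm: float, f_mm: float) -> float:
    """Solve the Gaussian lens law 1/l + 1/g = 1/f for g (distances in mm)."""
    return 1.0 / (1.0 / f_mm - 1.0 / l_mm)

# With f = 50.0 mm and l = 289.6 mm (the values used in the experiment):
g = image_distance(289.6, 50.0)
print(f"g = {g:.1f} mm")  # 60.4 mm, matching the distance quoted in Sec. 4
```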

Fig. 1

Schematic of the depth-of-field (DOF) extending method in integral-imaging pickup process.


The pupil function for the (0th, 0th) sensor in Fig. 1 is given by

Eq. (1)

P00(x,y) = Circ(x,y;p) − Circ(x,y;q),

where p > q ≥ 0 and the function

Eq. (2)

Circ(x,y;p) = { 1,  √(x² + y²) ≤ p/2
              { 0,  otherwise

represents the amplitude transmittance of a circular aperture.
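Eqs. (1) and (2) can be sketched in a few lines (the function names and the sample point are ours):

```python
import numpy as np

def circ(x, y, d):
    """Amplitude transmittance of a circular aperture of diameter d, Eq. (2)."""
    return (np.sqrt(np.asarray(x) ** 2 + np.asarray(y) ** 2) <= d / 2.0).astype(float)

def pupil_00(x, y, p=8.8, q=6.2):
    """Annular pupil of the (0th, 0th) sensor, Eq. (1): aperture minus mask."""
    return circ(x, y, p) - circ(x, y, q)

# A point at radius 3.9 mm lies in the open annulus (3.1 mm < r <= 4.4 mm),
# while the obstructed center transmits nothing.
print(pupil_00(3.9, 0.0), pupil_00(0.0, 0.0))  # 1.0 0.0
```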

Accordingly, the pupil function for the (m’th, n’th) sensor can be expressed as Pmn(x,y) = P00(x − mp, y − np). Here, m and n account for the indices of that sensor. For any monochromatic channel with a wavelength λ, the phase transformation of the (m’th, n’th) sensor is written as

Eq. (3)

Tmn(x,y) = P00(x − mp, y − np) exp{−(jk/2f)[(x − mp)² + (y − np)²]},
where k refers to the wave number and is given by k=2π/λ.

According to the paraxial approximation and the Fresnel diffraction theory, the light intensity distribution on the CMOS (x,y) can be obtained as

Eq. (4)

I(x,y;z0) = |(1/(λ²g(l + z0))) ∫∫ exp{(jk/(2(l + z0)))[(x′ − x0)² + (y′ − y0)²]} × Tmn(x′,y′) × exp{(jk/2g)[(x − x′)² + (y − y′)²]} dx′ dy′|²,

where the integrals run over the pupil plane coordinates (x′, y′) from −∞ to +∞.

Here, the external pure phase factors have been dropped.
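At the reference plane (z0 = 0) the quadratic phase terms in Eq. (4) cancel through the lens law, and the on-CMOS intensity reduces to the Fraunhofer diffraction pattern of the pupil. The following sketch (our own simplification, with grid sizes chosen arbitrarily) compares the in-focus central lobe of the annular pupil against the full aperture via a 2-D FFT, illustrating the narrower central maximum claimed above:

```python
import numpy as np

N, width = 1024, 80.0                      # grid samples and extent in mm (demo values)
x = (np.arange(N) - N // 2) * (width / N)
X, Y = np.meshgrid(x, x)
R = np.sqrt(X**2 + Y**2)

full = (R <= 4.4).astype(float)            # full aperture, p/2 = 4.4 mm
annular = full - (R <= 3.1).astype(float)  # minus the opaque mask, q/2 = 3.1 mm

def central_profile(pupil):
    """Normalized in-focus intensity along the central row (Fraunhofer PSF)."""
    field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
    row = np.abs(field[N // 2]) ** 2
    return row / row.max()

def lobe_width(profile):
    """Number of samples of the profile at or above half maximum."""
    return int(np.sum(profile >= 0.5))

w_full = lobe_width(central_profile(full))
w_ann = lobe_width(central_profile(annular))
print(w_ann, w_full)  # the annular pupil yields the narrower central maximum
```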

Note that the DOF-extending method works at the expense of light efficiency, which is given by

Eq. (5)

η = (p² − q²)/p² × 100%.
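Eq. (5) can be checked directly with the experimental aperture values (p = 8.8 mm, q = 6.2 mm quoted in Sec. 3; the function name is ours):

```python
def light_efficiency(p_mm: float, q_mm: float) -> float:
    """Fraction of light passed by the annular pupil, Eq. (5), in percent."""
    return (p_mm**2 - q_mm**2) / p_mm**2 * 100.0

print(f"{light_efficiency(8.8, 6.2):.1f}%")  # 50.4%, as quoted in Sec. 3
```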

3. Depth-of-Field of the Integral-Imaging Pickup System

For the DOF calculation, we consider only the rear DOF, behind the reference plane. For an object point at depth z0, its diffraction intensity pattern on the CMOS can be computed from Eq. (4). We define the diameter of this pattern as that of the circle at which the intensity has dropped by a factor of 1/√2 relative to the maximum of the pattern. For object points at different depths z0, we thus obtain a set of distinct diameters. With the system parameters p = 8.8 mm, q = 6.2 mm, f = 50.0 mm, g = 60.4 mm, and λ = 5.5 × 10⁻⁴ mm, we calculated the pattern diameters versus depth z0 for the DOF-extending method (red line) and the conventional method (green line), as shown in Fig. 2. The minimal pattern diameter on the CMOS is obtained when the object point lies on the reference plane (z0 = 0); as the object point moves away from the reference plane (z0 increases), the diameter gradually grows.

Fig. 2

DOF of the DOF-extending method and the conventional method.


The DOF of the integral-imaging pickup system can be defined as the distance over which the object may be shifted axially before an intolerable blur is produced [21]. The critical tolerable pattern size follows from combining the least distance of distinct vision of a normal eye (about 250.0 mm) [22] with the minimum angular resolution of the human eye (about 2.9 × 10⁻⁴ rad) [17], giving a tolerable pattern diameter of 72.5 μm (blue line in Fig. 2). The DOFs of the DOF-extending and conventional methods are therefore given by the abscissas of the intersections of the blue line with the red and green lines, respectively, as shown in Fig. 2. The results show that the DOF of the DOF-extending method (about 60.0 mm) is almost twice that of the conventional method (about 30.0 mm) with an obscuration ratio q/p ≈ 1/√2. The light efficiency is 50.4% according to Eq. (5).
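The tolerable-blur criterion and the obscuration ratio above are simple arithmetic and can be verified directly (this check is ours):

```python
import math

viewing_distance_mm = 250.0       # least distance of distinct vision of a normal eye
angular_resolution_rad = 2.9e-4   # minimum angular resolution of the human eye
tolerable_diameter_um = viewing_distance_mm * angular_resolution_rad * 1e3
print(f"{tolerable_diameter_um:.1f} um")  # 72.5 um, the blue line in Fig. 2

# Obscuration ratio q/p with q = 6.2 mm, p = 8.8 mm is close to 1/sqrt(2):
print(f"{6.2 / 8.8:.3f} vs {1 / math.sqrt(2):.3f}")
```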

4. Experiments and Discussions

To further verify the effectiveness of the DOF-extending pickup method based on an amplitude-modulated SA, we implemented an optical pickup experiment under white-light illumination and a computational reconstruction experiment. As shown in Fig. 3, a Canon EOS 60D sensor with a Canon EF-S 18 to 55 mm f/3.5 to 5.6 IS II lens was fixed onto a Lyseiki motorized translation stage to perform the optical pickup process. The stage was driven by its controller to move step by step in both horizontal and vertical directions with a step length of 5.0 mm. The focal length, exposure time, ISO, F-number, and CMOS size were 50.0 mm, 1/25 s, 1000, F/5.6, and 22.3 mm × 14.9 mm, respectively. The sensor was focused on the first object, and the distance between the sensor CMOS and the first object was 350.0 mm. According to the Gaussian lens law, the distance between the sensor CMOS and the equivalent principal plane of the sensor objective was 60.4 mm, and the distance between that principal plane and object 1 was 289.6 mm.

Fig. 3

Experimental setup of the optical integral-imaging pickup process.


Note that the opaque mask for amplitude modulation should be placed exactly on the aperture plane of the sensor objective, which was not accessible in our setup. We therefore introduced an additional diaphragm, with a diameter of 8.8 mm, onto the first surface of the objective to shift the aperture plane to that surface. As shown in Figs. 4(a) and 4(b), an 8.8-mm aperture stop and a 6.2-mm opaque mask were printed on photographic film for the optical integral-imaging pickup process. According to the DOF results in Fig. 2 and Sec. 3, the rear DOFs of the DOF-extending and conventional experimental setups are 60.0 mm and 30.0 mm, respectively.

Fig. 4

(a) The 8.8-mm aperture stop, (b) the aperture stop and the 6.2-mm opaque mask, and (c) the 3-D object used in the optical integral-imaging pickup process.


As shown in Fig. 4(c), we built a 3-D object consisting of three planar objects located at different depths. The distance between every two adjacent objects was 30.0 mm. To ensure that the pickup system has the same angular resolution at the three planar objects, the lateral sizes of object 1, object 2, and object 3 were designed as 2.5, 2.8, and 3.0 mm, respectively. For each method, 7 × 7 images were captured by the SA as the original elemental images, each with a resolution of 5184 × 3456 pixels. Table 1 shows the parameters used in the experiment.

Table 1

Parameters used in the experiment.

Parameters                                          Values
Focal length of the sensor lens                     50.0 mm
F-number                                            F/5.6
Diameter of the opaque mask                         6.2 mm
Diameter of the aperture stop                       8.8 mm
Sensor step size                                    5.0 mm
Focal length of the virtual MLA                     22.0 mm
Pitch of the virtual MLA                            5.0 mm
Distance between every two adjacent 3-D objects     30.0 mm
Distance between the CMOS and object 1              350.0 mm
Number of the captured elemental images             7 × 7
Resolution of the original elemental images         5184 × 3456 pixels
Resolution of the shrunk elemental images           1000 × 1000 pixels
Resolution of the EIA                               7000 × 7000 pixels
Note: MLA, microlens array; EIA, elemental image array.
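The equal-angular-resolution design of the three planar objects can be checked with a few lines of arithmetic. The distances assume the 289.6 mm quoted for object 1 plus the 30.0 mm spacing between adjacent objects; this check is ours:

```python
# Lateral sizes of objects 1-3 and their distances from the objective's
# equivalent principal plane (289.6 mm for object 1, plus 30.0 mm steps).
sizes_mm = [2.5, 2.8, 3.0]
dists_mm = [289.6, 319.6, 349.6]
angles = [s / d for s, d in zip(sizes_mm, dists_mm)]
print([f"{a * 1e3:.2f} mrad" for a in angles])  # nearly equal, within ~2%
```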

Since the virtual MLA used in the computational reconstruction process had a focal length f = 22.0 mm and a pitch p = 5.0 mm, the captured original elemental images were resized and shrunk to a resolution of 1000 × 1000 pixels. Figure 5 shows the obtained EIAs and the enlarged elemental images. Each EIA contains 7 × 7 elemental images and has a resolution of 7000 × 7000 pixels. The EIA and elemental images obtained with the DOF-extending method are much clearer than those of the conventional method, especially the three high-contrast black fringes. However, the light efficiency is decreased because the central part of each sensor is obscured.
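The resize-and-tile step can be sketched as follows. Nearest-neighbour resampling stands in for whatever resizing the authors used, and the demo uses small stand-in arrays instead of the full 5184 × 3456 captures to keep memory low; all names are ours:

```python
import numpy as np

def shrink(img, size=1000):
    """Nearest-neighbour resize of a 2-D image to size x size pixels."""
    rows = np.arange(size) * img.shape[0] // size
    cols = np.arange(size) * img.shape[1] // size
    return img[np.ix_(rows, cols)]

def assemble_eia(elementals):
    """Tile a 7x7 nested list of elemental images into one EIA."""
    return np.block([[shrink(e) for e in row] for row in elementals])

# Stand-ins for the 49 captured images (same aspect ratio, smaller size).
elementals = [[np.zeros((346, 518), dtype=np.uint8) for _ in range(7)]
              for _ in range(7)]
eia = assemble_eia(elementals)
print(eia.shape)  # (7000, 7000), as listed in Table 1
```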

Fig. 5

Elemental image arrays and enlarged elemental images obtained from (a) the DOF extending and (b) the conventional optical integral-imaging pickup methods.


To demonstrate the DOF-extending effect more intuitively, we took the two enlarged elemental images in Fig. 5 as an example and plotted the normalized intensity profiles of the three objects along the sampling path shown in Fig. 6(a). As shown in Fig. 6(b), the normalized intensity profiles of object 1 obtained by the DOF-extending method [red lines in Fig. 6(b)] are quite similar to those obtained by the conventional method [blue lines in Fig. 6(b)], which means that object 1 is recorded clearly, with sharp edges and high contrast, by both methods. In Fig. 6(c), however, the intensity troughs, which correspond to the color fringes on object 2, are well separated, while the intensity peaks, which correspond to the blank areas on object 2, lie quite close to each other. This is particularly apparent for the three intensity troughs in the right part of Fig. 6(c), which represent the three black fringes. These observations indicate that the image of object 2 obtained with the DOF-extending method has higher contrast and sharper edges than that obtained with the conventional method, and is thus more faithful to the shape of the original object. This is even more evident in Fig. 6(d). We therefore conclude that the DOF is clearly enhanced by amplitude modulation.

Fig. 6

(a) Intensity sampling path shown with dashed arrow on the object and normalized intensity profiles of (b) object 1, (c) object 2, and (d) object 3 for the two enlarged elemental images shown in Fig. 5.


After obtaining the EIAs, we conducted a computational reconstruction experiment; the reconstructed images obtained at different virtual imaging planes, 12.4, 49.4, and 85.4 mm away from the original reference plane of the sensor, are shown in Fig. 7. These reconstructed images are slightly shifted with respect to their theoretical positions because of experimental errors introduced in the optical pickup process. From the results of the conventional method in Fig. 7(b), the reconstructed image of object 1 looks quite clear since it lies on the focusing plane of the sensor, object 2 starts getting blurred since it lies on the marginal depth plane, and object 3 is too blurry to be observed since it lies outside the depth range. By contrast, the image of object 3 in Fig. 7(a) is almost as clear as the image of object 2 in Fig. 7(b); the DOF-extending method has thus successfully moved the marginal depth plane from object 2 to object 3, which means that the DOF is increased from about 30.0 to 60.0 mm, as estimated in Fig. 2 and Sec. 3. Moreover, as shown in Fig. 8, normalized intensity profiles of the reconstructed images were obtained using the same method as in Fig. 6. The intensity distributions of these reconstructed images are quite similar to those in Fig. 6, indicating that the reconstructed images obtained with the DOF-extending method are clearer than those obtained with the conventional method, with sharper edges and higher contrast. Thus, the effectiveness of the DOF-extending method was confirmed. Note that in the experiment, the DOF is enhanced at the expense of a light-efficiency loss of about 49.6%. Care should therefore be taken when applying this DOF-extending method to situations where light efficiency is critical.
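The computational reconstruction step can be illustrated with a standard shift-and-average (pixel-mapping) scheme; this is a generic sketch with illustrative values, not necessarily the authors' exact implementation:

```python
import numpy as np

def reconstruct(elementals, shift_px):
    """Shift-and-average reconstruction on one depth plane.

    elementals: dict mapping (m, n) -> 2-D array, all the same square shape
    shift_px:   per-index pixel shift; choosing it selects the depth plane
    """
    s = next(iter(elementals.values())).shape[0]
    max_idx = max(max(m, n) for m, n in elementals)
    span = s + shift_px * max_idx
    acc = np.zeros((span, span))
    cnt = np.zeros((span, span))
    for (m, n), img in elementals.items():
        r0, c0 = m * shift_px, n * shift_px
        acc[r0:r0 + s, c0:c0 + s] += img   # back-project each elemental image
        cnt[r0:r0 + s, c0:c0 + s] += 1
    return acc / np.maximum(cnt, 1)        # average the overlapping projections

# Demo: a single point imaged with a disparity of 2 px between adjacent views.
s, d = 20, 2
elementals = {}
for m in range(3):
    for n in range(3):
        img = np.zeros((s, s))
        img[10 - m * d, 10 - n * d] = 1.0  # the point shifts with parallax
        elementals[(m, n)] = img

focused = reconstruct(elementals, d)     # shift matches disparity: sharp point
defocused = reconstruct(elementals, 0)   # wrong plane: energy spread over 9 spots
print(focused.max(), defocused.max())
```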

Fig. 7

Computational reconstruction results obtained at different virtual imaging planes: (a) the DOF-extending method and (b) the conventional method.


Fig. 8

Normalized intensity profiles for the reconstructed images of (a) object 1, (b) object 2, and (c) object 3.


It is noteworthy that the DOF-extending method performs better in recording the high-frequency components of the object information because of its bandpass characteristics. Generally, for a 3-D scene with a small depth range, viewers pay more attention to the details of the 3-D object, which are mostly carried by the high-frequency information [23]. For a 3-D scene with a large depth range, on the other hand, the optical transfer function of the DOF-extending method suffers severe attenuation and oscillation, which seriously degrades the image quality [24]. Therefore, the DOF-extending pickup method is more suitable for enhancing the DOF of a 3-D scene with a small depth range. After repeating the experiment with 3-D objects of different depth ranges, we find that 60.0 mm [shown in Fig. 4(c)] is almost the largest depth range for which the effectiveness of the DOF-extending method can be demonstrated with the given pickup system of Fig. 3. The deeper the 3-D object, the less effective the DOF-extending method becomes.

5. Conclusion

We have analyzed the light intensity distributions and propagation characteristics of the DOF-extending and conventional integral-imaging pickup processes. Experimental results of the optical pickup process and the computational reconstruction process have shown that the DOF-extending method works effectively when recording a 3-D scene with a small depth range. Note that in the optical pickup experiment, the DOF is enhanced at the expense of light efficiency; care should therefore be taken when applying this DOF-extending method to situations where light efficiency is critical. Also, this method is currently difficult to apply to MLAs for optical pickup or reconstruction because of the limited aperture of each microlens, but in time it should become possible for manufacturers to fabricate amplitude-modulating masks onto MLAs. In future work, DOF-extending methods for recording and displaying 3-D scenes with large depth ranges will be presented.

Acknowledgments

This work is supported by the “973” Program under Grant No. 2013CB328802, the NSFC under Grant Nos. 61225022 and 61320106015, and the “863” Program under Grant Nos. 2012AA011901 and 2012AA03A301. The authors would like to thank Prof. Manuel Martínez-Corral, Prof. Bahram Javidi, and Dr. Xiao Xiao for valuable suggestions on designing the optical pickup experiment.

References

1. G. Lippmann, “La photographie integrale,” Comptes-Rendus Acad. Sci. 146, 446–451 (1908).

2. J. Y. Son et al., “Recent developments in 3-D imaging technologies,” J. Disp. Technol. 6(10), 394–403 (2010). http://dx.doi.org/10.1109/JDT.2010.2045636

3. Y. Takaki, K. Tanaka, and J. Nakamura, “Super multi-view display with a lower resolution flat-panel display,” Opt. Express 19(5), 4129–4139 (2011). http://dx.doi.org/10.1364/OE.19.004129

4. G. L. Xue et al., “Multiplexing encoding method for full-color dynamic 3D holographic display,” Opt. Express 22(15), 18473–18482 (2014). http://dx.doi.org/10.1364/OE.22.018473

5. X. Li et al., “Video-rate holographic display using azo-dye-doped liquid crystal,” J. Disp. Technol. 10(6), 438–443 (2014). http://dx.doi.org/10.1109/JDT.2013.2281918

6. Y. P. Huang, P. Y. Hsieh, and S. T. Wu, “Applications of multidirectional asymmetrical microlens-array light-control films on reflective liquid-crystal displays for image quality enhancement,” Appl. Opt. 43(18), 3656–3663 (2004). http://dx.doi.org/10.1364/AO.43.003656

7. X. Xiao et al., “Advances in three-dimensional integral imaging: sensing, display, and applications,” Appl. Opt. 52(4), 546–560 (2013). http://dx.doi.org/10.1364/AO.52.000546

8. L. Q. Zhou et al., “Voxel model for evaluation of a three-dimensional display and reconstruction in integral imaging,” Opt. Lett. 39(7), 2032–2035 (2014). http://dx.doi.org/10.1364/OL.39.002032

9. F. Wu et al., “High-optical-efficiency integral imaging display based on gradient-aperture pinhole array,” Opt. Eng. 52(5), 054002 (2013). http://dx.doi.org/10.1117/1.OE.52.5.054002

10. J. L. Zhang et al., “Feasibility study for pseudoscopic problem in integral imaging using negative refractive index materials,” Opt. Express 22(17), 20757–20769 (2014). http://dx.doi.org/10.1364/OE.22.020757

11. J. Yim, Y. M. Kim, and S. W. Min, “Real object pickup method for real and virtual modes of integral imaging,” Opt. Eng. 53(7), 073109 (2014). http://dx.doi.org/10.1117/1.OE.53.7.073109

12. Y. T. Lim et al., “Analysis on enhanced depth of field for integral imaging microscope,” Opt. Express 20(21), 23480–23488 (2012). http://dx.doi.org/10.1364/OE.20.023480

13. S. K. Kim et al., “Evaluation of the monocular depth cue in 3D displays,” Opt. Express 16(26), 21415–21422 (2008). http://dx.doi.org/10.1364/OE.16.021415

14. J. H. Park et al., “Depth-enhanced three-dimensional-two-dimensional convertible display based on modified integral imaging,” Opt. Lett. 29(23), 2734–2736 (2004). http://dx.doi.org/10.1364/OL.29.002734

15. J. S. Jang and B. Javidi, “Large depth-of-focus time-multiplexed three-dimensional integral imaging by use of lenslets with nonuniform focal lengths and aperture sizes,” Opt. Lett. 28(20), 1924–1926 (2003). http://dx.doi.org/10.1364/OL.28.001924

16. C. K. Park, S. S. Lee, and Y. S. Hwang, “Depth-extended integral imaging system based on a birefringence lens array providing polarization switchable focal lengths,” Opt. Express 17(21), 19047–19054 (2009). http://dx.doi.org/10.1364/OE.17.019047

17. M. Born and E. Wolf, Principles of Optics, 7th ed., pp. 461–465, Cambridge University Press, Cambridge, United Kingdom (1999).

18. G. C. Steward, The Symmetrical Optical System, pp. 88–102, Cambridge University Press, London, United Kingdom (1958).

19. W. T. Welford, “Use of annular apertures to increase focal depth,” J. Opt. Soc. Am. 50(8), 749–752 (1960). http://dx.doi.org/10.1364/JOSA.50.000749

20. M. Martínez-Corral et al., “Integral imaging with improved depth of field by use of amplitude-modulated microlens arrays,” Appl. Opt. 43(31), 5806–5813 (2004). http://dx.doi.org/10.1364/AO.43.005806

21. C. G. Luo et al., “Analysis of the depth of field of integral imaging displays based on wave optics,” Opt. Express 21(25), 31263–31273 (2013). http://dx.doi.org/10.1364/OE.21.031263

22. M. Born and E. Wolf, Principles of Optics, 7th ed., pp. 261–263, Cambridge University Press, Cambridge, United Kingdom (1999).

23. M. Mino and Y. Okano, “Improvement in the OTF of a defocused optical system through the use of shaded apertures,” Appl. Opt. 10(10), 2219–2225 (1971). http://dx.doi.org/10.1364/AO.10.002219

24. E. L. O’Neill, “Transfer function for an annular aperture,” J. Opt. Soc. Am. 46(4), 285–288 (1956). http://dx.doi.org/10.1364/JOSA.46.000285

Biography

Cheng-Gao Luo is currently pursuing his PhD in optical engineering at Sichuan University, Chengdu, China. He worked as a visiting research scholar at the University of Connecticut from 2012 to 2013. His recent research interest is information display technologies including 3-D displays.

Qiong-Hua Wang is a professor of optics at the School of Electronics and Information Engineering, Sichuan University, China. She was a postdoctoral research fellow at the School of Optics/CREOL, University of Central Florida, from 2001 to 2004. She has published more than 200 papers on information displays. She is the associate editor of Optics Express and Journal of the Society for Information Display. Her recent research interests include optics and optoelectronics, especially display technologies.

Huan Deng is a lecturer of optics at the School of Electronic and Information Engineering, Sichuan University. She received her PhD from Sichuan University in 2012. She has published more than 10 papers. She is a member of Society for Information Display. Her recent research interest is information display technologies including 3-D displays.

Yao Liu is currently working toward her MS degree in optical engineering at the School of Electronics and Information Engineering, Sichuan University, Chengdu, China. Her current research interest is information display technologies including 3-D displays.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Cheng-Gao Luo, Qiong-Hua Wang, Huan Deng, and Yao Liu "Extended depth-of-field in integral-imaging pickup process based on amplitude-modulated sensor arrays," Optical Engineering 54(7), 073108 (23 July 2015). https://doi.org/10.1117/1.OE.54.7.073108
Published: 23 July 2015
KEYWORDS: Sensors; 3D image processing; 3D displays; Image resolution; Integral imaging; Opacity; Optical engineering
