Open Access
5 January 2021 Extended DoF and achromatic inverse imaging for lens and lensless MPM camera based on Wiener filtering of defocused OTFs
Abstract

An optimal optical transfer function (OTF) is proposed for RGB inverse imaging with extended depth of field (DoF) and diminished color aberrations. This optimal OTF is derived as the Wiener filter of broadband defocused OTFs. The performance of this new inverse imaging is demonstrated for two optical setups: one with a lens and one lensless, with a multilevel phase mask (MPM) in place of the lens. The latter lensless system, designed for the wavelength range (400 to 700) nm and the DoF range (0.5 to 1000) m, demonstrates the best performance.

1.

Introduction

All-in-focus imaging is one of the key problems for imaging systems. Roughly speaking, there are two alternative approaches to this problem. The first one is based on optical systems insensitive to the distance between object and camera, which automatically enables an extended depth of field (DoF). In classical optics and engineering, this is achieved by designing complex, bulky, and expensive multilens objectives. In a recent development, wavefront coding (WC) supported by an inverse imaging algorithm opens the door for solutions that are more compact, lighter in weight, and lower in cost compared with conventional refractive optics systems, e.g., Ref. 1. The alternative approach is based on optical systems highly sensitive to variations of the object-to-camera distance. This distance is estimated, and the estimates are used for refocusing of the registered data to achieve the all-in-focus result, e.g., Ref. 2.

In this paper, we follow the ideas of the first approach and focus on the design of a lensless optical system where a diffractive optical element (DOE), in particular, a planar multilevel phase mask (MPM), is used instead of a refractive lens. A stream of recent publications demonstrates that DOEs can be quite efficient for various imaging applications. However, especially in broadband (white-light) optical systems, using a DOE is a challenging task that causes serious color aberration problems because DOEs usually have strong material dispersion.

This paper is aimed at the end-to-end joint design of a DOE and the imaging software for lensless systems, addressing two tightly connected problems: extended DoF and achromatic imaging. The paper is organized as follows. In Sec. 2, the mathematical models for image formation and inverse imaging are presented. The main result is formulated as an optimal OTF for inverse imaging with extended DoF. The proof of this result is given in the Appendix. Simulation tests demonstrating the achieved DoF and achromatic imaging, with a comparison against a few counterparts, are discussed in Sec. 3.

1.1.

Related Work

Many recent publications are focused on enhancing the capability of cameras concerning DoF and chromatic aberrations by the use of DOEs. For these purposes, DOEs parameterized by Zernike polynomials are demonstrated with end-to-end optimization in super-resolution imaging.1 As the end-to-end optimization of optical systems is usually time-consuming, it is feasible to decrease the computational time significantly with graphics processing units instead of CPUs.3–6

A thin scattering diffuser introduced for single-shot imaging in Ref. 7 provides extended DoF in both microscopic and single-lens imaging systems. In addition, the use of diffusion-coded8 and random phase plate diffusers9,10 demonstrates good imaging performance.

A multilevel diffractive lens as a replacement for a spherical refractive lens is studied for imaging in Refs. 11 to 16. Moreover, in Ref. 17, a coded mask is presented along with a convolutional neural network-based optimization for extending DoF. Also, the Jacobi–Fourier phase mask for imaging is presented in Ref. 18. An optimized binary phase mask for a hybrid optical-digital imaging system is utilized to improve DoF in Refs. 19 to 21.

Going further, combining two Fresnel lenses with cubic components is proposed to increase DoF and the robustness of imaging to additive noise due to fabrication errors in Fresnel lenses.22 The joint design of a lens and an MPM as a hybrid optics system for high-quality broadband imaging is studied in Ref. 23; the optical power sharing between the lens and the MPM is optimized in that work. A broadband, multiwavelength lensless diffractive optical system is optimized for imaging in Ref. 24.

A set of mask patterns implemented on a programmable device, as well as a fast computational image reconstruction algorithm, are proposed for a lensless setup that results in imaging efficient for large depth variations.25 A hyperspectral DOE imaging system with an end-to-end optimized hyperspectral-depth estimation is demonstrated in Ref. 26.

1.2.

Contribution

In this paper, we present a development of the approach proposed in Ref. 24. The main contribution concerns the inverse imaging algorithm with the transfer function derived as the Wiener filter of defocused OTFs. Jointly with broadband PSFs, it results in a substantial extension of DoF and substantially improved achromatic imaging. The paper targets lensless imaging with end-to-end optimization of both the diffractive optical MPM and the inverse imaging algorithm. Good accuracy and good visual imaging performance are confirmed by extensive simulation experiments.

2.

Image Formation and Inverse Imaging

2.1.

Image Formation Models

In Fourier wave optics, image formation is modeled as a convolution of the system's PSF with the true object-image. The optical setup is shown in Fig. 1: the object, aperture, and sensor are two-dimensional (2D) planes with coordinates (ξ,η), (x,y), and (u,v), respectively; d1 is the distance between the object and the aperture, d2 is the distance from the aperture to the sensor ($d_2 \ll d_1$); f0 is the focal distance of the optics. Assuming that both a lens and an MPM are placed in the aperture, the generalized pupil function Pλ of the system shown in Fig. 1 is of the form:27

Eq. (1)

$$P_\lambda(x,y)=P_A(x,y)\exp\!\left[j\frac{\pi}{\lambda}\left(\frac{1}{d_1}+\frac{1}{d_2}-\frac{1}{f_\lambda}\right)(x^2+y^2)+j\,\mathrm{MPM}_{\lambda_0,\lambda}(x,y)\right].$$
Here, $f_\lambda$ is the lens focal distance for the wavelength λ, $P_A(x,y)$ is the aperture of the optics, and $\mathrm{MPM}_{\lambda_0,\lambda}(x,y)$ is the phase delay enabled by the MPM for the wavelength λ, provided that λ0 is the wavelength design-parameter for the MPM. In this equation, the phase $j\frac{\pi}{\lambda}\left(\frac{1}{d_1}+\frac{1}{d_2}\right)(x^2+y^2)$ appears due to propagation of the coherent wavefront from the object to the aperture (distance d1) and from the aperture to the sensor plane (distance d2), and $-j\frac{\pi}{\lambda f_\lambda}(x^2+y^2)$ is the quadratic phase delay due to the lens. For the lensless system,

Eq. (2)

$$P_\lambda(x,y)=P_A(x,y)\exp\!\left[j\frac{\pi}{\lambda}\left(\frac{1}{d_1}+\frac{1}{d_2}\right)(x^2+y^2)+j\,\mathrm{MPM}_{\lambda_0,\lambda}(x,y)\right],$$
and for the lens system without an MPM, $\mathrm{MPM}_{\lambda_0,\lambda}(x,y)\equiv 0$ in Eq. (1).

Fig. 1

Wave optics based PSF modeling. A light wave with a given wavelength λ propagates from the object plane to the aperture (Fresnel propagation, distance d1) and further to the sensor plane (distance d2). The optical element in the aperture is a simple lens or an MPM.


The PSF of the coherent monochromatic optical system is defined according to the equation:27

Eq. (3)

$$\mathrm{PSF}^{\mathrm{coh}}_\lambda(u,v)=\mathcal{F}\{P_\lambda\}\!\left(\frac{u}{\lambda d_2},\frac{v}{\lambda d_2}\right),$$
where $\mathcal{F}\{P_\lambda\}$ is the Fourier transform of $P_\lambda$. The normalized PSF for the corresponding incoherent system is calculated as

Eq. (4)

$$\mathrm{PSF}_\lambda(u,v)=\frac{\left|\mathrm{PSF}^{\mathrm{coh}}_\lambda(u,v)\right|^2}{\iint\left|\mathrm{PSF}^{\mathrm{coh}}_\lambda(u,v)\right|^2\,du\,dv}.$$

The optical transfer function (OTF) is the Fourier transform of $\mathrm{PSF}_\lambda(u,v)$:

Eq. (5)

$$\mathrm{OTF}_\lambda(f_X,f_Y)=\iint \mathrm{PSF}_\lambda(u,v)\exp[-j2\pi(f_X u+f_Y v)]\,du\,dv.$$
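The chain of Eqs. (3)–(5) can be sketched numerically. The sketch below is illustrative, not the paper's implementation: the grid size, the unit circular aperture, and the zero pupil phase are assumptions; in the actual system the pupil phase carries the defocus and MPM terms of Eqs. (1) and (2).

```python
import numpy as np

# Eq. (3): coherent PSF = Fourier transform of the generalized pupil.
# Eq. (4): incoherent PSF = normalized squared modulus of the coherent PSF.
# Eq. (5): OTF = Fourier transform of the incoherent PSF.

N = 256                                    # samples across the aperture plane
x = np.linspace(-1.0, 1.0, N)
X, Y = np.meshgrid(x, x)
P_A = (X**2 + Y**2 <= 1.0).astype(float)   # unit circular aperture P_A(x, y)
phase = np.zeros((N, N))                   # defocus/MPM phase would go here
P = P_A * np.exp(1j * phase)               # generalized pupil P_lambda(x, y)

# coherent PSF (Eq. (3)), centered by fftshift for display convenience
psf_coh = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(P)))

# incoherent PSF (Eq. (4)), normalized to unit total energy
psf = np.abs(psf_coh)**2
psf /= psf.sum()

# OTF (Eq. (5)); its zero-frequency value equals the PSF's total energy
otf = np.fft.fft2(np.fft.ifftshift(psf))
```

With this normalization, the OTF at zero frequency equals 1, the standard property used implicitly in the deconvolution formulas below.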

In spectral broadband multicolor imaging, the intensity registered by the color channel c is an integration of the monochromatic intensity over the wavelength range Λ with the weights Tc(λ) defined by the transmissive sensitivity functions of the color filter array (CFA) and the spectral response of the sensor. Normalizing these sensitivities over λ, i.e., $\int_\Lambda T_c(\lambda)\,d\lambda=1$, we obtain the RGB channel PSFs:

Eq. (6)

$$\mathrm{PSF}_c(x,y)=\int_\Lambda \mathrm{PSF}_\lambda(x,y)\,T_c(\lambda)\,d\lambda,\qquad c\in(r,g,b),$$
where the monochromatic PSFλ is averaged over λ with the weights Tc(λ).

We wish to emphasize the difference between the monochromatic PSFλ(x,y) with fixed λ and the broadband RGB PSFc(x,y), $c\in(r,g,b)$. Contrary to conventional approaches to PSF-based RGB imaging, which use Eq. (4) with three wavelengths λ (often, 450, 550, and 650 nm), we perform dense spectral sampling with 31 spectral channels from 400 to 700 nm in 10-nm intervals.24 This is essential, in particular, for accurate image formation modeling.
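As a discrete counterpart of Eq. (6), the broadband channel PSF is a weighted sum of monochromatic PSFs over the 31-band grid. The Gaussian channel sensitivity below is a placeholder assumption; in the paper, Tc(λ) comes from the Sony IMX172 CFA and the sensor spectral response.

```python
import numpy as np

wavelengths = np.arange(400, 701, 10)           # 31 spectral bands, nm

def channel_sensitivity(lam, center, width=40.0):
    # placeholder Gaussian T_c(lambda), normalized so its discrete sum is 1
    t = np.exp(-0.5 * ((lam - center) / width)**2)
    return t / t.sum()

def broadband_psf(psf_mono, center_nm):
    """Eq. (6) as a discrete sum; psf_mono: (31, H, W) monochromatic PSFs."""
    T = channel_sensitivity(wavelengths, center_nm)
    return np.tensordot(T, psf_mono, axes=1)    # weighted average over lambda

# toy monochromatic PSFs, each normalized to unit energy
psf_mono = np.random.rand(31, 8, 8)
psf_mono /= psf_mono.sum(axis=(1, 2), keepdims=True)
psf_g = broadband_psf(psf_mono, center_nm=525)  # green channel, Eq. (6)
```

Because each monochromatic PSF and the sensitivity both sum to one, the broadband PSF keeps unit energy, which is what makes Eq. (6) a proper averaging over λ.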

2.2.

Inverse Imaging for Defocused Observations

Let us introduce PSFs for defocus scenarios with the notation PSFλ,δ(x,y) and PSFc,δ(x,y), where δ is a defocus distance in d1, such that $d_1=d_1^0+\delta$ with $d_1^0$ equal to the focal distance between the aperture and the object. Introduce a set D of defocus values $\delta\in D$ defining the area of the desired extended DoF. The corresponding OTFs are denoted OTFλ,δ(fX,fY) and OTFc,δ(fX,fY). The definition of OTFc,δ(fX,fY) corresponds to Eq. (5).

Let $I^s_{c,\delta}(u,v)$ and $I^o_c(u,v)$ be the intensities of the wavefronts at the sensor (the registered focused/misfocused images) and the intensity of the object (the true image), respectively. Then, $I^s_{c,\delta}(u,v)$ is obtained by convolving the true object-image $I^o_c(u,v)$ with the PSF [$\mathrm{PSF}_{c,\delta}(u,v)$] of the system:

Eq. (7)

$$I^s_{c,\delta}(u,v)=\iint \mathrm{PSF}_{c,\delta}(u-x,v-y)\,I^o_c(x,y)\,dx\,dy,$$
and in the Fourier domain this equation takes the form:

Eq. (8)

$$I^s_{c,\delta}(f_X,f_Y)=\mathrm{OTF}_{c,\delta}(f_X,f_Y)\cdot I^o_c(f_X,f_Y),$$
where $I^s_{c,\delta}(f_X,f_Y)$ and $I^o_c(f_X,f_Y)$ are the Fourier transforms of $I^s_{c,\delta}(u,v)$ and $I^o_c(u,v)$, and "·" stands for elementwise multiplication of the corresponding Fourier transforms.

For image reconstruction from the blurred data $I^s_{c,\delta}$, $\delta\in D$, we wish to use a filter with a single transfer function Hc, the same for all δ.

Let us formulate the design of Hc as an optimization problem in the Fourier domain

Eq. (9)

$$\hat{H}_c=\arg\min_{H_c}\sum_{k,\,\delta\in D}\omega_\delta\cdot\left\|I^{o,k}_c-H_c\cdot I^{s,k}_{c,\delta}\right\|_2^2,$$
where k stands for different images from a given library and ωδ are weights depending on the misfocus parameter δ. For simplicity of notation, the arguments (fX,fY) of the Fourier transforms are omitted in this and the following equations.

Note that $I^{o,k}_c$ and $I^{s,k}_{c,\delta}$ correspond to the sets of true and observed blurred images with items marked by the index k for different images and by c for color channels. The norm $\|\cdot\|_2^2$ is the Euclidean norm defined for the complex-valued variables in the Fourier domain. Thus, we wish to find Hc such that the corresponding estimates ($H_c\cdot I^{s,k}_{c,\delta}$) are close to the Fourier transforms of the corresponding true images $I^{o,k}_c$. The weights ωδ define the relative importance of the different δ values. Due to the definition in Eq. (9), $\hat{H}_c$ is a Fourier domain Wiener filter for the defocused (blurred) observations $I^{s,k}_{c,\delta}$.

STATEMENT 1: Fourier domain Wiener filter

Provided the observation Eq. (8), the solution of the problem Eq. (9) has a form

Eq. (10)

$$\hat{H}_c=\frac{\sum_{\delta\in D}\omega_\delta\cdot \mathrm{OTF}^*_{c,\delta}}{\sum_{\delta\in D}\omega_\delta\cdot\left|\mathrm{OTF}_{c,\delta}\right|^2},$$
assuming that $\sum_k\left|I^{o,k}_c(f_X,f_Y)\right|^2>0$ for all fX, fY. Here, $(\cdot)^*$ stands for the complex conjugate.

The proof of Eq. (10) is given in the Appendix. It is emphasized that this solution does not depend on test images Ico,k.

The regularized form of this Wiener filter, as used in our calculations, is

Eq. (11)

$$\hat{H}_c=\frac{\sum_{\delta\in D}\omega_\delta\cdot \mathrm{OTF}^*_{c,\delta}}{\sum_{\delta\in D}\omega_\delta\cdot\left|\mathrm{OTF}_{c,\delta}\right|^2+\mathrm{reg}},$$
where the regularization parameter is reg>0.

The reconstructed images are calculated as

Eq. (12)

$$\hat{I}^{o,k}_{c,\delta}(x,y)=\mathcal{F}^{-1}\{\hat{H}_c\cdot I^{s,k}_{c,\delta}\}.$$
Let the weights ωδ be exponential, $\omega_\delta=\exp(-\mu\cdot|\delta|)$, where $\mu\geq 0$ is a parameter.

For μ=0

Eq. (13)

$$\hat{H}_c=\frac{\sum_{\delta\in D}\mathrm{OTF}^*_{c,\delta}}{\sum_{\delta\in D}\left|\mathrm{OTF}_{c,\delta}\right|^2+\mathrm{reg}},$$
and for $\mu\to\infty$,

Eq. (14)

$$\hat{H}_c=\frac{\mathrm{OTF}^*_{c,0}}{\left|\mathrm{OTF}_{c,0}\right|^2+\mathrm{reg}}.$$

In the case of Eq. (13), Hc is the mixed OTF composed of the OTFs corresponding to the multiple defocuses δ from the area of the desired DoF D, taken with equal weights. The latter case, Eq. (14), means that for inverse (deblurring) imaging we use only the broadband OTF corresponding to the focal point δ=0. It coincides with the inverse imaging used in Ref. 24.
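The regularized Wiener filter of Eq. (11), with the exponential weights and the reconstruction of Eq. (12), can be sketched as follows. The toy identity OTFs are purely illustrative (they make the filter trivially verifiable); in the paper, the OTFs are computed from the broadband PSFs of the optimized setups.

```python
import numpy as np

def wiener_transfer(otfs, deltas, mu=0.0, reg=1e-3):
    """Eq. (11): H_hat from a set of defocused OTFs.
    otfs: dict {delta: 2-D complex OTF}; weights w = exp(-mu*|delta|)."""
    num, den = 0.0, 0.0
    for d in deltas:
        w = np.exp(-mu * abs(d))
        num = num + w * np.conj(otfs[d])
        den = den + w * np.abs(otfs[d])**2
    return num / (den + reg)

def reconstruct(blurred, H):
    """Eq. (12): deblurred image = F^{-1}{ H_hat * F{blurred} }."""
    return np.real(np.fft.ifft2(H * np.fft.fft2(blurred)))

# toy check: with identity OTFs for every defocus and reg = 0, the Wiener
# transfer function is identically 1 and reconstruction returns the input
deltas = [-0.5, 0.0, 0.5]
otfs = {d: np.ones((16, 16), dtype=complex) for d in deltas}
H = wiener_transfer(otfs, deltas, mu=1.0, reg=0.0)
img = np.random.rand(16, 16)
out = reconstruct(img, H)
```

Setting `mu=0.0` reproduces the equal-weight mixed OTF of Eq. (13); passing only the δ=0 OTF reproduces the focused filter of Eq. (14).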

We compare the performance of the proposed mixed OTFs [Eq. (13)] versus two counterparts: (1) $\hat{H}_c$ defined as in Eq. (14), which corresponds to the focused broadband inverse imaging; (2) inverse imaging with the monochromatic PSFs defined in Eqs. (4) and (5), with λ taking the values corresponding to the RGB channels. This latter type of inverse imaging is common in many publications.

To distinguish between the estimates with the transfer functions of Eqs. (13) and (14), we term them the "broadband Wiener filter" (BBWF) and the "broadband focused filter" (BBFF), respectively.

The introduction of the mixed OTFs as in Eq. (10) is one of the main results of this paper. We show that the mixed Wiener OTFs allow us to significantly extend DoF and diminish color aberrations. Note that this mixed $\hat{H}_c$ depends on the weights ωδ, the parameter μ of the exponential weights, and the regularization parameter reg. Optimization of these parameters is crucial for this kind of imaging.

In addition, we compare the performance of the designed MPM versus the optical system with a simple lens, both equipped with inverse imaging using the BBWF and BBFF transfer functions Hc.

We use the peak signal-to-noise ratio (PSNR) as the criterion for evaluation of imaging accuracy. It is used in two versions: the chromatic PSNRc, $c\in(r,g,b)$, where PSNR is calculated for each color channel separately, and the "total" PSNR without the index c, where the accuracy is calculated for all three color channels jointly.

These calculations are performed for different $\delta\in D$ to evaluate the accuracy of imaging for varying defocus.

Let Θ be the full set of parameters to be optimized; then we calculate PSNR(Θ,δt). Thus, the optimization on Θ appears as a multiobjective problem with N loss functions PSNR(Θ,δt), $t=1,\ldots,N$:

Eq. (15)

$$\hat{\Theta}=\arg\max_{\Theta}\left[\mathrm{PSNR}(\Theta,\delta_1),\ldots,\mathrm{PSNR}(\Theta,\delta_N)\right],$$
where the PSNRs are calculated as mean values over the set of images, $k\in K$:

Eq. (16)

$$\mathrm{PSNR}(\Theta,\delta)=\operatorname*{mean}_{k\in K}\left[\mathrm{PSNR}\left(\hat{I}^{o,k}_{c,\delta},\,I^{o,k}_c\right)\right].$$
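The loss of Eq. (16) can be sketched as a plain PSNR averaged over a set of test images. The peak value of 1.0 assumes images normalized to [0, 1]; the toy images below are illustrative.

```python
import numpy as np

def psnr(estimate, truth, peak=1.0):
    """PSNR in dB between a reconstruction and the true image."""
    mse = np.mean((estimate - truth)**2)
    return 10.0 * np.log10(peak**2 / mse)

def mean_psnr(estimates, truths, peak=1.0):
    """Eq. (16): mean PSNR over the image set, k in K."""
    return np.mean([psnr(e, t, peak) for e, t in zip(estimates, truths)])

# toy check: an estimate offset from the truth by a constant 0.1
truth = np.zeros((8, 8))
est = truth + 0.1            # mse = 0.01, so PSNR = 20 dB
val = psnr(est, truth)
```

In the paper's experiments, this mean is taken over the 24 Kodak RGB test images for each defocus value δ, producing the PSNR(Θ,δ) curves discussed below.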

Multiobjective optimization problems are often solved using scalarization techniques, which means that the loss functions are aggregated (or reformulated as constraints), and then a constrained single-objective problem is solved.28

Here, we perform this multiobjective optimization by comparing the curves PSNR(Θ,δt), $t=1,\ldots,N$, for various Θ, trying to maximize each PSNR(Θ,δt).

We assume that this maximization is successful for δt if PSNR(Θ^,δt)>PSNRthresh, where PSNRthresh is a prescribed value guaranteeing high-quality imaging. In our calculations, PSNRthresh is equal to 25 dB.

The achieved length of DoF is defined as the length of D covered by successful $\mathrm{PSNR}(\hat{\Theta},\delta_t)$. Clearly, a solution of the problem in this formulation is not unique.

The optimization of the total PSNR is a topic of Sec. 3.2, where we demonstrate the improvement achieved for DoF.

The optimization of the spectral PSNRc is the topic of Sec. 3.3. We target the color aberration problem and demonstrate the improvements achieved in color imaging for the r, g, b image components. The optimization is performed over the spectral regularization parameters regc, $c\in(r,g,b)$. Somewhat surprisingly, the optimal values of regc found by this optimization are quite close to each other.

3.

Results

3.1.

Simulation Experiments and Parameters of Optical Systems

The design of inverse imaging with a flat compact MPM is the aim of this paper. The general features of the system modeling (excluding the PSF, discussed in Sec. 2.1) are shown in Fig. 2. As a counterpart, we consider a camera with a simple single refractive lens, which is convenient for modeling due to the small number of parameters characterizing its optics.

Fig. 2

The framework of image formation modeling, including the optical element (MPM), the sensor, the algorithms for mosaicing/demosaicing, and the calculation of loss functions; some optimization parameters and the RGB data set used are shown.


The optical systems with the lens and the MPM have the same configuration and the same geometrical and optical parameters: the diameter of the aperture PA is 6 mm; the distance between the aperture and the sensor plane is d2 = 10 mm; the distance between the object and the aperture varies, $d_1\in D=[0.5,1000]$ m; the focal distance of the lens is f = 10 mm; the in-focus point is then $d_1^0=1$ m.

The Bayer color sensor, 4096×4096 with 1.55-μm pixels, has spectral response functions defined according to the Sony IMX172 Bayer CFA. For RGB monochromatic imaging defined by the PSFs in Eq. (4), the wavelengths λ take the values [470; 525; 640] nm. For RGB broadband imaging, the PSFs are defined by Eq. (6) with $\lambda\in[400,700]$ nm. This wavelength interval is discretized into 31 bands of width 10 nm.

For demosaicing, we use the residual interpolation algorithm proposed as an alternative to the conventional color difference interpolation.29

Our goal is to compare three types of OTFs (PSFs) used for inverse imaging: (1) the proposed BBWF [Eq. (13)]; (2) the BBFF [Eq. (14)], focused on $d_1=d_1^0$ (defocus parameter δ=0); (3) the monochromatic focused filter (MFF), as defined by Eq. (4) and focused, as above, on $d_1=d_1^0$. The observations in all cases are obtained according to Eq. (8) with the broadband PSFs.

The MPM design follows the technique developed in Ref. 24. The design wavelength is fixed at λ0 = 525 nm; the absolute phase is calculated as the sum of the quadratic phase corresponding to the focal length (fλ0) plus a cubic summand of magnitude a3; this absolute phase is wrapped to the interval FR·[−π,π), where FR is the Fresnel number; after that, the wrapped phase is discretized to Nlevel = 30 levels. The parameters of this design, a3, FR, and Nlevel, jointly with μ and reg, are used for end-to-end optimization of the PSNRs.
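The three design steps above (quadratic-plus-cubic phase, wrapping, quantization) can be sketched as follows. All parameter values and the exact phase expressions are illustrative assumptions for demonstration, not the optimized values or the fabrication-level model of Ref. 24.

```python
import numpy as np

def mpm_phase(X, Y, f0, lam0, a3, FR, n_level):
    """Sketch of the MPM phase: quadratic (focusing) phase + cubic summand,
    wrapped to FR*[-pi, pi), then quantized to n_level discrete levels."""
    quad = -np.pi / (lam0 * f0) * (X**2 + Y**2)   # focusing phase at lam0
    cubic = a3 * (X**3 + Y**3)                    # cubic summand, magnitude a3
    absolute = quad + cubic
    span = 2.0 * np.pi * FR                       # width of FR*[-pi, pi)
    wrapped = (absolute + np.pi * FR) % span - np.pi * FR
    step = span / n_level                         # quantization step
    return np.round(wrapped / step) * step

# illustrative grid over a 6-mm aperture; a3 here is an arbitrary magnitude
x = np.linspace(-3e-3, 3e-3, 64)
X, Y = np.meshgrid(x, x)
phase = mpm_phase(X, Y, f0=10e-3, lam0=525e-9, a3=1e9, FR=16, n_level=30)
```

The wrapped-and-quantized phase stays within FR·[−π, π] and takes at most Nlevel + 1 distinct values, which is what makes the mask a multilevel (rather than continuous) element.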

As discussed above, we use maximization of the PSNRs and the introduced Wiener filtering (BBWF) to achieve improved DoF. It is demonstrated in what follows that this design automatically enables improved achromatic imaging. As the number of parameters to be optimized is not large, we use optimization on a grid for good initialization of the estimates, which are then refined by the fast MATLAB routine fminsearch. An example of an MPM obtained in this way, with a3 = 29, FR = 16, and Nlevel = 30.3, is shown in Fig. 3.

Fig. 3

A profile of the designed MPM: (a) view from the top and (b) the cross-section of the central part of the MPM. This MPM differs from the standard Fresnel lens due to the cubic component in the surface and the Fresnel number FR = 16.


3.2.

Extended DoF

Targeting the improvement in DoF, in this section we show the results obtained by optimization of the "total" PSNR, calculated without separation of the color image channels. As shown in Fig. 2, the main optimization parameters are μ and reg for the simple lens system, and a3, FR, Nlevel, μ, and reg for the lensless system with MPM. The final results, PSNR curves as functions of d1, are shown in Fig. 4.

Fig. 4

PSNR curves of the optimized lens and lensless setups as functions of the distance from the scene to the sensor plane (d1). Three types of PSFs (OTFs) are used for inverse imaging: (1) the proposed BBWF (μ=0); (2) the broadband focused PSF (BBFF, μ=∞), with the corresponding defocus parameter δ=0; and (3) the monochromatic focused PSF [MFF, Eq. (4)], focused on $d_1=d_1^0$, for the simple lens system. The optimized lensless setups (BBWF and BBFF) perform best, with PSNR values above the crucial 25-dB level (acceptable quality line). Imaging with the lens shows good results in the close vicinity of the focal point with BBFF, μ=∞, and over a large range of d1 with BBWF, μ=0. Overall, the best performance is demonstrated by the lensless system with Wiener filtering (BBWF), the top blue curve with circles.


Recall that d1 is the distance from the object to the aperture plane. The design focal distance is d1 = 1 m. The reported PSNRs for each d1 are averaged over the 24 Kodak RGB images. The best results are achieved by inverse imaging using the BBWF with μ=0 for the lensless setup (blue solid curve with circles). The performance is about 28 dB, with more or less uniform PSNR values for all defocus distances in the desired interval, $\delta\in D$.

The orange curve with squares shows the performance with the BBFF, again for the lensless case. This algorithm shows the best performance at the focal point d1 = 1 m, about 2 dB better than BBWF at this point. However, for most other d1 values, BBWF outperforms BBFF, demonstrating much more accurate results.

The three other curves in Fig. 4 are given for the simple lens system with different PSFs (OTFs) used for deblurring. This counterpart system demonstrates much lower accuracy compared with the lensless system with MPM. The worst results are shown by the inverse imaging with monochromatic PSFs (MFF, red curve with stars). Only the peak of this curve (at the focus point) rises above the crucial 25 dB. The performance degrades quickly as the defocus |δ| becomes larger.

It is also interesting to analyze the influence of the proposed inverse imaging with the Wiener filter (BBWF) on the accuracy achieved by the simple lens system. The best result is demonstrated using μ=0: the yellow curve with crosses. The PSNR values are about 25 dB for nearly the whole defocus interval. The broadband filter (BBFF) focused on d1 = 1 m shows an essential improvement in performance only in the area around this focal point, with a quite narrow DoF. The worst result is demonstrated by the lens system using the monochromatic focused PSF (MFF) for inverse imaging. Overall, we may conclude that both the broadband OTFs and the Wiener filtering are able to essentially improve the performance of the lens system.

The "parrot" images in Fig. 5 illustrate the visual performance of the compared systems and algorithms. Rows 1 and 2 of the figure show the results for the lens system; rows 3 and 4 are for the lensless setup with MPM. The columns are given for different d1. Comparing the images within each row, the best result is always in the middle, in-focus column. However, comparing row-by-row using all three images of each row simultaneously, we may note that overall the best result is given by the algorithm based on Wiener filtering of OTFs (BBWF), row 4, which is in complete agreement with Fig. 4 and the above discussion of that figure.

Fig. 5

Visual performance of the optimized lens and lensless optical setups with two different deblurring algorithms for comparison. The proposed BBWF [Eq. (13)] and the broadband PSF focused on $d_1=d_1^0$ [BBFF, Eq. (14)] are exploited for both setups. The columns are given for different distances d1. The first two rows are for the system with the lens and the second pair of rows for the lensless system. Comparing the results visually and by PSNRs, we may conclude that the best result is given by the lens system at the focal distance d1 = 1 m. However, for defocus situations, the performance of the lensless system is superior. Overall, the comparison corresponds to the conclusion following from the analysis of the PSNRs in Fig. 4: the best performance is achieved by the proposed MPM system with the Wiener inverse filtering (BBWF), the last row of the figure.


The longitudinal cross-sections of PSFs(x,y,d1) with respect to the distance d1 are commonly used to demonstrate, in particular, extended DoF effects (e.g., Refs. 1 and 24). Note that these focusing PSFs define the blurred images and the deblurring. In this paper, we analyze the extended DoF and achromatic effects based on the "PSF after deblurring."

Let us introduce this approach and the new PSFs. According to Eq. (12), the deblurred image in the Fourier domain is defined as $\hat{I}^{o,k}_{c,\delta}=\hat{H}_c\cdot I^{s,k}_{c,\delta}=(\hat{H}_c\cdot \mathrm{OTF}_{c,\delta})\cdot I^{o,k}_c$. Then, the deblurred image $\hat{I}^{o,k}_{c,\delta}(x,y)$ can be written as the convolution of the true image $I^{o,k}_c(x,y)$ with the PSF calculated as

Eq. (17)

$$\mathrm{PSF\mbox{-}AD}_{c,\delta}(x,y)=\mathrm{real}\left[\mathcal{F}^{-1}\left(\hat{H}_c\cdot \mathrm{OTF}_{c,\delta}\right)\right].$$
To distinguish it from the PSFs used for focusing, let us call the function defined by this equation the PSF-after-deblurring (PSF-AD). It is clear that the reconstructed image is a convolution of this PSF-AD with the true image. This convolution and this PSF-AD are much more informative for representing the degradation of the reconstructed image than what can be obtained using the conventional focusing PSF for these purposes.
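Eq. (17) can be sketched directly. The toy OTF below is illustrative; the sanity check uses the extreme case where the transfer function exactly inverts the OTF, so the PSF-AD collapses to a unit impulse (i.e., no residual degradation).

```python
import numpy as np

def psf_after_deblurring(H_hat, otf_delta):
    """Eq. (17): PSF-AD = real part of F^{-1}(H_hat * OTF_delta)."""
    return np.real(np.fft.ifft2(H_hat * otf_delta))

# toy check: if H_hat exactly inverts the OTF, the product is identically 1
# and the PSF-AD is a unit impulse at the origin
rng = np.random.default_rng(0)
otf = np.fft.fft2(rng.random((16, 16)))   # toy OTF (no zeros in practice)
H = 1.0 / otf                             # idealized inverse filter
psf_ad = psf_after_deblurring(H, otf)
```

For the regularized Wiener filter of Eq. (11), the PSF-AD is not an impulse; the flatter its longitudinal cross-section over δ, the more invariant the reconstruction quality is to defocus, which is exactly what Figs. 6 to 9 visualize.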

Figure 6 shows the longitudinal cross-sections of the PSF-ADs for the green band for four different cases. Figures 6(a) and 6(b) show the cross-sections for the lens system with monochromatic (MFF) and broadband (BBWF) inverse imaging, respectively. The PSF-AD for the monochromatic case is peaked and very narrow, which means that the DoF for this lens is very narrow. Using the broadband inverse imaging results in a much smoother cross-section, which indicates that the DoF is essentially wider. Figures 6(c) and 6(d) show the cross-sections for the lensless system with MPM and broadband inverse imaging. Panel (c) corresponds to the focused inverse imaging (BBFF) as proposed in Ref. 24, and panel (d) to the inverse imaging with Wiener filtering (BBWF). In the last case, we have a nearly flat cross-section, which indicates that the image quality is invariant with respect to varying d1. This explains the best quality of the lensless reconstructions with this type of inverse imaging.

Fig. 6

Longitudinal cross-sections of the PSF-ADs for the green components. Panels (a) and (b) are for monochromatic (MFF) and broadband Wiener (BBWF) inverse imaging in the simple lens system, respectively. Panels (c) and (d) are for imaging with broadband focused (BBFF) and broadband Wiener filtering (BBWF) in the lensless setup.


The advantage of the newly proposed algorithm is clear. These conclusions on the comparison of the inverse imaging methods are in complete agreement with those given in the discussion of the PSNR curves in Fig. 4.

A few notes on how the cross-sections in this section and in what follows are calculated. Conventionally, the cross-section of PSF(x,y) is a function of x for a fixed y, usually y=0. In this paper, the cross-section is defined as $\max_y[\mathrm{PSF\mbox{-}AD}_{c,\delta}(x,y)]$. Thus, the fixed value of y is replaced by maximization over y.

Let us comment on this definition of the cross-section. Conventionally, considering the PSF as a function of the three variables (x,y) and δ, a longitudinal cross-section is calculated as a 2D function obtained from the PSF for y=0 (or x=0) and varying δ. This definition of the cross-section works perfectly provided that the PSF is symmetric with a peak value at (x=0, y=0). In our design of the MPM, the cubic component in the absolute phase makes the PSF asymmetric and shifted from (x=0, y=0). The PSF-AD calculated from the PSFs is also asymmetric and shifted. In our definition of the longitudinal cross-section, this unknown shift is compensated, and the proper peak values of the PSF-AD are found by maximization over y.
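The shift-compensating cross-section above reduces to a maximum over the y axis of the PSF-AD stack. A minimal sketch, with toy impulse PSFs as an assumption:

```python
import numpy as np

def longitudinal_cross_section(psf_ad_stack):
    """psf_ad_stack: array (n_delta, H, W), H indexing y.
    Returns (n_delta, W): entry [i, x] = max over y of PSF-AD_i(x, y),
    which compensates an unknown lateral shift in y."""
    return psf_ad_stack.max(axis=1)

# toy check: the same peak shifted along y yields an identical cross-section,
# whereas a fixed-y slice (the conventional definition) would miss it
a = np.zeros((1, 5, 5)); a[0, 1, 2] = 1.0   # peak at y=1, x=2
b = np.zeros((1, 5, 5)); b[0, 3, 2] = 1.0   # same peak shifted to y=3
ca = longitudinal_cross_section(a)
cb = longitudinal_cross_section(b)
```

This is why the asymmetric, laterally shifted PSF-ADs produced by the cubic phase component still yield meaningful longitudinal cross-sections in Figs. 6 and 9.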

3.3.

Achromatic Imaging

It was shown in the previous section that the introduced algorithm with BBWF enables extended DoF because its PSF-ADs are insensitive to variations in the defocus distance d1. This leads to a natural hypothesis that achromatic imaging could be enabled by this algorithm. In this section, we illustrate and confirm this hypothesis. We show that the broadband PSFs and the introduced Wiener filtering of OTFs are efficient instruments to diminish chromatic aberrations and thereby improve imaging. To demonstrate these effects, we go from the calculation of the total PSNR, where the RGB channels are mixed together, to the calculation of the separate channel color PSNRc, $c\in(r,g,b)$. In Fig. 7, we show PSNRc as a function of d1 calculated for the lensless MPM camera. The colors of the curves indicate the corresponding RGB channels, and these PSNRs show the accuracy for the corresponding color components of the images. The legend specifies the algorithm used for each curve. For comparison, we also show the "total" PSNRs, the black curves in Fig. 7, as they appear in Fig. 4. We compare the performance of the two broadband inverse imaging algorithms: (1) BBWF of OTFs [Eq. (13), μ=0] and (2) broadband focused (BBFF) for d1 = 1 m [Eq. (14), μ=∞]. Both algorithms demonstrate good performance, with a clear advantage of the Wiener filtering similar to that discussed for Fig. 4. The curves for PSNRc mainly follow the behavior of the "total" PSNR (black solid lines).

Fig. 7

Spectral performance of the optimized lensless MPM system as a function of d1. For comparison, the performance of two different inverse imaging approaches is presented: (1) broadband inverse imaging with the proposed Wiener filtering of OTFs (BBWF, μ=0) and (2) broadband focused (BBFF) for d1 = 1 m (μ=∞). All curves are located above the 25-dB acceptable quality line. Moreover, the overall advantage of the Wiener filtering broadband image reconstruction is clear.


Thus, in Fig. 7 we have two groups of curves for the spectral PSNRc, c = r, g, b. The first group is obtained using the inverse imaging with BBWF OTFs. These curves do not have sharp peaks, are quite smooth as functions of d1, and demonstrate more or less similar high accuracy for the varying level of defocus defined by d1. The best result is achieved for the green channel; the red and blue channels show smaller PSNRs, with values close to each other for all d1. The second group of curves has a strong peak at the focal point d1 = 1 m, naturally corresponding to the inverse imaging with the broadband OTF (BBFF) focused on d1 = 1 m. The best accuracy is again demonstrated for the green channel. Comparing these two groups of image reconstruction algorithms, we note that overall the best performance is demonstrated by the algorithm with the Wiener filtered OTF (BBWF); the algorithm with the focused OTF (BBFF) has an advantage only in a narrow area around the focal point d1 = 1 m. The PSNR curves for both methods are located above the 25-dB acceptable quality line.

Going further, we consider the cross-sections of the PSF-AD for the three RGB channels to make clear the source of this good spectral performance of the algorithm. In Fig. 8, we show the cross-sections of the PSF-AD for the RGB channels and three different d1 (0.5, 1, and 1000 m). These curves are best consolidated for the proposed inverse imaging with BBWF, which supports the conclusion that in this case the quality of imaging is more or less the same for all color channels and distances.

Fig. 8

Cross-sections of the PSF-ADs for the three RGB channels and d1 = [0.5; 1; 1000] m. The curves are shown for the simple lens system: (a) monochromatic (MFF), (b) broadband focused (BBFF), (c) BBWF; and for the lensless system with (d) broadband focused (BBFF) and (e) BBWF inverse imaging. In case (e), the curves are tightly consolidated with each other, which indicates that the PSF-ADs are insensitive to variations in both the color and the distance d1.


In Fig. 9, we provide the longitudinal cross-sections of the PSF-AD for the RGB channels. These cross-sections are quite similar to each other for the inverse imaging using the proposed Wiener filtering of defocused OTFs (BBWF), and, in addition, they have a nearly uniform distribution with respect to d1. These are the results shown in the second row of Fig. 9. In the first row, we show the corresponding results for the inverse imaging with BBFF. The advantage of the proposed algorithm with Wiener filtering of OTFs is obvious.

Fig. 9

Longitudinal cross-sections of the PSF-ADs for the RGB channels of (a)–(c) the broadband focused (BBFF) and (d)–(f) the proposed Wiener filtering deblurring (BBWF) in the lensless system with the optimized MPM, as functions of the scene-to-sensor distance. In the first row, the RGB PSF-ADs are not flat for different d1. With the proposed inverse imaging (BBWF, second row), the PSF-ADs are smoothed and nearly ideal for achromatic imaging.


Special tests have been performed to evaluate the degradation of color imaging due to spectral sampling by the Bayer CFA in the sensor and interpolation in the demosaicing procedure.29 These effects are completely eliminated if there are three separate color sensors (red, green, and blue) and the RGB inverse imaging is separate for each channel. Comparing the results obtained in this setup with those obtained with demosaicing, we evaluated the imaging degradation as 1 to 2 dB in PSNR, depending on the image.

4.

Conclusion

An approach is developed for inverse RGB imaging with extended DoF and diminished color aberrations. The OTF for inverse imaging is derived as the Wiener filter of broadband defocused OTFs. We show that the accuracy of imaging essentially depends on the regularization parameter of the OTF and on the weights in the criterion used for the Wiener filter derivation. We study the performance of the proposed imaging for two optical setups: with a lens and without a lens (lensless with MPM). The latter system with MPM demonstrates the best performance. The designed systems are subject to end-to-end optimization with respect to the parameters of the MPM design as well as the parameters of the criterion used for derivation of the Wiener filter. As further work, we plan to develop algorithms for adaptive evaluation of optimal regularization parameters for BBWF imaging.

5.

Appendix A: Proof of Statement 1

The optimal transfer function for inverse imaging is a solution to the problem Eq. (9):

Eq. (18)

$$\hat{H}_c = \arg\min_{H_c} \sum_{k}\sum_{\delta \in D} \omega_\delta \left\| I^{o}_{c,k} - H_c \cdot I^{s}_{c,\delta,k} \right\|_2^2.$$

In this minimization over the complex-valued $H_c$, the necessary minimum condition is $\frac{\partial}{\partial H_c^{*}} \sum_{k}\sum_{\delta \in D} \omega_\delta \left\| I^{o}_{c,k} - H_c \cdot I^{s}_{c,\delta,k} \right\|_2^2 = 0$, which leads to the equation

Eq. (19)

$$\sum_{k}\sum_{\delta \in D} \omega_\delta \left( I^{o}_{c,k} - H_c \cdot I^{s}_{c,\delta,k} \right) \cdot \left( I^{s}_{c,\delta,k} \right)^{*} = 0.$$

Referring to Eq. (8), we insert $I^{s}_{c,\delta,k}(f_X, f_Y) = \mathrm{OTF}_{c,\delta}(f_X, f_Y) \cdot I^{o}_{c,k}(f_X, f_Y)$ into this equation and obtain

Eq. (20)

$$\sum_{\delta \in D} \omega_\delta \left( \mathrm{OTF}_{c,\delta}^{*} - H_c \cdot |\mathrm{OTF}_{c,\delta}|^{2} \right) \cdot \sum_{k} |I^{o}_{c,k}|^{2} = 0.$$

The solution Eq. (10) follows from this equation provided that $\sum_{k} |I^{o}_{c,k}|^{2} > 0$.
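The closed-form solution implied by Eq. (20) can be sketched numerically as follows. This is an illustrative implementation, not the authors' code: given a stack of defocused OTFs and the weights $\omega_\delta$, it forms the broadband Wiener-type transfer function $H_c = \sum_\delta \omega_\delta \mathrm{OTF}_{c,\delta}^{*} / \left(\sum_\delta \omega_\delta |\mathrm{OTF}_{c,\delta}|^2 + \mathrm{reg}\right)$, where the small additive term `reg` is an assumption here, standing in for the regularization parameter discussed in the conclusion, guarding against division by near-zero denominators.

```python
import numpy as np

def broadband_wiener_filter(otfs, weights, reg=1e-3):
    """Combine defocused OTFs into a single Wiener-type inverse transfer function.

    otfs    : complex array of shape (n_defocus, H, W), one OTF_{c,delta} per defocus
    weights : array of shape (n_defocus,), the omega_delta weights of the criterion
    reg     : small regularization term (an assumption, not from the derivation)
    """
    otfs = np.asarray(otfs)
    w = np.asarray(weights, dtype=float).reshape(-1, 1, 1)
    num = np.sum(w * np.conj(otfs), axis=0)           # sum_delta w * OTF^*
    den = np.sum(w * np.abs(otfs) ** 2, axis=0) + reg  # sum_delta w * |OTF|^2 + reg
    return num / den

def apply_inverse(blurred, H):
    """Deblur one color channel by filtering with H in the frequency domain."""
    return np.real(np.fft.ifft2(np.fft.fft2(blurred) * H))
```

With a single defocus and `reg = 0`, the filter reduces to the plain inverse $1/\mathrm{OTF}$; the weights $\omega_\delta$ let the design trade off accuracy across the targeted d1 range.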

Acknowledgments

This research was supported by Jane and Aatos Erkko Foundation, Finland, and Finland Centennial Foundation: Computational Imaging without Lens (CIWIL) project.

References

1. 

V. Sitzmann et al., “End-to-end optimization of optics and image processing for achromatic extended depth of field and super-resolution imaging,” ACM Trans. Graphics, 37 (4), 1 –13 (2018). https://doi.org/10.1145/3197517.3201333 ATGRDF 0730-0301 Google Scholar

2. 

H. Haim, A. Bronstein and E. Marom, “Computational multi-focus imaging combining sparse model with color dependent phase mask,” Opt. Express, 23 24547 –24556 (2015). https://doi.org/10.1364/OE.23.024547 Google Scholar

3. 

S. R. M. Rostami et al., “OpenACC GPU implementation of double-stage delay-multiply-and-sum algorithm: toward enhanced real-time linear-array photoacoustic tomography,” Proc. SPIE, 10878 108785C (2019). https://doi.org/10.1117/12.2511115 PSISDG 0277-786X Google Scholar

4. 

S. R. M. Rostami and M. Ghaffari-Miab, “Finite difference generated transient potentials of open-layered media by parallel computing using OpenMP, MPI, OpenACC, and CUDA,” IEEE Trans. Antennas Propag., 67 (10), 6541 –6550 (2019). https://doi.org/10.1109/TAP.2019.2920253 IETPAK 0018-926X Google Scholar

5. 

S. R. M. Rostami and M. Ghaffari-Miab, “Fast computation of finite difference generated time-domain Green’s functions of layered media using OpenACC on graphics processors,” in Iranian Conf. Electr. Eng. (ICEE), 1596 –1599 (2017). https://doi.org/10.1109/IranianCEE.2017.7985300 Google Scholar

6. 

S. R. M. Rostami et al., “GPU-accelerated double-stage delay-multiply-and-sum algorithm for fast photoacoustic tomography using LED excitation and linear arrays,” Ultrasonic Imaging, 41 (5), 301 –316 (2019). https://doi.org/10.1177/0161734619862488 ULIMD4 0161-7346 Google Scholar

7. 

M. Liao et al., “Extending the depth-of-field of imaging systems with a scattering diffuser,” Sci. Rep., 9 (1), 7165 (2019). https://doi.org/10.1038/s41598-019-43593-w SRCEC3 2045-2322 Google Scholar

8. 

O. Cossairt, C. Zhou and S. Nayar, “Diffusion coded photography for extended depth of field,” ACM Trans. Graphics, 29 1 –10 (2010). https://doi.org/10.1145/1778765.1778768 Google Scholar

9. 

E. E. García-Guerrero et al., “Design and fabrication of random phase diffusers for extending the depth of focus,” Opt. Express, 15 (3), 910 –923 (2007). https://doi.org/10.1364/OE.15.000910 OPEXFF 1094-4087 Google Scholar

10. 

Z. Cai et al., “Lensless light-field imaging through diffuser encoding,” Light: Sci. Appl., 9 (1), 1 –9 (2020). https://doi.org/10.1038/s41377-020-00380-x Google Scholar

11. 

S. Banerji et al., “Extreme-depth-of-focus imaging with a flat lens,” Optica, 7 (3), 214 –217 (2020). https://doi.org/10.1364/OPTICA.384164 Google Scholar

12. 

S. Banerji and B. Sensale-Rodriguez, “Inverse designed achromatic flat lens operating in the ultraviolet,” OSA Continuum, 3 (7), 1917 –1929 (2020). https://doi.org/10.1364/OSAC.395767 Google Scholar

13. 

S. Banerji et al., “Imaging with flat optics: metalenses or diffractive lenses?,” Optica, 6 (6), 805 –810 (2019). https://doi.org/10.1364/OPTICA.6.000805 Google Scholar

14. 

M. Meem et al., “Imaging from the visible to the longwave infrared wavelengths via an inverse-designed flat lens,” (2020). Google Scholar

15. 

Y. Peng et al., “The diffractive achromat: full spectrum computational imaging with diffractive optics,” ACM Trans. Graphics, 35 1 –11 (2016). https://doi.org/10.1145/2897824.2925941 ATGRDF 0730-0301 Google Scholar

16. 

N. Mohammad et al., “Broadband imaging with one planar diffractive lens,” Sci. Rep., 8 (1), 1 –6 (2018). https://doi.org/10.1038/s41598-018-21169-4 SRCEC3 2045-2322 Google Scholar

17. 

U. Akpinar, E. Sahin and A. Gotchev, “Learning optimal phase-coded aperture for depth of field extension,” in IEEE Int. Conf. Image Process. (ICIP), 4315 –4319 (2019). https://doi.org/10.1109/ICIP.2019.8803419 Google Scholar

18. 

E. González-Amador et al., “Jacobi–Fourier phase mask for wavefront coding,” Opt. Lasers Eng., 126 105880 (2020). https://doi.org/10.1016/j.optlaseng.2019.105880 Google Scholar

19. 

H. Haim, A. Bronstein and E. Marom, “Multi-focus imaging using optical phase mask,” in Comput. Opt. Sens. and Imaging, CTh2C-6 (2014). Google Scholar

20. 

A. Fontbonne et al., “Experimental validation of hybrid optical–digital imaging system for extended depth-of-field based on co-optimized binary phase masks,” Opt. Eng., 58 (11), 113107 (2019). https://doi.org/10.1117/1.OE.58.11.113107 Google Scholar

21. 

J. Shin et al., “A minimally invasive lens-free computational microendoscope,” Sci. Adv., 5 (12), eaaw5595 (2019). https://doi.org/10.1126/sciadv.aaw5595 STAMCV 1468-6996 Google Scholar

22. 

M. Ponomarenko, V. Katkovnik and K. Egiazarian, “Phase masks optimization for broadband diffractive imaging,” Electron. Imaging, 2019 (11), 258-1 –258-6 (2019). https://doi.org/10.2352/ISSN.2470-1173.2019.11.IPAS-258 ELIMEX Google Scholar

23. 

V. Katkovnik, M. Ponomarenko and K. Egiazarian, “Optimization of hybrid optics with multilevel phase mask for improved depth of focus broadband imaging,” in 7th Eur. Workshop Vis. Inf. Process. (EUVIP), 1 –6 (2018). https://doi.org/10.1109/EUVIP.2018.8611767 Google Scholar

24. 

V. Katkovnik, M. Ponomarenko and K. Egiazarian, “Lensless broadband diffractive imaging with improved depth of focus: wavefront modulation by multilevel phase masks,” J. Mod. Opt., 66 (3), 335 –352 (2019). https://doi.org/10.1080/09500340.2018.1526344 JMOPEW 0950-0340 Google Scholar

25. 

Y. Hua et al., “Sweepcam—depth-aware lensless imaging using programmable masks,” IEEE Trans. Pattern Anal. Mach. Intell., 42 (7), 1606 –1617 (2020). https://doi.org/10.1109/TPAMI.2020.2986784 ITPIDJ 0162-8828 Google Scholar

26. 

S.-H. Baek et al., “End-to-end hyperspectral-depth imaging with learned diffractive optics,” (2020). Google Scholar

27. 

J. W. Goodman, Introduction to Fourier Optics, Roberts and Company Publishers (2005). Google Scholar

28. 

M. T. Emmerich and A. H. Deutz, “A tutorial on multiobjective optimization: fundamentals and evolutionary methods,” Nat. Comput., 17 (3), 585 –609 (2018). https://doi.org/10.1007/s11047-018-9685-y NCAOAV 1567-7818 Google Scholar

29. 

D. Kiku et al., “Beyond color difference: residual interpolation for color image demosaicking,” IEEE Trans. Image Process., 25 (3), 1288 –1300 (2016). https://doi.org/10.1109/TIP.2016.2518082 IIPRE4 1057-7149 Google Scholar

Biography

Seyyed R. M. Rostami received his BSc degree in electronics engineering from Babol Noshirvani University of Technology, Babol, Iran, in 2015, and his MSc degree in electrical communication engineering from Tarbiat Modares University, Tehran, Iran, in 2017. He is currently a PhD candidate at Tampere University, Finland. He has published six journal and conference papers. His current research interests include image processing, optimization, parallel computing, signal processing, computational optics, and lensless imaging.

Vladimir Katkovnik received his PhD and DSc degrees in technical cybernetics from Leningrad Polytechnic Institute (LPI) in 1964 and 1974, respectively. From 1964 to 1991, he was an associate professor and then a professor in the Department of Mechanics and Control Processes, LPI. Since 2003, he has been with the Department of Signal Processing, Tampere University of Technology (TUT), Finland. He has published six books and over 350 refereed journal and conference papers. His research interests include stochastic image/signal processing, nonparametric estimation, computational imaging, and computational phase imaging.

Karen Egiazarian received his MSc degree from Yerevan State University in 1981, his PhD from Moscow State University, Russia, in 1986, and his DTech degree from TUT, Finland, in 1994. He is a professor leading the Computational Imaging Group, ICT faculty, Tampere University. He has authored about 650 refereed journal and conference papers. His research interests include computational imaging, sparse coding, and image and video restoration. He serves as an associate editor for the IEEE Transactions on Image Processing and is the editor-in-chief of the Journal of Electronic Imaging.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
SeyyedReza MiriRostami, Vladimir Y. Katkovnik, and Karen O. Eguiazarian "Extended DoF and achromatic inverse imaging for lens and lensless MPM camera based on Wiener filtering of defocused OTFs," Optical Engineering 60(5), 051204 (5 January 2021). https://doi.org/10.1117/1.OE.60.5.051204
Received: 30 September 2020; Accepted: 24 November 2020; Published: 5 January 2021