1. Introduction

Optical imaging is one of the most important tools in the biomedical sciences, dating from the invention of the optical microscope. From an optical point of view, biological tissues are described by absorption and scattering coefficients, which depend on the illumination wavelength.1–6 Many attempts have been made to retrieve the shape of objects hidden behind or within a medium that scatters light.7–10 In strongly scattering media such as soft tissue, the incoming wavefront can propagate only a short distance, about a few millimeters, without being significantly scattered.4 Therefore, image quality degrades as we attempt to see deeper into the tissue.3,8 Many approaches to overcome this fundamental, yet practical, problem have been put forward over the years; their common theme is to deconvolve the effect of the scattering medium in order to retrieve a sharp image of the object behind it. Among the suggestions were adaptive optics techniques that correct low-order aberrations using deformable mirrors or spatial light modulators. Such approaches require a bright point source (also known as a “guide star”) and supporting hardware for wavefront sensing.11 More recent approaches have used controlled wavefront shaping,12 which allows focusing and imaging through highly scattering samples.13 However, these techniques require either initial invasive access to both sides of the scattering medium14 or a known object.9 Other approaches, based on classical techniques from astronomy, have been suggested recently. These techniques employ speckle correlations of the scattering medium8,10 and do not require a guide star. However, their field of view and the number of emitters limit their usage in dark-field imaging scenarios, and they may require fluorescent tagging.
In addition, other methods attempt to use averaging of speckle fields obtained from several projections of the object15 or from multiple recorded holograms (which require coherent-wave interferometry).16–19 Such techniques require specialized equipment and a well-trained practitioner, and some of them are costly or demand substantial scanning time and extreme stabilization constraints. In this report, we propose a visible-light computational imaging technique to reveal macroscopic structures of dense tissue, such as bones embedded inside soft tissue. Such a technique may be applied to various medical applications, which will be discussed later. The proposed apparatus is shown in Fig. 1. We base our technique on the fact that information about the tissue’s properties can be more accurately inferred from its spatial phase distribution.20,21 The spatial phase distribution determines the scattering coefficient μs, which is related to the intensity attenuation due to light–tissue interaction, and the anisotropy factor g (the average cosine of the scattering angle). Often, the reduced scattering coefficient μs′ = μs(1 − g) is considered. Different tissue compositions produce different scattering properties, according to the variance of their spatial phase distribution. In the proposed technique, we reconstruct “macroscopic” phase structures by acquiring a wide-field-of-view set of defocused images (a through-focus stack) and using them as the input to a multiplane iterative phase retrieval technique. The choice of defocusing planes is specific to the target we wish to reconstruct. The focus planes are chosen according to the object’s features by the following method: one starts by focusing on the tissue surface, and then changes the focus plane (e.g., by displacing the camera along the imaging axis) until high-frequency elements that are not relevant for the test, such as evidence of the soft-tissue structure or small veins, are no longer observable.
This plane will be the first focus plane. Then, slightly change the focus plane until some differences between the observed image and the previous one appear. The differences between the images should be related to the amount of blurring and to the details of the object’s shadow. The same procedure can be used for additional imaging planes. When there is not enough prior knowledge about the object, one can start with the first image focused on the top layer and remove nonsignificant details later by digital postprocessing. For example, high-frequency details can be removed by a low-pass filter; other details, such as blood vessels, can be removed using compressive sensing. Using prior knowledge limits this technique to what one expects to observe in the reconstruction and may prevent serendipitous findings. One should note that we use neither a guide star nor exact knowledge of the object and the soft tissue, since we do not wish to deconvolve the effect of the front scattering layer S2 (Fig. 1). Also, we do not attempt to reconstruct exactly the spatial phase distribution of the different tissue components. Such information may be obtained using other techniques, such as interferometric detection, but requires long scanning times and a more complex setup that is more sensitive to thermal and mechanical instabilities. Using the proposed technique, we demonstrate the reconstruction of the bone location through its spatial phase distribution, intentionally promoting solutions with and without the extremely high frequency features associated with the high phase variance of the bones and surrounding tissues. In order to determine the phase from an intensity image, we employ a phase retrieval technique.
We use the iterative approach of Gerchberg and Saxton,22 with no support, as presented by Misell.23–25 We follow more recent work by Gur and Zalevsky,26 which showed that using more than two acquired images can improve the results and overcome a perturbing medium such as tissue. The latter was also verified experimentally on a high-contrast, low-frequency object.27 In this manuscript, we report simulation and experimental results showing how this technique can be used to detect bone location when the bones are within the soft tissue, without any free space in between (as tested in previous work27). In this case, one cannot simply backpropagate the signal to achieve deblurring. We use both the phase and the amplitude images of the reconstructed signal in order to reveal the diffraction pattern. Once the diffraction pattern is revealed, one can calculate the original structure of the sample.

2. Algorithm

The computational imaging technique is shown in Fig. 2.

2.1. Input n-Images

The algorithm inputs are n images. Each image is acquired at a different axial distance from the object, so that each has a different defocus plane. To reduce noise, a long exposure is required; this can also be achieved by acquiring several images at each distance and averaging over time. An additional purpose of the long exposure is to overcome speckle noise. There are various techniques to overcome speckle noise.
The obvious way to reduce speckle noise is digital image processing.28–31 Other techniques use image diversity, such as polarization,32 sensor displacement,33 or a rotating ground glass.34 In the case of a dynamic perturbing medium, one can use variation over time for speckle-noise reduction.17 When the sample is only quasidynamic (or quasistatic) and does not provide enough light decorrelation, additional support is required in the form of a time-varying diffuser.18 In the experimental system we used, the speckle pattern changed in time due to laser and apparatus instability, Brownian motion, the dynamics of the tissue fluids, and more; the speckle pattern was only quasidynamic. In order to increase its dynamism, we added to the system, just before the sample, a simple diffuser that was not static and improved the time variation; thus, by allowing a long exposure time, it was possible to reduce the pattern to an acceptable noise level.17,18 The choice of time averaging should be set according to the system. In this experiment, the overall exposure time was about 1 s taken over 30 frames, which produced an average variability of about 2% for any additional frame added to the average image (less than the dark noise).

2.2. Averaging and Cropping

Each acquired defocused image is obtained by averaging several frames. This step is necessary not only for speckle reduction but also because we wish to keep the amount of radiation (light intensity) at safe levels while reducing the background and detection noise. The images are then cropped in order to process the region of interest. At this stage, it is also recommended to fix the spatial alignment. Multiplane phase retrieval algorithms are sensitive to lateral displacement of the detector during the axial scanning stage. Therefore, the images are digitally aligned using the maximal correlation between two adjacent images.
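The alignment step above (maximal correlation between adjacent defocus images) can be sketched in Python/NumPy; the original processing was done in MATLAB, so the function below is an illustrative equivalent for integer-pixel shifts, not the authors’ code:

```python
import numpy as np

def align_by_correlation(ref, img):
    """Estimate the integer-pixel lateral shift maximizing the
    cross-correlation between two adjacent defocus images, then roll
    `img` so it overlays `ref`.  FFT-based correlation; wrap-around
    effects are assumed negligible for small displacements."""
    corr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(img)))
    dy, dx = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
    # Map wrap-around peak indices to signed shifts
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return np.roll(img, (dy, dx), axis=(0, 1)), (dy, dx)
```

In practice the camera displacement between planes is nearly axial, so the residual lateral shift is a few pixels and integer alignment suffices.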
2.3. Image Denoising

Each averaged image is denoised using total-variation denoising.35 Total-variation denoising suppresses background and scattering noise in the image while preserving the edges and fine structures. We have also tested other denoising techniques, such as a low-pass filter, which worked well and will be demonstrated in the simulation section.

2.4. Iterative Multiplane Phase Retrieval

The multiplane iterative phase retrieval algorithm is applied to a set of axially defocused images.26,27 Typically, we used three axial displacements with a fixed distance between adjacent planes, and 5000 iterations were used to achieve convergence. We also demonstrate that as few as two images generate satisfactory results (see Secs. 5.3 and 6). Due to the large number of iterations, the algorithm converges easily; therefore, an initial guess of zero phase was chosen arbitrarily.

2.5. Enhancement

In order to enhance the results, three simple steps are taken. First, we apply phase unwrapping36 to correct the radian phase angles by adding multiples of 2π where the absolute gaps between consecutive elements exceed π. Second, the unwrapped phase image is divided by the amplitude image, where a small number (0.01) is added to the normalized amplitude image to avoid division by zero. This step enhances the diffraction pattern around the bone area. The small value is added to the whole image and does not affect the image modulation; its value is chosen according to the noise level of the dark areas in the images, which is 3 to 4 gray levels out of 256. Third, a standard linear contrast enhancement is applied for better visibility. This last step may not be recommended if further processing is intended.

3. Image Formation and Reconstruction of a Dense Tissue Object Obscured by Two Layers of Soft Tissue

Let us review the optical wave propagation through the tissues using Fig. 1.
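As a concrete sketch of the algorithm in Secs. 2.4–2.5, the following Python/NumPy code implements a Misell-type multiplane iteration. The paper’s implementation is in MATLAB; the angular-spectrum propagator convention and all parameters here are illustrative, not the authors’ exact choices:

```python
import numpy as np

def propagate(field, dz, wavelength, dx):
    """Angular-spectrum free-space propagation over distance dz
    (one common realization of the Fresnel step; sign convention
    is illustrative)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    H = np.exp(-1j * np.pi * wavelength * dz * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

def multiplane_retrieve(mags, dists, wavelength, dx, iters=100):
    """Misell-type multiplane phase retrieval: starting from a zero
    phase at the first plane, cycle through the measured magnitude
    images, propagating between planes and replacing the magnitude
    at each plane while keeping the evolving phase estimate."""
    field = mags[0].astype(complex)          # zero initial phase
    for _ in range(iters):
        for k in range(1, len(mags)):        # forward through the stack
            field = propagate(field, dists[k] - dists[k - 1], wavelength, dx)
            field = mags[k] * np.exp(1j * np.angle(field))
        # back to the first plane (negative dz backpropagates)
        field = propagate(field, dists[0] - dists[-1], wavelength, dx)
        field = mags[0] * np.exp(1j * np.angle(field))
    return field
```

The enhancement of Sec. 2.5 would then unwrap `np.angle(field)` and divide it by `np.abs(field) + 0.01`.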
The monochromatic, uniform, coherent illumination propagates in free space to the first soft-tissue plane, where it meets the first soft-tissue layer S1. We treat S1 as a weakly scattering medium with minimal absorption. Propagation through S1 can be described, using the Kolmogorov approximation,37 by free-space propagation within a medium of width Δ_S1/N, followed by a random diffusive plane with a random phase φ_n, repeated N times:

U_S1{·} = [exp(iφ_N) F_{Δ_S1/N}] ⋯ [exp(iφ_1) F_{Δ_S1/N}]{·},   (1)

where Δ_S1 is the width of the soft tissue S1, φ_n is a uniformly random phase (different for every layer), and F_z denotes free-space propagation over a distance z, described by the Fresnel equation. The field at the bones plane, just before the bones, takes the form

U_b(x, y) = U_S1{U_0(x, y)},   (2)

where U_0 is the illumination field and Δ_S1 is the first soft-tissue width. The propagating wavefront described in Eq. (2) reaches the dense-tissue object structure (the “bones”). Part of the wavefront is highly scattered by the bone structure,5 while other parts of the wavefront continue to propagate with much less scattering, or relatively unscattered. Approximately, the bones impose a two-level transfer function T(x, y):

T(x, y) = t_b inside the bone region, and T(x, y) = 1 elsewhere,   (3)

where t_b represents the transmission coefficient of the bones. Assuming a thin soft tissue, only a minimal part of the light meets wavelength-size (and smaller) elements in the tissue, which produce Rayleigh scattering, and only a small amount of this scattered light arrives at the sensor. We assume that the same applies to the other scattering processes in the tissues. The second tissue is similar to the first one, and the complex field after the second tissue is described by means of an operator acting on T·U_b:

U_out(x, y) = U_S2{T(x, y) U_b(x, y)},   (4)

where Δ_S2 is the second soft-tissue width and U_out is the field after the second soft-tissue layer, which now propagates in free space to the imaging plane. U_b, the field that meets the bones, has a speckle pattern; T·U_b is a speckle pattern modulated by the two-level (binary) bone pattern. We assume that light either passes through the soft tissue or is blocked when it meets the bones.
In this case, the field after the bones, T·U_b (the speckle field U_b modulated by the binary bone pattern T), will have the form of a speckle pattern far from the bones, solid black (or a very dark speckle) in the bone area far from the bone edges, and a diffraction pattern with speckle around the edges. As the signal continues to propagate in free space, the diffraction pattern expands and appears as blur. If the speckle pattern changes in time, it is possible to reduce the speckle by averaging over time. An image captured as part of our image stack at defocus distance z_k is then given by

I_k(x, y) = |F_{z_k}{U_out}(x/M, y/M)|²,   (5)

where F_{z_k} denotes free-space propagation over the defocus distance z_k, U_out is the field after the second soft-tissue layer, and M is the camera magnification factor.

4. Simulations

4.1. Simulation of the Suggested Setup

We present simulation results to demonstrate the above description of the suggested setup, as shown in Fig. 1. We use three images, calculated at three different focusing planes, all averaged over time, and then apply the phase retrieval algorithm to calculate the phase, without knowing the layer widths. We want to show how phase retrieval regenerates a diffraction pattern around the structure edges, which can be used to find the bone edges. The simulation is implemented in MATLAB. A uniform matrix, representing illumination with 532-nm green light (as used in the experiment), is propagated through 30 mm of soft tissue using the Kolmogorov approximation and then multiplied by the bone transfer pattern [Fig. 3(a)]. The dark lines representing the bones are 3 mm wide; the bright line, which is the gap between the bones, is 3 mm wide as well. The signal then passes through an additional soft tissue with parameters similar to the first one. We add Gaussian noise with mean 0 and variance 0.1. The signal continues to propagate in free space, using the numerical Fresnel propagation integral, for 8, 10, and 12 mm. We repeat the process 500 times and average over time [Figs. 3(b)–3(d)]. The image profiles, created by averaging along one axis, are presented in red on each image.
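The layered soft-tissue model used in this simulation (free-space propagation interleaved with random phase screens, following the Kolmogorov approximation) can be sketched as follows; the slab count, screen statistics, and bone transmission values are illustrative, not the paper’s exact parameters:

```python
import numpy as np

def fresnel(field, dz, wavelength, dx):
    # Angular-spectrum free-space step (sketch; sign convention illustrative)
    fx = np.fft.fftfreq(field.shape[0], d=dx)
    FX, FY = np.meshgrid(fx, fx)
    H = np.exp(-1j * np.pi * wavelength * dz * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

def through_tissue(field, width, n_layers, wavelength, dx, rng):
    """Soft-tissue slab of total width `width`, modeled as n_layers
    sub-slabs: propagate width/n_layers of free space, then apply a
    uniformly random phase screen (a different screen per layer)."""
    for _ in range(n_layers):
        field = fresnel(field, width / n_layers, wavelength, dx)
        field = field * np.exp(1j * 2 * np.pi * rng.random(field.shape))
    return field
```

The bone plane is then a two-level transmission mask, e.g. `field * np.where(bone_mask, t_bone, 1.0)` with a hypothetical `bone_mask` and coefficient `t_bone`, followed by a second `through_tissue` call for the S2 layer.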
In these images, one can notice the blurring effect of the media: the two lines almost merge into a single line. The images are downsampled by a factor of 2 and blurred using a Gaussian blur for noise reduction. The three images are used to calculate the phase image using the iterative multiplane phase retrieval algorithm. The phase and the captured amplitude are used to calculate the image at the exit plane (right after the second tissue). This image is shown in Figs. 3(e)–3(f) (amplitude and phase, respectively), with an additional plot of the average profile in red. One can notice that the gap between the bones and the diffraction pattern is much more noticeable than in the original image. In addition, we calculated the image after backpropagating an additional 100 mm. This distance was selected empirically as the location at which the line pattern is most apparent. The backpropagated image amplitude and phase are shown in Figs. 3(g)–3(h), respectively.

4.2. Simulation of the Technique Abilities

We use an additional MATLAB simulation to demonstrate the ability to infer spatial frequency structures that are low in comparison with the high-frequency scattering content of the scattering media. In this simulation, each “tissue layer” is modeled as a uniform random phase distribution convolved with a Gaussian filter, in order to control its phase variance. The (“bone”) structure is not convolved with a Gaussian window, in order to keep its maximum phase variance. The random illumination pattern is simulated as a random phase pattern convolved with a Gaussian filter 25 pixels wide. The pattern is zero padded in order to avoid any wrap-around effects. The wavelength was set to 532 nm. This wavefront propagates until it reaches the first layer, S1, simulated as a random phase distribution convolved with a Gaussian filter 25 pixels wide.
The size of the Gaussian kernel is chosen according to measurements of the reduced scattering coefficients of bone and breast tissue.5 The incoming wavefront is multiplied by the phase distribution of layer S1 and propagates further until it reaches the scattering (“bone”) structure. This structure is defined by its scattering and nonscattering sections, modeled as a grating [Fig. 4(a)]. The scattering portion is composed of a random phase drawn from a uniform distribution. In order to simulate the magnitude-attenuation difference between the chicken bones and chicken breast tissue, we simulate the magnitude as a Rayleigh distribution. After further propagation, the wavefront emerging from the bone structure reaches layer S2, which is modeled the same as S1 and multiplies the field by its own unique random phase distribution. In order to simulate the finite aperture of the proposed apparatus, the output wavefront from layer S2 was low-pass filtered so that only features above a limiting spatial resolution passed through. The wavefront propagates in free space, and its magnitude is captured at distances of 2.5, 34, and 39 mm. We performed 100 iterations of the iterative multiplane phase retrieval algorithm. Using these three measured magnitudes, we recover the phase and magnitude distributions in the object plane. The simulated output phase and magnitude distributions in the object plane are shown in Figs. 4(a)–4(c), and after S2 in Fig. 4(d). In the spatial phase distribution shown in Figs. 4(a)–4(b), the coarse phase structures correspond to the portion of the wavefront propagating from the bone structure unscattered by S2, while the high-variance phase distribution corresponds to the scattered portion of the wavefront. When examining Figs. 4(e)–4(h), the modulating (grating) structure of the bone plane is evident in the spatial phase distribution. This reconstruction result is related to the choice of the defocusing distance and will be discussed later.
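The tissue-layer model of this second simulation, a uniform random phase convolved with a Gaussian filter to control the phase variance, can be sketched as follows; the kernel size and width are illustrative stand-ins for the per-layer values described in the text:

```python
import numpy as np

def gaussian_kernel(width, sigma):
    """Normalized 2-D Gaussian kernel of side `width` pixels."""
    x = np.arange(width) - width // 2
    X, Y = np.meshgrid(x, x)
    k = np.exp(-(X**2 + Y**2) / (2 * sigma**2))
    return k / k.sum()

def smooth_phase_screen(shape, kernel, rng):
    """Uniform random phase in [-pi, pi] convolved (via FFT, i.e.
    circularly) with a Gaussian kernel: the kernel width sets the
    screen's phase variance, i.e. how strongly the layer scatters."""
    phase = rng.uniform(-np.pi, np.pi, shape)
    kpad = np.zeros(shape)
    kpad[:kernel.shape[0], :kernel.shape[1]] = kernel
    smoothed = np.real(np.fft.ifft2(np.fft.fft2(phase) * np.fft.fft2(kpad)))
    return np.exp(1j * smoothed)
```

A wider kernel yields a smoother screen (lower phase variance, weaker scattering); the unfiltered bone section keeps the full uniform-phase variance.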
5. Experiment

In this section, we evaluate the proposed technique experimentally, as an example of bone imaging through thin soft tissue, where the bone location is the quantity of interest. For the experimental demonstration, the sample was an ex vivo thin chicken fibula bone (1 to 2 mm) embedded between two layers of chicken breast cut into thin layers of a few (5 to 10) millimeters. Chicken samples (chicken breasts and bones) were obtained, as parts, from a local supermarket, which is under the supervision of the Israeli veterinary services. No live-chicken procedure was involved, in compliance with BIU institutional regulations. The samples were stored at 4°C in the local supermarket. In the laboratory, the samples were kept frozen and were defrosted at room temperature prior to the experiments. The reduced scattering coefficients of the chicken bones and breast at 532 nm are significantly different (by about a factor of 10)5,6 and make them good examples of dense and soft tissues. For comparison, the reduced scattering coefficients of human soft tissues and bones differ by a factor of about 2 to 3, for both 532 nm and near-infrared (NIR) light.5 A diode-pumped green laser at 532 nm with a 1-mm beam (Photop DPGL-2100F) was used as the illumination source. The penetration depth of 532-nm green light is lower than that of NIR light;38 however, since we chose chicken breast as the soft tissue, the green-light penetration depth is sufficient for demonstrating the technique. Several axial-scan images were captured (typically three or four) using a CMOS camera as the input for the algorithm, which ran on a standard PC. Thus, this computational technique can be regarded as cost-effective, with rather simple implementation both optically and algorithmically.
5.1. Sample Preparation

Two chicken fibula bones were placed between two layers of sliced chicken breast.5,6 The thinner parts of the fibula bones (1 to 2 mm) were placed in the center of the sample, a few (0 to 10) mm apart, to test the ability of the suggested method to image bone shape and bone spacing of similar or larger dimensions. The chicken breast was cut 5 to 10 mm thick, which is larger than the transport mean free path. The tissue was not treated to be uniform in thickness. The first layer of soft tissue was mounted on a black plastic board with a square hole of about 5 cm [Figs. 5(a)–5(b)]. The bones were placed on the soft tissue [Fig. 5(c)] and covered with the second layer of soft tissue [Fig. 5(d)]. The first soft-tissue layer was glued to the plastic frame around the hole, and the natural stickiness of the chicken breast was enough to hold the second layer and the bones.

5.2. Experimental Setup

A linear setup [Fig. 5(e)] was assembled as follows: a green (532 nm) laser source is placed behind a beam expander (THORLABS BE10M-A) and an optical diffuser. The purpose of this assembly is to create an almost homogeneous illumination a few centimeters wide while keeping the setup as compact and robust as possible. Specifically, in the described experiment, the laser beam is expanded by the beam expander to a 1-cm diameter and further expanded by the diffuser before illuminating the sample, which defines the perceived field of view. Another goal of the beam expansion is to decrease the laser power per unit area, so that the light intensity and the energy absorbed by the tissue are harmless. The intensities measured after the beam expander and just after the diffuser meet clinical safety standards. The second goal of the diffuser, as mentioned, is to add time variability to the speckle pattern, as the diffuser was lightly mounted and was affected by air vibrations.
Variation in time is required in order to overcome speckle noise by averaging over time. A digital camera (THORLABS DCC1545) with a zoom imaging lens (TOMRAN Japan 23FM25L, 2/3″, 25-mm lens) is placed at a distance from the biological object. The camera and lens are placed on a mechanical rail with a ruler, which enables capturing images at different defocusing distances. At each axial position of the camera, 30 frames of 30 ms each were acquired.

5.3. Results

The images produced by the CMOS detector are 8-bit grayscale, with the magnification set using a standard C-mount zoom lens. First, a reference image [Fig. 6(a)], focused on the fibula bones while S2 is removed, is acquired. Following that, we acquired four image sequences (30 frames each); each sequence was captured at a different axial position, i.e., with a different amount of defocus. The distances were chosen to be 0, 20, 40, and 60 mm from the outer plane of S2. The 0-mm images, focused on S2, are shown in Fig. 6(b) with additional brightness and contrast correction for better visual presentation of the details in this report. The last three distances were chosen in order to reject the high frequencies associated with the high spatial-phase-distribution variance of the bone and breast tissue, while still maintaining the low-to-medium spatial phase features of the breast and bone tissue. Each 30-frame sequence was averaged. Different selections of images are used with the described algorithm: the 20-, 40-, and 60-mm images for low-frequency details [output in Fig. 6(c)], and the 0-, 20-, and 40-mm images for high-frequency details [output in Fig. 6(d)]. In the postprocessing stage, the acquired images are denoised, aligned using the maximum-correlation method, cropped to a square, and downsampled by a factor of 2.
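The per-sequence preprocessing chain (temporal averaging, centered cropping, 2× downsampling) can be sketched as follows; the crop size below is illustrative, since the exact value is not reproduced here, and the denoising and alignment steps of Secs. 2.2–2.3 are applied separately:

```python
import numpy as np

def preprocess(frames, crop=512, down=2):
    """Average a stack of frames to suppress speckle and detection
    noise, crop a centered square region of interest, then downsample
    by `down` via block averaging (mirrors the reported 2x factor;
    helper name and crop size are our own)."""
    avg = frames.mean(axis=0)                       # temporal average
    cy, cx = avg.shape[0] // 2, avg.shape[1] // 2   # centered crop
    roi = avg[cy - crop // 2: cy + crop // 2,
              cx - crop // 2: cx + crop // 2]
    # block-average downsampling
    return roi.reshape(crop // down, down, crop // down, down).mean(axis=(1, 3))
```

Averaging N frames reduces uncorrelated per-pixel noise by roughly √N, which is why 30 frames per plane suffice to bring the residual speckle below the dark-noise level.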
After 5000 iterations of the iterative multiplane phase retrieval algorithm, the resulting spatial phase distribution divided by the amplitude is shown in Figs. 6(c) and 6(d). An x-ray microtomography image was also produced [Fig. 6(e)] as an additional reference (SkyScan 1176 high-resolution in vivo micro-CT). Figure 6(f) is an output image using only two input images, at 20 and 60 mm from S2 (with the same processing as described above). The contrast of all images was enhanced for better visual presentation. Using the reconstructed phase distribution (divided by the amplitude), the bone location and edges can be extracted. In this experiment, soft-tissue fibers (such as blood vessels or collagen) are evident as well. Although these fibers are unwanted for this specific result, they demonstrate the technique’s ability to image different types of features; due to their structure, they can be removed digitally. The results are in good agreement with the reference image obtained with the S2 breast tissue removed [Fig. 6(a)] and with the x-ray image [Fig. 6(e)]. As predicted, in the acquired magnitude images [Fig. 6(b)], the bone features are less evident and appear blurred with low contrast when compared with the algorithm outputs [Figs. 6(c), 6(d), and 6(f)] and the reference images [Figs. 6(a) and 6(e)]. Figure 6 demonstrates the advantages of the algorithm output image compared with the standard magnitude image. When examining the results shown in Figs. 6(a)–6(c), the fibula bones can be detected in all the images, although in the focused magnitude image [Fig. 6(b)] the bones seem blurred and can hardly, or not at all, be separated by the naked eye. An additional feature of the tested subject is evident in the outputs [Figs. 6(c), 6(d), and 6(f)] as well as in the reference image [Fig. 6(b)]: a fine, almost horizontal line just below the bones on the left-hand side. This feature is related to the tissue structure, possibly a small blood vessel.
Additional, quite similar features are evident in the upper part of the image, just above the bones on the left-hand side. These features are also visible in the output image of Fig. 6(d), which corresponds to high-frequency details [and less so in Figs. 6(c) and 6(f)]. These features are a good example of the technique’s abilities, and also demonstrate how the output depends on the choice of system parameters, such as the wavelength. In order to validate our hypothesis that most of the relevant information for extracting the bone-tissue location is associated with the low-to-medium spatial phase frequency range, we demonstrate phase reconstruction using the iterative multiplane phase retrieval applied to three magnitude images acquired at distances of 20, 40, and 60 mm (which is considered a large defocus distance). While the bone structure is indeed noticeable, it appears blurrier, along with other details such as the chicken-breast tissue fibers, when compared with Fig. 6(d). This result is in agreement with the theory,39 which argues that the high spatial-frequency components are washed out when the reconstruction is carried out using images acquired at a large amount of defocus. At a small amount of defocus, the high-frequency components change significantly between the acquired images, whereas the low-frequency features do not. Since, in general, multiplane phase retrieval promotes a solution whose lower spatial frequencies are more sensitive to noise than the higher spatial frequencies,39 almost no bone edges are evident. In addition, the result shown in Fig. 6(f) demonstrates the spatial phase distribution reconstruction from only two slightly defocused images (focused at 20 and 60 mm from S2). To obtain an output of almost similar quality to the previous reconstructions, 10,000 iterations of the iterative multiplane phase retrieval algorithm were required.
It is important to show the technique’s abilities with only two images, since this can easily be achieved in a single shot (e.g., with a beam splitter and two sets of sensors and lenses) for imaging a moving object. In Fig. 7(a), we demonstrate the performance of the proposed method by calculating profiles of the acquired and reconstructed images.
The profiles are a calculated average over a 1-mm-wide stripe, taken at approximately the center of the corresponding images (where the bones are almost horizontal). These profiles demonstrate a method to detect the bone edges. The output (red line) shows a typical diffraction pattern of light passing through two (almost) rectangular windows and, with additional processing such as thresholding or deconvolution, can be reduced to the window edges. This image is obtained without processing and demonstrates both the soft-tissue blurring effect and the technique. In this specific test, we chose the system and imaging parameters so that the tradeoff between small details and “noise” reduction is optimal for 1- to 2-mm objects, like the bones in the sample. We were able to ignore high spatial frequencies corresponding to details smaller than 0.1 mm. In order to keep smaller elements, one should reduce the “noise” level. Some of the “noise” is optical evidence of the tissue structure. As one cannot control the tested tissue, other parameters can be changed, such as the wavelength, the distances, and the exposure time. More iterations did not improve the results in this case. If all of the above is not enough, it is possible to use additional techniques, such as multimodal imaging with several wavelengths or several imaging distances. The benefits of these techniques are not discussed further in this report. The technique is limited by several physical and optical parameters. First, the wavelength should be such that the penetration depth is large enough for light to pass through the sample with a reasonable illumination intensity.5,40 In addition, the wavelength should be such that the reduced scattering coefficients of the two tissues are different; this is a key feature of the suggested technique.
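The stripe profiles described above can be computed with a small helper; `stripe_px` would be 1 mm divided by the effective pixel pitch, and all names and values here are illustrative:

```python
import numpy as np

def stripe_profile(image, row, stripe_px):
    """Average an image over a horizontal stripe `stripe_px` rows tall
    centered on `row`, producing the 1-D edge profile used to locate
    bone edges (works where the bones run almost horizontally)."""
    half = stripe_px // 2
    return image[row - half: row + half + 1, :].mean(axis=0)
```

A threshold or deconvolution applied to this 1-D profile then reduces the diffraction pattern to the window-edge positions, as described above.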
The details and dimensions of the tissue features one wishes to image should be significantly different from the details of the tissue structure, as illustrated in the reported example of tissue fibers and fine blood vessels compared with bones. The dimensions of the desired details, such as the bones, should also be on the same scale as, or larger than, the thickness of the soft tissue: phase information and the coherence memory length persist only up to about the dimension of the scattering medium (the soft tissue).41,42 Smaller details may be only weakly evident.

6. Discussion

We propose a computational imaging technique for imaging macroscopic features of a dense tissue object hidden within layers of soft tissue, using the spatial phase distribution reconstructed from several defocused intensity images. In this report, we have demonstrated a significant improvement in the visibility of biologically scattering features. The benefits of the suggested technique are clear: it requires only minor prior knowledge about the tested object, rather than mapping the soft tissue’s optical properties as other techniques do.43–45 The main requirement is knowing which types of tissues are tested and making sure they have different reduced scattering coefficients at the chosen wavelength. Some coefficients are known and can be taken from the literature.5 If such prior knowledge is not available, one can experiment with different wavelengths. The additional knowledge required is a general notion of the object’s size and location. Since this test is a deblurring method, the object will be visible under the chosen wavelength, which can serve as a validation test for this requirement. The suggested technique is noninvasive and requires no special treatment of the sample (such as the addition of contrast agents). The method is noninterferometric and simple to implement in both hardware and algorithmic aspects.
The method produces a wide-field diffraction pattern that can be used to differentiate between two types of tissue and is suitable for the examination of small, macroscopic structures inside biological tissues, such as bones, blood vessels, and fibers; with some adaptation, it can also be used to view dynamic events. Thus, the proposed technique complements, rather than competes with, microscopic imaging techniques. The method is not limited to a particular laser wavelength (532 nm was used in these experiments), and an NIR illumination source can be used to increase the penetration depth1,2,4,6,40,46 and reduce scattering, so that bone locations can be imaged at greater depth. It will also be interesting for future research to compare the results obtained from the iterative multiplane phase retrieval algorithm with those obtained using the transport of intensity equation method.47

References

1. W. F. Cheong, S. A. Prahl and A. J. Welch,
“A review of the optical properties of biological tissues,” IEEE J. Quantum Electron. 26(12), 2166–2185 (1990). http://dx.doi.org/10.1109/3.64354
2. V. V. Barun et al., “Absorption spectra and light penetration depth of normal and pathologically altered human skin,” J. Appl. Spectrosc. 74(3), 430–439 (2007). http://dx.doi.org/10.1007/s10812-007-0071-2
3. B. C. Wilson and S. L. Jacques, “Optical reflectance and transmittance of tissues: principles and applications,” IEEE J. Quantum Electron. 26(12), 2186–2199 (1990). http://dx.doi.org/10.1109/3.64355
4. S. L. Jacques, “Skin optics summary,” http://omlc.ogi.edu/news/jan98/skinoptics.html (accessed November 2012).
5. S. L. Jacques, “Optical properties of biological tissues: a review,” Phys. Med. Biol. 58(11), R37 (2013). http://dx.doi.org/10.1088/0031-9155/58/11/R37
6. L. V. Wang, G. Marquez and S. L. Thomsen, “Anisotropic absorption and reduced scattering spectra of chicken breast tissue measured using oblique incidence reflectometry,” Proc. SPIE 3250, 33 (1998). http://dx.doi.org/10.1117/12.305386
7. I. Freund, “Looking through walls and around corners,” Phys. Stat. Mech. Appl. 168(1), 49–65 (1990). http://dx.doi.org/10.1016/0378-4371(90)90357-X
8. J. Bertolotti et al., “Non-invasive imaging through opaque scattering layers,” Nature 491(7423), 232–234 (2012). http://dx.doi.org/10.1038/nature11578
9. H. He, Y. Guan and J. Zhou, “Image restoration through thin turbid layers by correlation with a known object,” Opt. Express 21(10), 12539 (2013). http://dx.doi.org/10.1364/OE.21.012539
10. O. Katz et al., “Non-invasive single-shot imaging through scattering layers and around corners via speckle correlations,” Nat. Photonics 8(10), 784–790 (2014). http://dx.doi.org/10.1038/nphoton.2014.189
11. R. Tyson, Principles of Adaptive Optics, 3rd ed., CRC Press (2010).
12. I. M. Vellekoop and A. P. Mosk, “Focusing coherent light through opaque strongly scattering media,” Opt. Lett. 32(16), 2309 (2007). http://dx.doi.org/10.1364/OL.32.002309
13. I. M. Vellekoop, A. Lagendijk and A. P. Mosk, “Exploiting disorder for perfect focusing,” Nat. Photonics 4(5), 320–322 (2010). http://dx.doi.org/10.1038/nphoton.2010.3
14. O. Katz, E. Small and Y. Silberberg, “Looking around corners and through thin turbid layers in real time with scattered incoherent light,” Nat. Photonics 6(8), 549–553 (2012). http://dx.doi.org/10.1038/nphoton.2012.150
15. J. Rosen and D. Abookasis, “Noninvasive optical imaging by speckle ensemble,” Opt. Lett. 29(3), 253 (2004). http://dx.doi.org/10.1364/OL.29.000253
16. V. Bianco et al., “Imaging through scattering microfluidic channels by digital holography for information recovery in lab on chip,” Opt. Express 21(20), 23985–23996 (2013). http://dx.doi.org/10.1364/OE.21.023985
17. V. Bianco et al., “Clear coherent imaging in turbid microfluidics by multiple holographic acquisitions,” Opt. Lett. 37(20), 4212–4214 (2012). http://dx.doi.org/10.1364/OL.37.004212
18. V. Bianco et al., “Self-propelling bacteria mimic coherent light decorrelation,” Opt. Express 23(7), 9388–9396 (2015). http://dx.doi.org/10.1364/OE.23.009388
19. V. Bianco et al., “Imaging adherent cells in the microfluidic channel hidden by flowing RBCs as occluding objects by a holographic method,” Lab Chip 14(14), 2499–2504 (2014). http://dx.doi.org/10.1039/c4lc00290c
20. Z. Wang, H. Ding and G. Popescu, “Scattering-phase theorem,” Opt. Lett. 36(7), 1215 (2011). http://dx.doi.org/10.1364/OL.36.001215
21. I. Yariv et al., “Detecting nanoparticles in tissue using an optical iterative technique,” Biomed. Opt. Express 5(11), 3871 (2014). http://dx.doi.org/10.1364/BOE.5.003871
22. R. Gerchberg and O. Saxton, “A practical algorithm for the determination of the phase from image and diffraction plane pictures,” Optik 35, 237–246 (1972).
23. D. L. Misell, “A method for the solution of the phase problem in electron microscopy,” J. Phys. Appl. Phys. 6(1), L6–L9 (1973). http://dx.doi.org/10.1088/0022-3727/6/1/102
24. D. L. Misell, “An examination of an iterative method for the solution of the phase problem in optics and electron optics: I. Test calculations,” J. Phys. Appl. Phys. 6(18), 2200–2216 (1973). http://dx.doi.org/10.1088/0022-3727/6/18/305
25. D. L. Misell, “An examination of an iterative method for the solution of the phase problem in optics and electron optics: II. Sources of error,” J. Phys. Appl. Phys. 6(18), 2217–2225 (1973). http://dx.doi.org/10.1088/0022-3727/6/18/306
26. E. Gur and Z. Zalevsky, “Image deblurring through static or time-varying random perturbation medium,” J. Electron. Imaging 18(3), 033016 (2009). http://dx.doi.org/10.1117/1.3224953
27. M. Aviv, E. Gur and Z. Zalevsky, “Experimental results of revised Misell algorithm for imaging through weakly scattering biological tissue,” Appl. Opt. 52(11), 2300–2305 (2013). http://dx.doi.org/10.1364/AO.52.002300
28. J. Garcia-Sucerquia, J. A. H. Ramírez and D. V. Prieto, “Reduction of speckle noise in digital holography by using digital image processing,” Optik 116(1), 44–48 (2005). http://dx.doi.org/10.1016/j.ijleo.2004.12.004
29. Y. Pu and H. Meng, “Intrinsic speckle noise in off-axis particle holography,” J. Opt. Soc. Am. A 21(7), 1221–1230 (2004). http://dx.doi.org/10.1364/JOSAA.21.001221
30. J. Maycock et al., “Reduction of speckle in digital holography by discrete Fourier filtering,” J. Opt. Soc. Am. A 24(6), 1617–1622 (2007). http://dx.doi.org/10.1364/JOSAA.24.001617
31. X. Cai, “Reduction of speckle noise in the reconstructed image of digital holography,” Optik 121(4), 394–399 (2010). http://dx.doi.org/10.1016/j.ijleo.2008.07.026
32. L. Rong et al., “Speckle noise reduction in digital holography by use of multiple polarization holograms,” Chin. Opt. Lett. 8(7), 653–655 (2010). http://dx.doi.org/10.3788/COL
33. T. Baumbach et al., “Improvement of accuracy in digital holography by use of multiple holograms,” Appl. Opt. 45(24), 6077–6085 (2006). http://dx.doi.org/10.1364/AO.45.006077
34. F. Monroy et al., “Quantitative assessment of lateral resolution improvement in digital holography,” Opt. Commun. 281(13), 3454–3460 (2008). http://dx.doi.org/10.1016/j.optcom.2008.03.011
35. Theoretical Foundations and Numerical Methods for Sparse Recovery, Walter de Gruyter (2010).
36. D. C. Ghiglia and M. D. Pritt, Two-Dimensional Phase Unwrapping: Theory, Algorithms, and Software, John Wiley and Sons, Inc., New York (1998).
37. V. I. Tatarskii, Wave Propagation in a Turbulent Medium, http://trove.nla.gov.au/work/19264447?q&versionId=45921527 (accessed January 2016).
38. V. V. Barun et al., “Absorption spectra and light penetration depth of normal and pathologically altered human skin,” J. Appl. Spectrosc. 74(3), 430–439 (2007). http://dx.doi.org/10.1007/s10812-007-0071-2
39. D. Paganin et al., “Quantitative phase-amplitude microscopy. III. The effects of noise,” J. Microsc. 214(1), 51–61 (2004). http://dx.doi.org/10.1111/j.0022-2720.2004.01295.x
40. A. N. Bashkatov et al., “Optical properties of human skin, subcutaneous and mucous tissues in the wavelength range from 400 to 2000 nm,” J. Phys. Appl. Phys. 38(15), 2543–2555 (2005). http://dx.doi.org/10.1088/0022-3727/38/15/004
41. I. Freund, M. Rosenbluh and S. Feng, “Memory effects in propagation of optical waves through disordered media,” Phys. Rev. Lett. 61(20), 2328–2331 (1988). http://dx.doi.org/10.1103/PhysRevLett.61.2328
42. S. Feng et al., “Correlations and fluctuations of coherent wave transmission through disordered media,” Phys. Rev. Lett. 61(7), 834–837 (1988). http://dx.doi.org/10.1103/PhysRevLett.61.834
43. S. M. Popoff et al., “Measuring the transmission matrix in optics: an approach to the study and control of light propagation in disordered media,” Phys. Rev. Lett. 104(10), 100601 (2010). http://dx.doi.org/10.1103/PhysRevLett.104.100601
44. Y. Choi et al., “Optical imaging with the use of a scattering lens,” IEEE J. Sel. Top. Quantum Electron. 20(2), 61–73 (2014). http://dx.doi.org/10.1109/JSTQE.2013.2275942
45. M. Kim et al., “Transmission matrix of a scattering medium and its applications in biophotonics,” Opt. Express 23(10), 12648–12668 (2015). http://dx.doi.org/10.1364/OE.23.012648
46. S. L. Jacques, “Skin optics summary,” Oregon Medical Laser Center News (1998).
47. E. D. Barone-Nugent, A. Barty and K. A. Nugent, “Quantitative phase-amplitude microscopy I: optical microscopy,” J. Microsc. 206(3), 194–203 (2002). http://dx.doi.org/10.1046/j.1365-2818.2002.01027.x
Biography

Maya Aviv Shalev is a PhD candidate at Bar-Ilan University, under the supervision of Prof. Zeev Zalevsky. She received her BS and MS degrees in physics from Tel-Aviv University in 2000 and 2009, respectively. Her current research interests include biomedical optics and imaging, phase retrieval, and image processing.

Yair Rivenson is the coauthor of 45 refereed journal and conference papers (eight of them invited), 2 book chapters, and 1 publication in a special edition of the Optics and Photonics News 2012 annual review. He has also won several Israeli national awards and recently obtained the ERC Marie Skłodowska-Curie Global Fellowship through the Horizon 2020 program.

Amihai Meiri received his BSc degree in 2007 and his PhD in 2013 from Bar-Ilan University, Israel. His research focused on silicon photonics, metal-nanoparticle-based devices, and microscopy. As a postdoc at Duke University, he focused on phase imaging using excitation of metal nanoparticles in living cells, and at the University of Utah on a super-resolution technique that utilizes interference to obtain resolution higher than the diffraction limit. Currently, he is with Bar-Ilan University.

Zeev Zalevsky received his BSc degree and direct PhD in electrical engineering from Tel-Aviv University in 1993 and 1996, respectively. Currently, he is a full professor in the Faculty of Engineering at Bar-Ilan University, Israel. His major fields of research are optical super-resolution, biomedical optics, nanophotonics, and electro-optical devices. He has published more than 420 peer-reviewed journal papers, 6 books, and dozens of book chapters. He is a fellow of OSA, SPIE, EOS, IET, and NANOSMAT.