Broadband quantitative phase microscopy with extended field of view using off-axis interferometric multiplexing
Abstract
We propose a new portable imaging configuration that can double the field of view (FOV) of existing off-axis interferometric imaging setups, including broadband off-axis interferometers. This configuration is attached at the output port of the off-axis interferometer and optically creates a multiplexed interferogram on the digital camera, which is composed of two off-axis interferograms with straight fringes at orthogonal directions. Each of these interferograms contains a different FOV of the imaged sample. Due to the separation of these two FOVs in the spatial-frequency domain, they can be fully reconstructed separately, while obtaining two complex wavefronts from the sample at once. Since the optically multiplexed off-axis interferogram is recorded by the camera in a single exposure, fast dynamics can be recorded with a doubled imaging area. We used this technique for quantitative phase microscopy of biological samples with extended FOV. We demonstrate attaching the proposed module to a diffractive phase microscopy interferometer, illuminated by a broadband light source. The biological samples used for the experimental demonstrations include microscopic diatom shells, cancer cells, and flowing blood cells.

1. Introduction

Imaging with a large field of view (FOV) is a general need of all imaging modalities and of interferometric imaging in particular,1–6 especially when imaging dynamic samples. The straightforward solution is, of course, decreasing the magnification at the expense of image resolution, a solution that is not suitable when fine details need to be observed. Another solution is using several exposures per sample instance, e.g., by temporal scanning and obtaining a larger FOV via stitching of several FOVs into a single one (e.g., Ref. 1). It is also possible to use super-resolution methods,2–6 such as a scanning synthetic numerical aperture for building a larger spatial-frequency domain from multiple low-resolution images. However, these scanning techniques are not suitable for dynamic samples that change faster than the scanning rate, or that might move out of the FOV during scanning. Of course, cameras with more pixels can be used, but depending on the size of the image and its detailed dynamics, this solution might be too expensive or not feasible to implement.

In particular, interferometric quantitative phase microscopy can benefit from dynamically capturing larger FOVs, especially for fast biological dynamics7–9 and inline optical metrology.9–11 Interferometric imaging allows acquisition of the complex wavefront of the light beam interacting with a sample, containing both the sample amplitude and quantitative phase maps.

By inducing an off-axis angle between the reference and the sample beams creating the interference pattern on the camera, it is possible to reconstruct the entire complex wavefront from a single exposure, which is useful for quantitative dynamic imaging. The single exposure reconstruction is achievable since the off-axis angle creates a spatial carrier frequency that separates the sample wavefront from the sample intensity in the spatial-frequency domain (after a digital Fourier transform of the off-axis interferogram). This happens since in the spatial-frequency domain, the sample and reference intensities create autocorrelation terms located around the origin, whereas the sample wavefront is contained in the cross correlation between the sample and the reference waves and in its complex conjugate. Each of these cross-correlation terms is shifted to a different side of the spatial-frequency domain, while maintaining a full separation between each of them and the autocorrelation terms by controlling the off-axis angle between the beams. Therefore, to separate the sample wavefront in the reconstruction process, only one of the cross-correlation terms is chosen and is Fourier transformed back into the image domain.
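As an illustration of this single-exposure reconstruction principle, the following Python/NumPy sketch recovers the complex wavefront from one off-axis interferogram by isolating a single cross-correlation term in the spatial-frequency domain; the crop center and radius are hypothetical values that would be tuned to the carrier frequency of an actual setup.

```python
import numpy as np

def reconstruct_off_axis(interferogram, carrier_center, crop_radius):
    """Recover the complex sample wavefront from a single off-axis interferogram.

    interferogram : 2-D real array (the recorded intensity image)
    carrier_center: (row, col) location of one cross-correlation term in the FFT
    crop_radius   : half-size of the square window cropped around that term
    """
    # 2-D Fourier transform of the interferogram (centered spectrum)
    spectrum = np.fft.fftshift(np.fft.fft2(interferogram))

    # Crop one cross-correlation term (the sample-reference interference term)
    r, c = carrier_center
    crop = spectrum[r - crop_radius:r + crop_radius,
                    c - crop_radius:c + crop_radius]

    # Inverse Fourier transform of the cropped term gives the complex wavefront;
    # its argument is the wrapped phase and its modulus the amplitude
    return np.fft.ifft2(np.fft.ifftshift(crop))

# Example usage with a hypothetical 1024x1024 interferogram `img`:
# wavefront = reconstruct_off_axis(img, carrier_center=(512, 768), crop_radius=120)
# wrapped_phase = np.angle(wavefront)
```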

We generally assume that the entire digital imaging system is well designed, so that the sampling rate on the camera corresponds to the resolution of the optical system; thus, there is no oversampling that wastes the FOV. Reducing the magnification in this case will lead to undersampling and aliasing, and thus to degradation in image quality. Therefore, in this case, it is not possible to demagnify the image to gain FOV extension. Specifically, in off-axis imaging interferometry, we need to further magnify the sample image to allow the camera to record both the image and the off-axis interference fringes with high spatial frequency.9

Typically, off-axis interference fringes are straight across one axis, which leaves empty space in the spatial-frequency domain across the other axis, under the reasonable assumption of symmetric spatial-frequency content on both axes. This empty space can be used for optical encoding of at least one additional FOV from the sample, thus enabling the acquisition of more than one FOV at once. Based on this principle, we recently suggested interferometry with doubled imaging area (IDIA).9 This technique uses an external interferometric module that connects to a conventional light microscope and transforms it into an interferometric microscope with an extended FOV. This module assumes that the microscope is illuminated by a coherent or a partially coherent light source (with a spectral bandwidth of up to 7 nm), while creating the reference beam at the exit of the microscope by optical spatial filtering using Fourier optics.12,13 Then, the sample beam is optically folded by two orthogonally oriented retroreflectors, creating a multiplexed interferogram on the camera, which contains the two sample FOVs.

However, this previous IDIA implementation is limited to coherent or partially coherent illumination, and cannot operate under broadband illumination due to the off-axis angular tilt of one of the beams on the camera, which introduces a beam-path difference that is larger than the coherence length of the broadband light source.

In the current paper, we propose a more generic IDIA module that can be connected to the existing off-axis interferometric imaging systems and double their interferometric FOV, while seamlessly enabling off-axis broadband interferometry, provided that the existing off-axis interferometric system is designed to meet this goal. In contrast to the previous IDIA implementation,9 the current one does not create the off-axis interference inside the module, but instead it assumes that the existing interferometric imaging system already creates this off-axis interference before reaching the module. Thus, this add-on module upgrades the capabilities of existing off-axis interferometric imaging systems, since it enables recording more interferometric information using the same number of camera pixels, without resolution or magnification loss, while sharing the dynamic range of the camera. We experimentally present attaching this new module to a broadband off-axis interferometric phase imaging setup, and demonstrate quantitative phase imaging of fast biological dynamics in extended FOV.

2. External Module Design

According to the IDIA principle, two areas from the sample are simultaneously projected onto a camera sensor, each of which encodes a different FOV of the sample with straight off-axis interference fringes oriented orthogonally with respect to the fringe direction in the other FOV. In contrast to the previous IDIA module, in which the off-axis superposition between the sample and the reference beams is performed inside the module, the current module assumes that the off-axis beam superposition is already generated by the interferometric imaging system located before the module.

Fig. 1

The suggested multiplexing module, connected before the output of an existing off-axis imaging interferometer. E_S + E_R, the off-axis superposition of the sample and reference fields, as created by the off-axis imaging interferometer. BS, beam splitter; RR1, right-angle mirror retroreflector; RR2, 45-deg rotated right-angle mirror retroreflector. Note that to clarify the ray-tracing scheme, we do not show the small off-axis angle between E_S and E_R, although it exists in both channels.


The proposed module is shown in Fig. 1 and should be located at the output of the existing imaging interferometer, so that the image plane of the system will be positioned at the output plane of the module. The superposition between the sample and reference waves enters the external module and is split by a beam splitter BS (made of a thin plate to decrease dispersion) into two pairs of sample and reference waves. Then, each of the sample/reference beam pairs is reflected by a retroreflector and projected back to the camera. The two retroreflectors, RR1 and RR2, are 45 deg rotated with respect to each other around the optical axis (see the three-dimensional inset in Fig. 1). The 45-deg rotation of RR2 rotates the sample/reference beam pair by 90 deg around the optical axis. For this reason, the off-axis interference fringes derived from RR2 are orthogonal to those derived from RR1, leading to a single optically multiplexed interferogram on the camera. The optical alignment of the two retroreflectors can be set so that each of them will contain the same or a different FOV from the sample, where each FOV is also 90 deg rotated with respect to the other one, as can be seen in Fig. 1.
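As a side note, the 90-deg relative rotation can be traced to a simple geometric fact, under the planar-reflection model we assume here: in the transverse image plane, each right-angle retroreflector acts as a flip about the line defined by its roof edge, and the composition of two flips about lines 45 deg apart is a 90-deg rotation:

$$
F(\theta)=\begin{pmatrix}\cos 2\theta & \sin 2\theta\\[2pt] \sin 2\theta & -\cos 2\theta\end{pmatrix},\qquad
F(45^\circ)\,F(0^\circ)=
\begin{pmatrix}0 & 1\\[2pt] 1 & 0\end{pmatrix}
\begin{pmatrix}1 & 0\\[2pt] 0 & -1\end{pmatrix}=
\begin{pmatrix}0 & -1\\[2pt] 1 & 0\end{pmatrix},
$$

which is a rotation by 90 deg, consistent with the orthogonal fringe and FOV orientations described above.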

In contrast to the external interferometric module presented previously,9 the current one is different in its internal design, since it does not use lenses and the retroreflectors are 45 deg rotated with respect to each other, rather than 90 deg rotated. In addition, the proposed module differs from the previous one in its manner of operation, as it rotates both the sample and the reference waves as a pair. As such, this new module does not affect the coherence properties of the existing off-axis interferometer. Most significantly, as long as the existing interferometer allows broadband off-axis interference on the entire camera sensor, the proposed module can double the interferometric FOV even for the broadband illumination case.

The module presented in Fig. 1 is located between the output of the existing interferometer and its camera, where the image plane is located on the camera sensor. Thus, we assume an image-plane interferogram. However, if there is no space before the output of the interferometric imaging system for positioning the module, an additional 4f imaging lens configuration can be integrated after the interferometric system output to project the image plane of the interferometric system onto the camera, which leaves space for positioning the proposed multiplexing module.

The multiplexed interferogram, recorded by the digital camera in a single exposure, can be mathematically expressed as follows:

Eq. (1)

$$
\begin{aligned}
\big| [E_S+E_R](\mathrm{RR1}) + [E_S+E_R](\mathrm{RR2}) \big|^{2}
={}& |E_S(\mathrm{RR1})|^{2} + |E_R(\mathrm{RR1})|^{2} + |E_S(\mathrm{RR2})|^{2} + |E_R(\mathrm{RR2})|^{2} \\
&+ E_S(\mathrm{RR1})E_R^{*}(\mathrm{RR1}) + E_R(\mathrm{RR1})E_S^{*}(\mathrm{RR1}) \\
&+ E_S(\mathrm{RR2})E_R^{*}(\mathrm{RR2}) + E_R(\mathrm{RR2})E_S^{*}(\mathrm{RR2}) \\
&+ \big\{ E_S(\mathrm{RR1})[E_S+E_R]^{*}(\mathrm{RR2}) + E_S(\mathrm{RR2})[E_S+E_R]^{*}(\mathrm{RR1}) \\
&\quad\; + E_R(\mathrm{RR1})E_R^{*}(\mathrm{RR2}) + E_R(\mathrm{RR2})E_R^{*}(\mathrm{RR1}) \\
&\quad\; + E_R(\mathrm{RR1})E_S^{*}(\mathrm{RR2}) + E_R(\mathrm{RR2})E_S^{*}(\mathrm{RR1}) \big\},
\end{aligned}
$$
where E_S(RR1) and E_R(RR1) represent the sample and reference beams reflected from the unrotated retroreflector (RR1), and E_S(RR2) and E_R(RR2) represent the sample and reference beams reflected from the 45-deg rotated retroreflector (RR2), inducing a 90-deg rotation of the interference fringes and the image. The first four elements on the right side of Eq. (1) represent the intensities of the sample and reference beams, and in the spatial-frequency domain they are located around the center of the axes. The next four elements in this equation represent the two cross-correlation pairs of the two FOVs, each reflected from a different retroreflector. These cross-correlation terms are shifted from the center of the spatial-frequency domain due to the off-axis angle between the reference and the sample beams, and each cross-correlation pair is located perpendicularly in the spatial-frequency domain with respect to the other pair, due to the 90-deg rotation induced by the retroreflector orientation.
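For intuition about how the wanted terms of Eq. (1) separate, the short NumPy sketch below simulates, under assumed and arbitrary parameters, two off-axis interferograms with orthogonal carrier fringes, sums them into one multiplexed frame, and confirms that the spectrum contains autocorrelation energy at the origin plus two cross-correlation pairs displaced along orthogonal axes.

```python
import numpy as np

N = 512                                   # simulated frame size (pixels)
y, x = np.mgrid[0:N, 0:N]
fc = 60 / N                               # assumed carrier frequency (cycles/pixel)

# Two hypothetical sample phase maps (one per FOV), smooth test objects only
phi1 = 2.0 * np.exp(-((x - 180)**2 + (y - 256)**2) / (2 * 40**2))
phi2 = 2.0 * np.exp(-((x - 332)**2 + (y - 256)**2) / (2 * 40**2))

E_s1 = np.exp(1j * phi1)                  # sample field from the RR1 channel
E_s2 = np.exp(1j * phi2)                  # sample field from the RR2 channel (other FOV)
E_r = 1.0                                 # plane reference wave, unit amplitude

# Off-axis interferograms with fringes along x and along y, respectively
I1 = np.abs(E_s1 + E_r * np.exp(2j * np.pi * fc * x))**2
I2 = np.abs(E_s2 + E_r * np.exp(2j * np.pi * fc * y))**2

# Multiplexed interferogram: sum of the two channels; the unwanted cross-terms
# in the curly brackets of Eq. (1) are assumed suppressed (e.g., by coherence
# gating, as discussed below)
I_mux = I1 + I2

spectrum = np.abs(np.fft.fftshift(np.fft.fft2(I_mux)))
# Cross-correlation lobes appear near (+/-fc, 0) and (0, +/-fc):
peak_x = spectrum[N // 2, N // 2 + 60]    # lobe of the RR1 channel
peak_y = spectrum[N // 2 + 60, N // 2]    # lobe of the RR2 channel
print(peak_x, peak_y)
```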

The rest of the terms in Eq. (1) (the terms that appear in curly brackets) are cross-terms between the light beams reflected from RR1 and RR2, and should be avoided by using either orthogonal polarizations or coherence gating effects, so that the light from RR1 will not interfere with the light from RR2. For example, if using a broadband illumination source, it is possible to use coherence gating to avoid these unwanted cross-terms by slightly shifting one of the retroreflectors along the optical axis with respect to the other one, so that the light from RR1 will not interfere with the light from RR2 due to an optical path difference that is larger than the coherence length of the source. This axial movement needs to be shorter than the depth of field to prevent defocusing of the image.
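Stated explicitly, and assuming (as we do here) that the axial shift is traversed twice because the light travels to the shifted retroreflector and back, the two requirements above can be summarized as

$$
2\,\Delta z \;>\; l_{c} \quad \text{(coherence gating of the unwanted cross-terms)}, \qquad
\Delta z \;<\; \mathrm{DOF} \quad \text{(negligible defocus between the two channels)},
$$

where Δz is the axial shift of one retroreflector, l_c is the coherence length of the source, and DOF is the depth of field of the imaging system.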

3. Extraction of the Double-Area Complex Wavefront

As shown in Fig. 2, the multiplexed interferogram is created by optically generating two off-axis interferograms on the camera plane at once, where each of them encodes a different imaging wavefront, and one of them is 90 deg rotated with respect to the other one. After acquiring this multiplexed interferogram in a single camera exposure, a digital two-dimensional (2-D) Fourier transform is performed. Subsequently, we select one cross-correlation term from each pair, and both cross-correlation terms are processed separately by a digital 2-D inverse Fourier transform to obtain the complex wavefronts of the two sample FOVs. To compensate for static phase aberrations, another multiplexed interferogram is acquired, but this time without the sample present. Two sample-free complex wavefronts are extracted for the two FOVs, and then each of the complex wavefronts containing the sample information is divided, pixel by pixel, by the corresponding sample-free complex wavefront. For each FOV, the argument of the resulting complex wavefront is a wrapped phase map, which can be digitally unwrapped, if needed, to correct for 2π ambiguities.14 A detailed explanation of the reconstruction process can be found in Refs. 13 and 14. Finally, one of the phase maps needs to be 90-deg rotated to match the orientation of the other phase map. The retroreflector positions can be set so that the two FOVs of the sample, encoded into the multiplexed interferogram, are continuous, so that after the reconstruction, the phase maps can be stitched together into a single extended phase map by using a continuous object appearing in both FOVs as a guide.
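A minimal NumPy sketch of this processing chain is given below, assuming hypothetical crop positions for the two chosen cross-correlation terms and a separately acquired sample-free multiplexed interferogram for aberration compensation; phase unwrapping and sub-pixel registration of the overlap region are left to standard tools (e.g., the unwrap_phase routine of scikit-image) and are only indicated here.

```python
import numpy as np

def extract_fov(interferogram, center, radius):
    """Crop one cross-correlation term and return the complex wavefront of that FOV."""
    spec = np.fft.fftshift(np.fft.fft2(interferogram))
    r, c = center
    crop = spec[r - radius:r + radius, c - radius:c + radius]
    return np.fft.ifft2(np.fft.ifftshift(crop))

def double_fov_phase(mux, mux_ref, center1, center2, radius):
    """Reconstruct and combine the two FOVs encoded in a multiplexed interferogram.

    mux     : multiplexed interferogram with the sample
    mux_ref : sample-free multiplexed interferogram (static aberration reference)
    center1 : spectral location of the cross-correlation term from RR1
    center2 : spectral location of the orthogonal term from RR2
    """
    phases = []
    for center in (center1, center2):
        wf_sample = extract_fov(mux, center, radius)
        wf_empty = extract_fov(mux_ref, center, radius)
        # Pixel-wise division removes the static phase aberrations of the system
        compensated = wf_sample / wf_empty
        phases.append(np.angle(compensated))   # wrapped phase map of this FOV

    # Rotate the RR2 phase map by 90 deg to match the RR1 orientation (the
    # rotation direction depends on the actual setup), then place the two FOVs
    # side by side; fine registration using the overlap region is omitted here.
    phase1, phase2 = phases[0], np.rot90(phases[1])
    return np.concatenate([phase1, phase2], axis=1)
```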

Fig. 2

Digital processing of the multiplexed interferogram into two different quantitative phase maps, which can be stitched together. 2D-FT, two-dimensional Fourier transform; 2D-FT⁻¹, two-dimensional inverse Fourier transform.


4. Experimental Setup

In order to demonstrate the interferometric module capabilities, we integrated it into an off-axis interferometric imaging system designed to operate with a temporally incoherent (broadband) source. The chosen broadband off-axis interferometer, which is used for the experimental demonstrations, is based on diffraction phase microscopy (DPM).15,16 After the transmission microscope, the DPM interferometer uses a diffraction grating to split the beam and maintains the coherence between the first- and zero-diffraction orders, which enables obtaining an off-axis interference pattern on the entire camera plane.

In the transmission microscope, we used a 40×, 0.66-NA, infinity-corrected microscope objective (MO) and a tube lens (L1) with a 150-mm focal length, positioned in a 4f configuration with the MO. As the broadband illumination source of the microscope, we used a supercontinuum fiber-laser source (SC400-4, Fianium), connected to a computer-controlled acousto-optical tunable filter (SC-AOTF, Fianium), which was tuned to a central wavelength of 518 nm with a full width at half maximum bandwidth of 42 nm, as measured by a spectrometer (USB4000-VIS-NIR, Ocean Optics). This is the broadest continuous spectral bandwidth possible with this AOTF, although broader spectral bandwidths can work in this off-axis interferometer as well.15 Under this configuration, the lateral and axial resolution limits with and without the IDIA module were compared and found to be 0.69 and 1.7 μm, respectively, and the measured coherence length was 2.5 μm. As expected and confirmed experimentally, since there is no overlap between the FOVs in the spatial-frequency domain, the resolution limit with and without the IDIA module remains unchanged. In the DPM interferometer, located at the image plane of the microscope, we used a 92-lines/mm blazed diffraction grating, followed by a 4f lens configuration composed of two achromatic lenses with a total magnification of 1.33, L2 (150-mm focal length) and L3 (200-mm focal length). Between the two lenses, at the Fourier plane, a 10-μm pinhole was placed in the beam path of the zero-diffraction order of the grating, in order to filter out the high spatial frequencies and create a reference beam. An additional larger hole in the pinhole mask was designed to pass only the first-diffraction order of the grating without spatial filtering of the image, and to block all other diffraction orders. Typically, the off-axis interferogram would be created on the image plane located on the digital camera (DCC1545M, Thorlabs, containing 1024 × 1280 square pixels of 5.2 μm each). However, right after lens L3 and before the camera, we integrated the proposed module, which allowed FOV doubling in the broadband off-axis imaging interferometer.
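As a consistency check (our own estimate, not a measurement reported above), assuming an approximately Gaussian spectrum, the expected coherence length for the 518-nm central wavelength and 42-nm bandwidth is

$$
l_{c} \approx \frac{2\ln 2}{\pi}\,\frac{\lambda_{0}^{2}}{\Delta\lambda}
= \frac{2\ln 2}{\pi}\cdot\frac{(0.518\ \mu\mathrm{m})^{2}}{0.042\ \mu\mathrm{m}} \approx 2.8\ \mu\mathrm{m},
$$

in reasonable agreement with the measured value of 2.5 μm.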

5. Experimental Results

The first experimental demonstration includes capturing microscopic diatom shells which, under the chosen magnification, are wider than the camera FOV, and thus cannot be recorded in a single exposure in the conventional way. We used the broadband off-axis interferometer presented in Fig. 3. Figure 4(a) presents the recorded multiplexed broadband off-axis interferogram containing the two FOVs, represented by two interference patterns with perpendicular fringe orientations, and recorded using a single camera exposure. After a digital Fourier transform, each of the two orthogonal interference patterns creates a pair of cross-correlation terms in the spatial-frequency domain [Fig. 4(b)], which are processed into two separate quantitative phase maps and stitched together using the digital process elaborated in Sec. 3. The alignment of the retroreflectors in the FOV-doubling module was set so that there was a small overlap between the two FOVs, which helped in stitching them together. Figure 4(c) shows the stitched quantitative phase map of the diatom-shell sample, with a 93% increase in the imaging area compared to using the conventional off-axis imaging setup.

Fig. 3

Attaching the proposed FOV-doubling module to a broadband off-axis interferometer [diffraction phase microscopy (DPM) interferometer]. This interferometer is connected to a transmission microscope. The microscope is illuminated with a supercontinuum (SC) laser, connected to an acousto-optical tunable filter (AOTF), providing a spectral bandwidth of 42 nm. M, mirror; MO, microscope objective; L1, tube lens; L2 and L3, lenses in the DPM interferometer; G, diffraction grating; P, pinhole; RR1, right-angle mirror retroreflector; RR2, 45-deg rotated right-angle mirror retroreflector.


Fig. 4

Microscopic diatom shells, as captured by the broadband off-axis interferometer enhanced with the proposed FOV-doubling module. (a) The multiplexed interferogram captured by the digital camera. In red boxes—a magnified image of the multiplexed interference fringes, and the two off-axis interference fringe patterns composing it. (b) Absolute values of the digital spatial-frequency (Fourier) domain of the interferogram. In white boxes—the chosen cross-correlation terms for the two FOV extractions. (c) The stitched quantitative phase map.


The same system was used to obtain an extended FOV in imaging of human melanoma cancer cells in vitro. Figure 5(a) presents the recorded broadband multiplexed off-axis interferogram containing the two FOVs, as recorded using a single camera exposure. Figure 5(b) shows the stitched quantitative phase map. When a significantly extended FOV of a sample needs to be scanned to obtain high-resolution quantitative imaging analysis of large populations of cells, the proposed technique can speed up the acquisition process and thus increase the throughput of the analysis. Examples of possible applications that can benefit from this technique are quantitative blood smear analysis17 and tissue phase imaging.18 Another relevant example, where high-throughput detailed imaging is needed to enable scanning of extended samples, is full profiling of silicon wafers.19 In these cases, the proposed technique can obtain a full scan of the sample in half of the time usually required, since two sample FOVs are acquired per camera exposure.

Fig. 5

Melanoma cancer cells, as captured by the broadband off-axis interferometer enhanced with the proposed FOV-doubling module. (a) The multiplexed interferogram captured by the digital camera. In red boxes—magnified image of the multiplexed interference fringes, and the two off-axis interference fringe patterns composing it. (b) The stitched quantitative phase map.


Next, we present a quantitative phase imaging demonstration of dynamic blood flow between two coverslips. This experimental demonstration illustrates the dynamic capabilities of the suggested module, being able to capture two interferometric FOVs simultaneously without loss of temporal resolution. The sample was prepared by placing the blood between two coverslips and applying a pressure difference between the two sides of the chamber to induce cell flow. The resulting multiplexed off-axis interferogram acquired by the camera is shown in Fig. 6(a) and Video 1. In Video 1, it is possible to see the simultaneous vertical and horizontal flow directions of the cells, caused by the 90-deg rotation between the two adjacent FOVs. In the bottom-right corner of the video, a selected enlarged area from the multiplexed interference fringe pattern is shown. The final doubled-FOV stitched phase map is shown in Fig. 6(b) and Video 2. Note that due to the short coherence length of the source, thick cells that become vertically oriented while flowing might produce several empty phase points.

Fig. 6

Blood flow between two coverslips, as captured by the broadband off-axis interferometer, enhanced with the proposed FOV doubling module. (a) The recorded multiplexed interferogram (Video 1, MPEG, 2.76 MB) [URL: http://dx.doi.org/10.1117/1.JBO.20.11.111217.1]. In red boxes—magnified image of the multiplexed interference fringes, and the two off-axis interference fringe patterns composing it. (b) The reconstructed stitched phase map (Video 2, MPEG, 0.9 MB) [URL: http://dx.doi.org/10.1117/1.JBO.20.11.111217.2].


6. Conclusions

We have presented a new technique to double the FOV of existing off-axis interferometric imaging systems, including broadband off-axis interferometers. The method is based on the fact that off-axis interferometry leaves an empty space in the spatial-frequency domain, in which additional information from other areas of the sample can be inserted. The encoding is done optically by folding the optical FOV and projecting a multiplexed interferogram, which is composed of two interferograms with orthogonal fringe directions. In conventional off-axis interferometry, there is a decrease in the recorded FOV on the camera due to the higher sampling rate required for capturing the interference carrier fringe frequency, leading to less efficient spatial-bandwidth consumption on the camera. We were able to partially compensate for this inefficiency by better exploiting the spatial-frequency domain, consequently allowing digital recording of a larger FOV with respect to the FOV obtainable by typical off-axis interferometers. In contrast to the previous FOV-doubling implementation,9 the current one does not affect the coherence properties of the existing off-axis imaging interferometer, and thus allows extending the FOV even when using broadband illumination. On the other hand, the extended FOV obtained by the proposed portable module has the same limitations as the FOV obtained directly by the existing interferometer to which the module connects. For example, if the module connects to a highly coherent interferometer, which induces coherent noise, the extended FOV will contain the same types of noise. In addition, the proposed method has two limitations, which depend directly on the interferogram multiplexing operation. The first is that the two interferograms composing the multiplexed interferogram share the same dynamic range of the camera. This can be solved, if needed, by using a camera with a higher dynamic range. Second, the multiplexing operation creates unwanted cross-terms [appearing in curly brackets in Eq. (1)]. To avoid aliasing with the wanted terms, these cross-terms can be eliminated by making sure that the two beams creating each of the cross-terms have a beam-path difference larger than the coherence length of the broadband source used. Alternatively, these cross-terms can be eliminated by ensuring that the beams creating them have orthogonal polarizations. The latter solution is viable even if the module is connected to a highly coherent interferometer, in which the first solution of coherence gating cannot be used.

The method allows parallel acquisition of two FOVs of the sample in a single camera exposure, and thus simultaneously obtains two sample complex wavefronts, from which two quantitative phase maps can be extracted. Since two sample FOVs are simultaneously recorded in a single camera exposure, the proposed technique is useful for gaining higher throughput in high-resolution quantitative phase microscopy when scanning extended samples, such as blood smears, since the full scan of the sample now takes half the time. In addition, the technique is useful for quantitative phase microscopy of dynamic processes over a doubled FOV.

Our experimental demonstrations illustrate the advantages of the method for quantitative phase imaging of biological cells. However, this technique is also relevant for FOV doubling in off-axis holographic and interferometric metrology of thin elements, such as wafer patterns, and for holographic imaging of macroscale objects.

Acknowledgments

This research was supported by the FP7 Marie Curie Career Integration Grant (CIG) No. 303559. We thank Dr. Ksawery Kalinowski for useful remarks.

References

1. M. Mir et al., "Label-free characterization of emerging human neuronal networks," Sci. Rep. 4, 4434 (2014). http://dx.doi.org/10.1038/srep04434

2. W. Choi et al., "Tomographic phase microscopy," Nat. Methods 4(9), 717–719 (2007). http://dx.doi.org/10.1038/nmeth1078

3. J. Di et al., "High resolution digital holographic microscopy with a wide field of view based on a synthetic aperture technique and use of linear CCD scanning," Appl. Opt. 47(30), 5654 (2008). http://dx.doi.org/10.1364/AO.47.005654

4. Y. Bishitz et al., "Optical-mechanical signatures of cancer cells based on fluctuation profiles measured by interferometry," J. Biophotonics 7(8), 624–630 (2014). http://dx.doi.org/10.1002/jbio.201300019

5. P. Girshovitz and N. T. Shaked, "Doubling the field of view in off-axis low-coherence interferometric imaging," Light Sci. Appl. 3(3), e151 (2014). http://dx.doi.org/10.1038/lsa.2014.32

6. C. J. Mann et al., "High-resolution quantitative phase-contrast microscopy by digital holography," Opt. Express 13(22), 8693 (2005). http://dx.doi.org/10.1364/OPEX.13.008693

7. C. J. Mann, L. Yu, and M. K. Kim, "Movies of cellular and sub-cellular motion by digital holographic microscopy," Biomed. Eng. Online 5(1), 21 (2006). http://dx.doi.org/10.1186/1475-925X-5-21

8. M. Paturzo et al., "Super-resolution in digital holography by a two-dimensional dynamic phase grating," Opt. Express 16(21), 17107 (2008). http://dx.doi.org/10.1364/OE.16.017107

9. A. Finizio et al., "A digital holographic microscope for complete characterization of microelectromechanical systems," Meas. Sci. Technol. 15(3), 529–539 (2004). http://dx.doi.org/10.1088/0957-0233/15/3/005

10. N. T. Shaked, "Quantitative phase microscopy of biological samples using a portable interferometer," Opt. Lett. 37(11), 2016–2018 (2012). http://dx.doi.org/10.1364/OL.37.002016

11. P. Girshovitz and N. T. Shaked, "Compact and portable low-coherence interferometer with off-axis geometry for quantitative phase microscopy and nanoscopy," Opt. Express 21(5), 5701–5714 (2013). http://dx.doi.org/10.1364/OE.21.005701

12. M. A. Herráez et al., "Fast two-dimensional phase-unwrapping algorithm based on sorting by reliability following a noncontinuous path," Appl. Opt. 41(35), 7437 (2002). http://dx.doi.org/10.1364/AO.41.007437

13. P. Girshovitz and N. T. Shaked, "Real-time quantitative phase reconstruction in off-axis digital holography using multiplexing," Opt. Lett. 39(8), 2262 (2014). http://dx.doi.org/10.1364/OL.39.002262

14. P. Girshovitz and N. T. Shaked, "Fast phase processing in off-axis holography using multiplexing with complex encoding and live-cell fluctuation map calculation in real-time," Opt. Express 23(7), 8773–8787 (2015). http://dx.doi.org/10.1364/OE.23.008773

15. Y. Park et al., "Diffraction phase microscopy," TuI50 (2006).

16. B. Bhaduri et al., "Diffraction phase microscopy with white light," Opt. Lett. 37(6), 1094 (2012). http://dx.doi.org/10.1364/OL.37.001094

17. M. Mir, K. Tangella, and G. Popescu, "Blood testing at the single cell level using quantitative phase and amplitude microscopy," Biomed. Opt. Express 2(12), 3259–3266 (2011).

18. Z. Wang et al., "Tissue refractive index as marker of disease," J. Biomed. Opt. 16(11), 116017 (2011). http://dx.doi.org/10.1117/1.3656732

19. R. Zhou et al., "Detecting 20 nm wide defects in large area nanopatterns using optical interferometric microscopy," Nano Lett. 13(8), 3716–3721 (2013).

Biography

Pinhas Girshovitz is a full-time, direct-track PhD student in the Department of Biomedical Engineering at Tel Aviv University, Israel. He holds a BSc (with honors) and an MSc in biomedical engineering from Tel Aviv University. In July 2011, he started working on his PhD thesis, under the supervision of Dr. Natan T. Shaked, titled "Advanced techniques for compact interferometric systems for measurements in biological cells." During his PhD work, he coauthored more than 10 refereed journal papers.

Irena Frenklach is an MSc student in the Department of Biomedical Engineering at Tel Aviv University. She finished her BSc in biomedical engineering at Tel Aviv University in September 2012, majoring in biomedical signals and systems. In October 2012, she started her MSc thesis, under the supervision of Dr. Natan T. Shaked, titled "Development of analytic tools and new parameters based on interferometric methods for phase measurement in biological cells." During her MSc work, she coauthored two refereed journal papers.

Natan T. Shaked is a senior lecturer in the Department of Biomedical Engineering at Tel Aviv University, Israel. His research subjects include optical microscopy, nanoscopy, and interferometry for biomedical applications. He is the coauthor of more than 45 refereed journal papers and 70 conference papers (including 16 invited papers), and several book chapters, patents, and an edited book.

© 2015 Society of Photo-Optical Instrumentation Engineers (SPIE)
Pinhas Girshovitz, Irena Frenklach, and Natan T. Shaked "Broadband quantitative phase microscopy with extended field of view using off-axis interferometric multiplexing," Journal of Biomedical Optics 20(11), 111217 (6 October 2015). https://doi.org/10.1117/1.JBO.20.11.111217
Published: 6 October 2015
Keywords: interferometry, multiplexing, cameras, interferometers, microscopy, imaging systems, retroreflectors.
