Digital holography (DH) is a technique for reconstructing the amplitude and phase images of a sample by calculating the wavefront propagation from an interference image. Although DH enables three-dimensional shape measurement based on the phase images, the axial dynamic range of single-optical-wavelength DH is limited to less than a full or half optical wavelength by the phase-wrapping ambiguity. To extend the axial range beyond the optical wavelength, synthetic-wavelength DH has been proposed: DH is performed at two different optical wavelengths, and the synthetic wavelength generated between them is used. However, use of a single, longer synthetic wavelength degrades the axial resolution, because the resolution is limited by phase noise and scales with the wavelength. To extend the axial dynamic range, one therefore has to increase the axial range while maintaining sub-wavelength axial resolution. One promising approach is cascade linking of multiple synthetic wavelengths of different orders. In this paper, we present multi-cascade-linked synthetic-wavelength DH using an optical-comb-referenced frequency synthesizer (OFS). The OFS is a tunable external-cavity laser diode phase-locked to an optical frequency comb, and it effectively generates multiple synthetic wavelengths within the range of 32 µm to 1.20 m. A multiple cascade link of the phase images among one optical wavelength and five different synthetic wavelengths enables shape measurement of a reflective, millimeter-sized stepped surface with an axial resolution of 34 nm.
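To illustrate the principle, a minimal sketch of synthetic-wavelength generation and one step of cascade linking is given below. The wavelength values, noise level, and function names are hypothetical assumptions for illustration, not the setup or algorithm of the paper:

```python
# Sketch of synthetic-wavelength phase unwrapping (hypothetical values;
# not the paper's actual wavelengths, cascade orders, or algorithm).
import numpy as np

def synthetic_wavelength(lam1, lam2):
    """Synthetic wavelength generated from two optical wavelengths."""
    return lam1 * lam2 / abs(lam1 - lam2)

def cascade_unwrap(height_coarse, phase_fine, lam_fine):
    """One cascade-link step: a coarse height estimate (from a longer
    synthetic wavelength) fixes the integer fringe order at the finer
    wavelength, which then refines the height."""
    half = lam_fine / 2.0  # unambiguous range in reflection geometry
    frac = phase_fine / (2 * np.pi)          # fractional fringe order
    order = np.round(height_coarse / half - frac)
    return (order + frac) * half

# Hypothetical optical wavelengths [m]; their synthetic wavelength
# (~46.5 um) extends the unambiguous axial range far beyond ~0.8 um.
lam1, lam2 = 1.55e-6, 1.50e-6
lam_syn = synthetic_wavelength(lam1, lam2)
```

Repeating `cascade_unwrap` down a chain of progressively shorter synthetic wavelengths, and finally the optical wavelength itself, keeps the large range of the longest wavelength while recovering the resolution of the shortest.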
We present a simple method, based on the acquisition of a back-illuminated pinhole, for estimating the point spread function (PSF) for CCD (or CMOS) sensor characterization. The method is used to measure the variations in sensitivity across 2D sensor arrays. The experimental results show that the sensitivity varies with position on the CCD of the calibrated camera, and that the error of the pixel optical center with respect to the geometrical center is on the order of 1/10th of a pixel. We argue that this pixel error most probably arises from the coherence of the laser light used, or possibly from defects in the shape, surface quality, or optical performance of the microlenses, or from non-uniformity of their parameters across the wafer. This may have significant consequences for coherent-light imaging with CCD (or CMOS) sensors, such as Particle Image Velocimetry.
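As a rough illustration of the sub-pixel centering step, the sketch below estimates a PSF center by intensity-weighted centroiding of a pinhole image. The synthetic Gaussian spot and all names are assumptions for illustration; the paper's actual acquisition and calibration procedure is not reproduced here:

```python
# Sketch of sub-pixel PSF-center estimation from a pinhole image
# (synthetic Gaussian spot; hypothetical, not the paper's data).
import numpy as np

def psf_centroid(img):
    """Intensity-weighted centroid (row, col) of a PSF image."""
    img = np.asarray(img, dtype=float)
    rows, cols = np.indices(img.shape)
    total = img.sum()
    return (rows * img).sum() / total, (cols * img).sum() / total

# Synthetic PSF: Gaussian spot with true center at (4.3, 5.6) pixels.
yy, xx = np.indices((9, 12))
psf = np.exp(-((yy - 4.3) ** 2 + (xx - 5.6) ** 2) / (2 * 1.0 ** 2))
cy, cx = psf_centroid(psf)
# The offset of (cy, cx) from the geometrical pixel center gives the
# sub-pixel optical-center error discussed in the text.
```

Comparing such centroids across many pixel positions would map the ~1/10-pixel optical-center errors reported above.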