Traditional imaging systems directly image a 2D object plane onto the sensor. Plenoptic imaging systems contain a
lenslet array at the conventional image plane and a sensor at the back focal plane of the lenslet array. In this
configuration the data captured at the sensor is not a direct image of the object. Each lenslet effectively images the
aperture of the main imaging lens at the sensor. Therefore the sensor data retains angular light-field information which
can be used for a posteriori digital computation of multi-angle images and axially refocused images. If a filter array,
containing spectral, neutral-density, or polarization filters, is placed at the pupil aperture of the main imaging
lens, then each lenslet images the filters onto the sensor. This enables digital separation of the multiple filter modalities,
giving single-snapshot, multi-modal images. The diversity of potential applications of plenoptic systems is driving
increasing investigation of them. As the application space moves towards microscopes and other complex systems, and as
pixel sizes become smaller, the consideration of diffraction effects in these systems becomes increasingly important. We
discuss a plenoptic system and its wave propagation analysis for both coherent and incoherent imaging. We simulate a
system response using our analysis and discuss various applications of the system response pertaining to plenoptic
system design, implementation, and calibration.
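As a concrete illustration of how the sensor data retains angular light-field information, the following minimal sketch (array dimensions and variable names are illustrative assumptions, not those of any specific system) reshapes a simulated plenoptic raw image into a four-dimensional light field and extracts a single sub-aperture view, i.e. the image formed by light passing through one region of the main-lens aperture:

```python
import numpy as np

# Hypothetical plenoptic raw image: each lenslet covers a (u_res x v_res)
# block of sensor pixels that images the main-lens aperture.
n_lenslets_y, n_lenslets_x = 64, 64   # lenslet grid (spatial samples)
u_res, v_res = 5, 5                   # pixels under each lenslet (angular samples)

rng = np.random.default_rng(0)
raw = rng.random((n_lenslets_y * u_res, n_lenslets_x * v_res))

# Reshape so the angular coordinates (u, v) under each lenslet are explicit:
# lightfield[y, x, u, v] = intensity at lenslet (y, x), aperture position (u, v).
lightfield = raw.reshape(n_lenslets_y, u_res, n_lenslets_x, v_res).transpose(0, 2, 1, 3)

# A sub-aperture view: the same pixel under every lenslet, giving one
# of the multi-angle images that can be computed a posteriori.
view_center = lightfield[:, :, u_res // 2, v_res // 2]
print(view_center.shape)  # (64, 64)
```

Shifting and summing such views over (u, v) is the basic recipe for the axially refocused images mentioned above.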
Plenoptic cameras are designed to capture different combinations of light rays from a scene, sampling its lightfield.
By capturing directional ray information, such camera designs enable applications such as digital refocusing, rotation,
or depth estimation. Only a few designs address capturing spectral information of the scene. It has been demonstrated
that, by modifying a plenoptic camera with a filter array containing different spectral filters inserted in the
pupil plane of the main lens, the spectral dimension of the plenoptic function can be sampled. As a
result, the plenoptic camera is turned into a single-snapshot multispectral imaging system that trades off spatial
for spectral information captured with a single sensor. Little work has so far analyzed the effects of
diffraction and aberrations of the optical system on the performance of the spectral imager. In this paper
we demonstrate simulation of a spectrally coded plenoptic camera optical system via wave propagation analysis,
evaluate the quality of the spectral measurements captured at the detector plane, and demonstrate opportunities for
optimization of the spectral mask for a few sample applications.
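A basic building block of such a wave propagation analysis can be sketched with the standard angular-spectrum method; the grid size, sampling, wavelength, and aperture radius below are illustrative placeholders, not the parameters of the simulated camera:

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a sampled complex field a distance z using the
    angular-spectrum method (a standard free-space propagator;
    all parameters here are illustrative, not the paper's values)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)          # spatial-frequency grid
    FX, FY = np.meshgrid(fx, fx)
    # Free-space transfer function; evanescent components are suppressed.
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    H = np.exp(1j * 2 * np.pi / wavelength * z * np.sqrt(np.maximum(arg, 0.0)))
    H[arg < 0] = 0.0
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Example: propagate a plane wave through a circular aperture by 1 mm.
n, dx, wavelength = 256, 2e-6, 550e-9     # samples, pixel pitch, wavelength (m)
x = (np.arange(n) - n / 2) * dx
X, Y = np.meshgrid(x, x)
aperture = (X**2 + Y**2 < (100e-6) ** 2).astype(complex)
out = angular_spectrum_propagate(aperture, wavelength, dx, 1e-3)
```

Chaining such propagations with the phase screens of the main lens, filter array, and lenslet array yields a diffraction-aware model of the detector-plane measurements, which is the kind of simulation the abstract describes.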
The lateral resolution of an imaging system is limited by its numerical aperture and the wavelength.
Structured illumination incident on the object heterodynes the higher spatial frequencies of the object with the
spatial frequency of the sinusoidal illumination into the passband of the imaging system, providing lateral
superresolution. This idea has been implemented in microscopy. Multiple images of an object are taken, with
distinct phase shifts in the sinusoidally patterned illumination. These images are processed to separate the
conventional, un-aliased object spatial frequencies from the aliased ones, which contain superresolution
information. The separated aliased terms are de-aliased (i.e. the spatial frequencies in them are moved to their
correct locations in Fourier space) giving superresolution along the direction perpendicular to the orientation
of the sinusoidal fringe pattern. This process is repeated with, say, 60° and 120° rotation of the sinusoidal
fringe illumination to obtain superresolution in all directions. The final reconstructed image can be obtained
by appropriate combination of the de-aliased superresolution components with the conventional, un-aliased
components. We discuss the signal-to-noise ratio (SNR) and optical transfer function (OTF) compensation in
the combination of all these components to obtain an image with lateral superresolution.
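The separation of the un-aliased component from the two aliased ones can be illustrated with a minimal one-dimensional, noise-free sketch (the simplified imaging model, phase values, and variable names are assumptions for illustration; a real reconstruction would also include the OTF compensation and SNR-weighted combination discussed above):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 256
x = np.arange(n)
sample = rng.random(n)               # object (no blur, for clarity)
k0 = 10.0 / n                        # illumination frequency (cycles/pixel)
phases = np.array([0.0, 2 * np.pi / 3, 4 * np.pi / 3])

# Three raw images under sinusoidal illumination with shifted phase.
images = [(1 + np.cos(2 * np.pi * k0 * x + p)) * sample for p in phases]

# In Fourier space each image mixes the un-aliased component C0 with the
# two aliased components C+ and C-:  I_n = C0 + (1/2)e^{+ip_n} C+ + (1/2)e^{-ip_n} C-.
M = np.stack([np.ones(3),
              0.5 * np.exp(1j * phases),
              0.5 * np.exp(-1j * phases)], axis=1)
I_ft = np.stack([np.fft.fft(im) for im in images])
components = np.linalg.solve(M, I_ft)    # rows: C0, C+, C-

# C0 matches the spectrum of the unmodulated sample; C+ and C- carry the
# frequency-shifted (aliased) superresolution information to be de-aliased.
print(np.allclose(components[0], np.fft.fft(sample)))  # True
```

De-aliasing then shifts C+ and C- by ∓k0 in Fourier space before the weighted recombination; repeating the procedure at the rotated fringe orientations extends the support in all directions.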