Neural network enables ultrathin flat optics imaging in full color
Abstract

The article comments on a recently developed neural network that enables ultrathin flat optics imaging in full color.

Alongside advancements in computational design tools and nanofabrication technology, the field of meta-optics has grown rapidly over the past decade. Meta-optics have the potential to dramatically miniaturize imaging systems for applications including endoscopy, autonomous drones and vehicles, and consumer electronics.

However, two key challenges have limited imaging quality with meta-optics: chromatic aberrations and limited field of view (FoV). Recent works have made great progress towards solving these issues separately, but simultaneous achievement of full-color and wide FoV imaging has remained elusive. For chromatic aberration correction, one demonstrated approach is to engineer the dispersion of the meta-atoms to compensate for chromatic aberration.1 However, meta-optics employing this approach are typically limited to either small apertures or narrow FoV, because of the limited group delay and group delay dispersion that can be obtained by engineering the meta-atoms alone.2 Similarly, meta-optics designed for wide FoV operation have been highly successful under monochromatic illumination, but not under broadband illumination.3,4
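To see where this limit comes from, consider the standard phase profile that an ideal achromatic metalens of focal length F must impart at radius r and angular frequency ω (a textbook relation, stated here for context rather than taken from Ref. 5):

\varphi(r,\omega) = -\frac{\omega}{c}\left(\sqrt{r^{2}+F^{2}}-F\right), \qquad \frac{\partial\varphi}{\partial\omega} = -\frac{1}{c}\left(\sqrt{r^{2}+F^{2}}-F\right).

The required group delay \partial\varphi/\partial\omega grows toward the edge of the lens, so a larger aperture or higher numerical aperture demands a larger range of group delay across the device. Because a single layer of meta-atoms of fixed height can supply only a bounded group delay, the achievable combination of aperture, numerical aperture, and bandwidth is fundamentally constrained, as quantified in Ref. 2.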

In a recent report,5 Yan Liu, Jian-Wen Dong, and their team address these issues by pairing a monochromatic wide FoV metalens with a neural network-based image reconstruction technique, simultaneously achieving wide FoV and full-color imaging in a compact meta-optical system. Several recent studies have reported high-quality full-color imaging via a meta-optic combined with neural network-based reconstruction algorithms, but only over a modest 30°–40° FoV.6,7 To increase the FoV, this new study combines a wide FoV metalens with a transformer-based neural network.

To design the wide FoV metalens, they adopt the demonstrated approach of combining the metalens with an external aperture (located on the back of the metasurface substrate) to control the incident illumination, so that each section of the metalens can be optimized for a particular angle of incidence. The wide FoV performance of this metalens is nearly diffraction-limited at the design wavelength of 532 nm; at other wavelengths, however, the images are dim and unfocused. These issues are corrected using the computational reconstruction (Fig. 1).
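As a rough ray-optics illustration of why a back-side aperture lets each field angle address its own region of the metasurface, the short sketch below traces a chief ray through an assumed flat substrate; the thickness and refractive index are placeholder values, not parameters from Ref. 5.

import numpy as np

def illuminated_zone_center(theta_deg, t_sub_um=500.0, n_sub=1.46):
    """Lateral offset (in micrometers) of the beam center on the metasurface
    for a chief ray entering the back-side aperture at angle theta_deg.
    Substrate thickness and index are illustrative assumptions only."""
    theta_in = np.arcsin(np.sin(np.deg2rad(theta_deg)) / n_sub)  # refraction at the back surface
    return t_sub_um * np.tan(theta_in)  # lateral walk-off while crossing the substrate

# Larger field angles illuminate laterally shifted zones of the metasurface,
# so each zone can be optimized for "its own" angle of incidence.
for angle in (0, 20, 40, 60):
    print(f"{angle:2d} deg -> zone center at {illuminated_zone_center(angle):.0f} um")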

Fig. 1. The full-color meta-camera reported in Ref. 5. A wide FoV metalens and a transformer neural network synergistically achieve high-quality full-color imaging in the visible band.

A key innovation in this work is the use of a knowledge-fused, data-driven transformer network to computationally restore the images. Transformer neural networks have recently garnered attention for their success in modeling long-range dependencies in deep learning tasks. Wide FoV metalenses exhibit complex, spatially varying aberrations that are difficult to correct with traditional neural network architectures, such as convolutional neural networks that perform only local convolutions with a kernel. The authors hypothesize that the transformer architecture, by contrast, can model long-range dependencies between the unfocused off-axis point spread functions and the central bright speckle. To highlight the effectiveness of the demonstrated transformer-based reconstruction, the authors compare the imaging performance of the bare metalens, the transformer-based reconstruction, several traditional image enhancement algorithms, and convolutional neural networks.
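To make that architectural contrast concrete, here is a minimal PyTorch sketch (purely illustrative, and far simpler than the knowledge-fused network of Ref. 5) in which every image patch attends to every other patch, so global context can inform the correction of locally unfocused regions, whereas a plain convolution mixes only a small neighborhood.

import torch
import torch.nn as nn

class TinyAttentionRestorer(nn.Module):
    """Toy restorer: patchify an RGB image, let all patches attend to each
    other, and apply the result as a residual correction to the raw capture."""
    def __init__(self, channels=16, patch=8):
        super().__init__()
        self.embed = nn.Conv2d(3, channels, kernel_size=patch, stride=patch)   # patchify
        self.attn = nn.MultiheadAttention(channels, num_heads=4, batch_first=True)
        self.unembed = nn.ConvTranspose2d(channels, 3, kernel_size=patch, stride=patch)

    def forward(self, x):
        tokens = self.embed(x)                        # (B, C, H/p, W/p)
        b, c, h, w = tokens.shape
        seq = tokens.flatten(2).transpose(1, 2)       # (B, N, C) token sequence
        mixed, _ = self.attn(seq, seq, seq)           # global, long-range mixing in one layer
        mixed = mixed.transpose(1, 2).reshape(b, c, h, w)
        return x + self.unembed(mixed)                # residual correction of the raw image

# A plain convolution, by contrast, only mixes a small local neighborhood:
local_only = nn.Conv2d(3, 3, kernel_size=3, padding=1)

blurry = torch.rand(1, 3, 64, 64)                     # stand-in for a raw metalens capture
restored = TinyAttentionRestorer()(blurry)
print(restored.shape)                                 # torch.Size([1, 3, 64, 64])

The only point of the sketch is the receptive field: self-attention couples distant patches in a single layer, which is the property the authors exploit to handle spatially varying aberrations across a wide FoV.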

High-quality full-color imaging has been a long-standing goal of meta-optics, and this work represents a significant step towards that goal. Beyond the simultaneous achievement of wide FoV and full-color imaging with meta-optics, this work illustrates the utility of state-of-the-art neural network-based image processing techniques for achieving multi-functional imaging with meta-optics.

References

1. W. T. Chen, A. Y. Zhu, and F. Capasso, "Flat optics with dispersion-engineered metasurfaces," Nat. Rev. Mater. 5, 604–620 (2020). https://doi.org/10.1038/s41578-020-0203-3

2. F. Presutti and F. Monticone, "Focusing on bandwidth: achromatic metalens limits," Optica 7, 624–631 (2020). https://doi.org/10.1364/OPTICA.389404

3. M. Y. Shalaginov et al., "Single-element diffraction-limited fisheye metalens," Nano Lett. 20, 7429–7437 (2020). https://doi.org/10.1021/acs.nanolett.0c02783

4. A. Martins et al., "On metalenses with arbitrarily wide field of view," ACS Photonics 7, 2073–2079 (2020). https://doi.org/10.1021/acsphotonics.0c00479

5. Y. Liu et al., "Ultra-wide FoV meta-camera with transformer-neural-network color imaging methodology," Adv. Photonics 6, 056001 (2024). https://doi.org/10.1117/1.AP.6.5.056001

6. E. Tseng et al., "Neural nano-optics for high-quality thin lens imaging," Nat. Commun. 12, 6493 (2021). https://doi.org/10.1038/s41467-021-26443-0

7. J. E. Froch et al., "Beating bandwidth limits for large aperture broadband nano-optics," (2024).
CC BY: © The Authors. Published by SPIE and CLP under a Creative Commons Attribution 4.0 International License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Anna Wirth-Singh, Johannes Froch, and Arka Majumdar "Neural network enables ultrathin flat optics imaging in full color," Advanced Photonics 6(5), 050502 (17 September 2024). https://doi.org/10.1117/1.AP.6.5.050502
Keywords: neural networks, color imaging, flat optics, image restoration, reconstruction algorithms, transformers, color
