Paper
Infrared polarization image fusion via multi-scale sparse representation and pulse coupled neural network
18 December 2019
Proceedings Volume 11338, AOPC 2019: Optical Sensing and Imaging Technology; 113382A (2019) https://doi.org/10.1117/12.2547563
Event: Applied Optics and Photonics China (AOPC2019), 2019, Beijing, China
Abstract
Infrared polarization (IRP) images and infrared intensity (IRI) images contain both common information and unique information. To address two shortcomings of IRP/IRI image fusion, (1) loss of detail information and (2) poor discrimination of information in the fused image, a fusion method based on multi-scale sparse representation and a pulse coupled neural network (PCNN) is proposed. The method combines non-local means (NLM) multi-scale decomposition with sparse representation of the images and an adaptive PCNN. First, the NLM filter is used to obtain the image information of each source image at different scales. Second, a non-subsampled directional filter bank (NSDFB) is used to decompose the high-frequency information at each scale into multiple high-frequency directional sub-bands. For each high-frequency directional sub-band, the spatial frequency (SF) is first computed and used as the stimulus of the PCNN, which then fuses the high-frequency sub-bands according to their significance; the link strength of the PCNN is adaptively adjusted by the region variance. Then, the joint matrix composed of the low-frequency components is trained by the K-singular value decomposition (K-SVD) method to obtain a redundant dictionary. Common information and unique information are distinguished by the positions of the non-zero entries in the sparse coefficients and are fused with different rules. Finally, the fused high- and low-frequency sub-bands are inversely transformed to obtain the fused image. Experimental results demonstrate that the proposed fusion algorithm not only highlights the common information of the source images but also retains their unique information. The fused image also has higher contrast and richer detail. In addition, the fused image performs well in terms of average gradient (AG), edge intensity (EI), information entropy (IE), standard deviation (STD), spatial frequency (SF), and image definition (IDEF).
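The high-frequency fusion rule described in the abstract (a spatial-frequency stimulus feeding a PCNN whose link strength is set adaptively by the region variance) can be illustrated in code. The following is a minimal sketch, not the authors' implementation: it assumes a simplified PCNN model, a 3x3 linking kernel, arbitrary decay constants and iteration count, and hypothetical function names (pcnn_firing_map, fuse_highpass); the NLM/NSDFB decomposition and the K-SVD low-frequency stage are omitted.

```python
import numpy as np
from scipy.ndimage import uniform_filter, convolve

def local_variance(img, size=3):
    """Per-pixel variance in a size x size window (used as the adaptive link strength)."""
    mean = uniform_filter(img, size)
    mean_sq = uniform_filter(img ** 2, size)
    return np.clip(mean_sq - mean ** 2, 0.0, None)

def sf_map(band, size=3):
    """Local spatial frequency: sqrt of the windowed mean of squared row/column differences."""
    rf2 = uniform_filter(np.pad(np.diff(band, axis=1) ** 2, ((0, 0), (0, 1))), size)
    cf2 = uniform_filter(np.pad(np.diff(band, axis=0) ** 2, ((0, 1), (0, 0))), size)
    return np.sqrt(rf2 + cf2)

def pcnn_firing_map(stimulus, beta, iterations=200,
                    alpha_l=1.0, alpha_theta=0.2, v_l=1.0, v_theta=20.0):
    """Simplified PCNN: returns the accumulated firing count of each neuron.

    The feeding input is held at the external stimulus; only the linking input
    and the dynamic threshold evolve (a common simplification in fusion papers).
    """
    w = np.array([[0.5, 1.0, 0.5],
                  [1.0, 0.0, 1.0],
                  [0.5, 1.0, 0.5]])          # linking kernel (assumed)
    f = stimulus.astype(float)               # feeding input = stimulus
    l = np.zeros_like(f)                     # linking input
    theta = np.ones_like(f)                  # dynamic threshold
    y = np.zeros_like(f)                     # pulse output
    fire_count = np.zeros_like(f)
    for _ in range(iterations):
        l = np.exp(-alpha_l) * l + v_l * convolve(y, w, mode='constant')
        u = f * (1.0 + beta * l)             # internal activity, modulated by link strength
        y = (u > theta).astype(float)        # neuron fires when activity exceeds threshold
        theta = np.exp(-alpha_theta) * theta + v_theta * y
        fire_count += y
    return fire_count

def fuse_highpass(band_a, band_b, window=3):
    """Pick, per pixel, the high-frequency coefficient whose PCNN neuron fires more often."""
    beta_a = local_variance(band_a, window)  # adaptive link strength from region variance
    beta_b = local_variance(band_b, window)
    fire_a = pcnn_firing_map(sf_map(band_a, window), beta_a)
    fire_b = pcnn_firing_map(sf_map(band_b, window), beta_b)
    return np.where(fire_a >= fire_b, band_a, band_b)

# Example with hypothetical directional sub-bands of the IRP and IRI images:
# fused_band = fuse_highpass(band_irp, band_iri)
```

The choice of the spatial frequency as the PCNN stimulus follows the significance measure named in the abstract; the specific constants and kernel weights above are placeholders for illustration only.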
© (2019) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Jiajia Zhang, Huixin Zhou, Shun Wei, and Wei Tan "Infrared polarization image fusion via multi-scale sparse representation and pulse coupled neural network", Proc. SPIE 11338, AOPC 2019: Optical Sensing and Imaging Technology, 113382A (18 December 2019); https://doi.org/10.1117/12.2547563
KEYWORDS
Image fusion
Infrared imaging
Infrared radiation
Polarization
Associative arrays
Neural networks
Information fusion