Paper
24 October 2017
Visual and infrared image fusion algorithm based on adaptive PCNN
Proceedings Volume 10462, AOPC 2017: Optical Sensing and Imaging Technology and Applications; 1046239 (2017) https://doi.org/10.1117/12.2285118
Event: Applied Optics and Photonics China (AOPC2017), 2017, Beijing, China
Abstract
As a third-generation artificial neural network, the pulse coupled neural network (PCNN) incorporates the neurobiological characteristics of temporal coding and spatial accumulation, which give it advantages unmatched by traditional artificial neural networks and broad application prospects in image fusion. In recent years, improving the traditional model and adaptively adjusting its key parameters have gradually become major research focuses. This paper presents a novel visual and infrared image fusion algorithm based on a modified PCNN model whose key parameter, the linking strength, is calculated adaptively from the characteristics of the input images.

Firstly, the modified PCNN improves the traditional model with an index map and a threshold look-up table. The threshold look-up table records the threshold corresponding to each iteration layer of the modified model. Because these thresholds can be calculated before the model starts to iterate, the computational burden the traditional model spends obtaining thresholds is removed and the computing speed of the modified model improves. The index map records the firing time of each pixel of the input image as the modified PCNN iterates; its values represent the integration of similar pixels within a spatial neighborhood of the input image and thus reflect the image's global visual features.

Then, an auxiliary method is used to compute the linking strength of the modified PCNN model. The linking strength represents the degree to which the linking input modulates the feeding input of the current neuron; if its value is decided according to the specific characteristics of the input images, better fusion performance should be obtained in theory. Since the visual image carries more detail information about the target and the infrared image carries more of the target's energy characteristics, the proposed method ties the linking strength to local entropy for the visual image and to local energy for the infrared image.

Finally, the original visual and infrared images are processed with the modified PCNN model using the linking strengths calculated above, and fusion rules based on the index maps of the visual and infrared images produce the fused image.

To evaluate the performance of the proposed method, a large number of experiments were conducted: typical image sets used in many related papers were processed with the proposed method and with the wavelet transform separately, and the resulting fusion images were evaluated with subjective and objective criteria, including the average, the standard deviation, and the spatial frequency. The average is the mean gray level of the pixels; the standard deviation measures how widely the gray levels spread around the average; and the spatial frequency measures the amount of detail in the image. The calculated results show that, compared with methods such as the wavelet transform, the proposed method improves the objective criteria significantly.
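The abstract does not give the model equations, so the following Python sketch shows one plausible reading of the pipeline: a standard simplified PCNN (feeding input equal to the stimulus, a 3x3 mean linking field, exponentially decaying thresholds) combined with the abstract's threshold look-up table and index map, linking strength taken from local entropy for the visual image and local energy for the infrared image, and an illustrative earlier-firing-wins fusion rule. All function names, window sizes, and constants here are hypothetical, and images are assumed to be floats normalized to [0, 1]; the paper's exact model and fusion rules may differ.

import numpy as np
from scipy.ndimage import uniform_filter, generic_filter

def _window_entropy(window):
    # Local entropy of one window (linking strength for the visual image).
    hist, _ = np.histogram(window, bins=8, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def linking_strength(img, kind, size=9):
    # Visual image: local entropy (detail); infrared image: local energy.
    if kind == "visual":
        beta = generic_filter(img, _window_entropy, size=size)
    else:  # "infrared"
        beta = uniform_filter(img ** 2, size=size)
    return beta / (beta.max() + 1e-12)  # normalize to [0, 1]

def pcnn_index_map(S, beta, n_iters=50, alpha=0.1, theta0=1.0):
    # Threshold look-up table: the threshold for every iteration layer is
    # precomputed, as the abstract describes, instead of being updated
    # neuron by neuron inside the loop.
    theta_lut = theta0 * np.exp(-alpha * np.arange(n_iters))
    Y = np.zeros_like(S)                      # pulse output of the previous layer
    index_map = np.zeros(S.shape, dtype=int)  # first firing layer (0 = never fired)
    for n in range(n_iters):
        L = uniform_filter(Y, size=3)         # linking input from neighboring pulses
        U = S * (1.0 + beta * L)              # linking modulates the feeding input
        Y = (U > theta_lut[n]).astype(float)
        newly = (Y > 0) & (index_map == 0)
        index_map[newly] = n + 1              # record each pixel's firing time
    return index_map

def fuse(vis, ir):
    # Illustrative index-map fusion rule: the earlier-firing (stronger) neuron wins.
    idx_v = pcnn_index_map(vis, linking_strength(vis, "visual"))
    idx_i = pcnn_index_map(ir, linking_strength(ir, "infrared"))
    iv = np.where(idx_v == 0, np.iinfo(int).max, idx_v)
    ii = np.where(idx_i == 0, np.iinfo(int).max, idx_i)
    return np.where(iv <= ii, vis, ir)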
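The three objective criteria named in the abstract have conventional definitions in the image-fusion literature; a minimal sketch:

import numpy as np

def average(img):
    return float(img.mean())   # mean gray level

def standard_deviation(img):
    return float(img.std())    # spread of gray levels about the mean

def spatial_frequency(img):
    img = img.astype(float)
    rf = np.sqrt(np.mean((img[:, 1:] - img[:, :-1]) ** 2))  # row frequency
    cf = np.sqrt(np.mean((img[1:, :] - img[:-1, :]) ** 2))  # column frequency
    return float(np.sqrt(rf ** 2 + cf ** 2))                # overall detail measure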
© (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Yajun Song, Chen Yang, Zhi Chai, and Jinbao Yang "Visual and infrared image fusion algorithm based on adaptive PCNN", Proc. SPIE 10462, AOPC 2017: Optical Sensing and Imaging Technology and Applications, 1046239 (24 October 2017); https://doi.org/10.1117/12.2285118
CITATIONS
Cited by 1 scholarly publication.
KEYWORDS
Image fusion
Infrared imaging
Neural networks