Paper
14 August 2001 Optical sensed image fusion with dynamic neural networks
Proceedings Volume 4419, 4th Iberoamerican Meeting on Optics and 7th Latin American Meeting on Optics, Lasers, and Their Applications; (2001) https://doi.org/10.1117/12.437178
Event: IV Iberoamerican Meeting of Optics and the VII Latin American Meeting of Optics, Lasers and Their Applications, 2001, Tandil, Argentina
Abstract
A neural network-based technique for improving the quality of image fusion is proposed, as required for remote sensing (RS) imagery. We propose to exploit information about the point spread functions of the corresponding RS imaging systems, combining it with realistic prior knowledge about the properties of the scene contained in the maximum entropy (ME) a priori image model. The fusion task is solved by applying an aggregate regularization method aimed at achieving the best resolution and noise suppression in the overall resulting image. The proposed fusion method allows control of the design parameters that influence the overall restoration performance. Computationally, the fusion method is implemented using a maximum entropy Hopfield-type neural network (MENN) with adjustable parameters. Simulations illustrate the improved performance of the developed MENN-based image fusion method.
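The abstract describes a Hopfield-type network whose dynamics minimize a regularized restoration energy combining a data-fit term (via the system point spread function) with a maximum-entropy prior. The sketch below is an illustrative assumption of that scheme, not the authors' exact MENN: it runs gradient-descent dynamics on E(x) = ||y − Hx||² − λ·entropy(x) for a toy 1-D "image" blurred by a known PSF matrix `H`; the function name `menn_restore` and all parameter values are hypothetical.

```python
import numpy as np

def menn_restore(y, H, lam=0.01, mu=0.05, iters=2000, eps=1e-12):
    """Hopfield-style gradient dynamics minimizing
    E(x) = ||y - H x||^2 + lam * sum(x log x)
    (the second term is the negative entropy penalty).
    Illustrative sketch of an ME-regularized restoration,
    not the paper's exact MENN formulation."""
    x = np.full(H.shape[1], max(y.mean(), 1.0))  # flat positive start
    for _ in range(iters):
        grad_data = 2.0 * H.T @ (H @ x - y)        # data-fit gradient
        grad_ent = lam * (np.log(x + eps) + 1.0)   # entropy-penalty gradient
        x = x - mu * (grad_data + grad_ent)        # network state update
        x = np.clip(x, eps, None)                  # keep intensities positive
    return x

# Toy example: 1-D scene blurred by a simple averaging PSF (assumed values).
H = np.array([[0.6, 0.4, 0.0],
              [0.2, 0.6, 0.2],
              [0.0, 0.4, 0.6]])
x_true = np.array([1.0, 3.0, 2.0])
y = H @ x_true                     # degraded observation
x_hat = menn_restore(y, H)         # restored scene estimate
```

The adjustable parameters `lam` (regularization weight) and `mu` (update step) play the role of the controllable design parameters mentioned in the abstract: larger `lam` biases the solution toward the maximum-entropy (smoother) scene, while `mu` governs the convergence of the network dynamics.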
© (2001) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Yuri V. Shkvarko, Oscar G. Ibarra-Manzano, Rene Jaime-Rivas, Jose A. Andrade-Lucio, Edgar Alvarado-Mendez, R. Rojas-Laguna, Miguel Torres-Cisneros, and J. A. Alvarez-Jaime "Optical sensed image fusion with dynamic neural networks", Proc. SPIE 4419, 4th Iberoamerican Meeting on Optics and 7th Latin American Meeting on Optics, Lasers, and Their Applications, (14 August 2001); https://doi.org/10.1117/12.437178
PROCEEDINGS
4 PAGES