KEYWORDS: Optical coherence tomography, Speckle, Image filtering, Signal to noise ratio, Anisotropic filtering, Nonlinear filtering, Neodymium, Wavelets, Anisotropic diffusion, Human vision and color perception
Speckle reduction in optical coherence tomography (OCT) images plays an important role in further image analysis. Although numerous despeckling methods, such as Kuan's filter, Frost's filter, wavelet-based methods, and anisotropic diffusion methods, have been proposed for despeckling OCT images, these methods generally provide insufficient speckle suppression or limited detail preservation, especially under heavy speckle corruption, because they make insufficient use of image information. Unlike these denoising methods, the nonlocal means (NLM) method exploits nonlocal image self-similarities for denoising, thereby offering a new approach to speckle reduction in OCT images. However, the NLM method determines image self-similarities from the intensities of noisy pixels, which degrades its performance in restoring OCT images.
To address this problem, the Tchebichef moments based nonlocal means (TNLM) method is proposed for speckle suppression. Distinctively, the TNLM method determines the nonlocal self-similarities of OCT images by computing the Euclidean distance between the Tchebichef moments of two image patches centered at the two pixels of interest in the prefiltered image. Owing to the superior feature representation capability of Tchebichef moments, the proposed method can exploit more image structural information for the accurate computation of image self-similarities. Experiments on clinical OCT images indicate that the TNLM method outperforms numerous despeckling methods: it suppresses speckle noise more effectively while preserving image details better in terms of human vision, and it provides a higher signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), equivalent number of looks (ENL), and cross correlation (XCOR).
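The core of the TNLM similarity computation described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the discrete orthonormal polynomial basis is obtained here by QR decomposition of a Vandermonde matrix, which matches the normalized Tchebichef polynomials up to sign (a sign ambiguity that does not affect Euclidean distances, since the same basis is used for both patches). The function names, the moment order, and the smoothing parameter `h` are illustrative choices.

```python
import numpy as np

def tchebichef_basis(n, order):
    # Orthonormal discrete polynomial basis on {0, ..., n-1}, built by QR
    # of a Vandermonde matrix. Its columns equal the normalized Tchebichef
    # polynomials up to sign (assumption: sign does not matter for the
    # Euclidean distance computed below, since both patches share the basis).
    V = np.vander(np.arange(n, dtype=float), N=order + 1, increasing=True)
    Q, _ = np.linalg.qr(V)
    return Q  # shape (n, order + 1)

def tchebichef_moments(patch, order=2):
    # Low-order 2-D Tchebichef moments of a square patch: project the patch
    # onto the separable polynomial basis along rows and columns.
    Q = tchebichef_basis(patch.shape[0], order)
    return (Q.T @ patch @ Q).ravel()  # (order+1)^2 moments

def tnlm_weight(patch_a, patch_b, h=0.1, order=2):
    # NLM weight from the Euclidean distance between the moment vectors of
    # the two patches, rather than between raw noisy intensities.
    d2 = np.sum((tchebichef_moments(patch_a, order)
                 - tchebichef_moments(patch_b, order)) ** 2)
    return float(np.exp(-d2 / h**2))
```

In a full TNLM filter, these weights would be computed between the reference patch and every patch in a nonlocal search window of the prefiltered image, then used to average the corresponding center pixels of the noisy image.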
Image fusion quality assessment plays a critically important role in the field of medical imaging. To evaluate image fusion quality, many assessment methods have been proposed; examples include mutual information (MI), root mean square error (RMSE), and the universal image quality index (UIQI). However, these image fusion assessment methods cannot reflect human visual inspection effectively. To address this problem, we propose in this paper a novel image fusion assessment method that combines the nonsubsampled contourlet transform (NSCT) with regional mutual information. In the proposed method, the source medical images are first decomposed into different levels by the NSCT. The maximum NSCT coefficients of the decomposed directional images at each level are then used to compute the regional mutual information (RMI). Finally, the multi-channel RMI is computed as the weighted sum of the RMI values obtained at the various NSCT levels. The advantage of the proposed method lies in the fact that the NSCT represents image information over multiple directions and multiple scales, and therefore conforms to the multi-channel characteristic of the human visual system, leading to outstanding image assessment performance. Experimental results on CT and MRI images demonstrate that the proposed assessment method outperforms assessment methods such as the MI- and UIQI-based measures in evaluating image fusion quality, and that it provides results consistent with human visual assessment.
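The final two steps above (per-level mutual information, then a weighted sum across levels) can be sketched as below. This is a hedged illustration only: the NSCT decomposition itself is assumed to be supplied by an external implementation, so `levels_src` and `levels_fused` stand for the per-level maximum-coefficient maps the abstract describes; the function names, bin count, and weights are illustrative, not the paper's.

```python
import numpy as np

def mutual_information(x, y, bins=32):
    # Mutual information (in bits) estimated from the joint histogram of
    # two coefficient maps of the same shape.
    joint, _, _ = np.histogram2d(x.ravel(), y.ravel(), bins=bins)
    pxy = joint / joint.sum()                      # joint distribution
    px = pxy.sum(axis=1, keepdims=True)            # marginal of x
    py = pxy.sum(axis=0, keepdims=True)            # marginal of y
    nz = pxy > 0                                   # avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

def multichannel_rmi(levels_src, levels_fused, weights):
    # Weighted sum of per-level MI values. The inputs are assumed to be the
    # maximum NSCT coefficient maps at each decomposition level, obtained
    # from an external NSCT implementation (not shown here).
    return sum(w * mutual_information(s, f)
               for w, (s, f) in zip(weights, zip(levels_src, levels_fused)))
```

In practice, the RMI between the fused image and each source image would be computed this way and combined, so that a fused result sharing more structure with its sources at every scale and direction scores higher.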