Image color is an important property that carries essential information for accurate image analysis and for searching and retrieving images from databases. The colors of an image may be distorted during acquisition, transmission, and display due to a variety of factors, including environmental conditions. Developing an effective, quantitative metric for evaluating the color quality of an image that agrees with human observers is challenging, yet essential for computer vision and autonomous imaging systems. Traditional colorfulness measures are not robust to noise and fail to distinguish different color tones. In this paper, a new no-reference color quality measure, CQE, which combines a colorfulness measure and a Uni-Color Differentiation term, is presented. The CQE is shown to satisfy the established properties of a good measure: it correlates well with human perception, so its evaluations agree with those of human observers; it is robust to noise and distortions, providing consistent and reliable values for a wide range of images; and it is computationally efficient and usable in real-time applications. Experimental results demonstrate the effectiveness of the CQE measure in evaluating the color quality of a variety of test images subjected to different environmental conditions, and show its applicability to fast image retrieval for synthetic patches and natural images. Retrieving images by simply searching on an image's CQE value is fast, easy to implement, and invariant to image orientation.
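As a rough illustration of the kind of colorfulness term such a measure can build on, the sketch below computes the widely used opponent-channel colorfulness of Hasler and Süsstrunk. This is an assumption for illustration only; it is not the paper's CQE, and the CQE's actual colorfulness and Uni-Color Differentiation terms are defined in the paper itself.

```python
import numpy as np

def opponent_colorfulness(rgb):
    """Hasler-Susstrunk style colorfulness for a float RGB image in [0, 255].

    Shown only to illustrate a generic colorfulness term; not the CQE.
    """
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    rg = r - g                       # red-green opponent channel
    yb = 0.5 * (r + g) - b           # yellow-blue opponent channel
    std = np.hypot(rg.std(), yb.std())
    mean = np.hypot(rg.mean(), yb.mean())
    return float(std + 0.3 * mean)   # higher score = more vivid image
```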
Color image quality measures have been used for many computer vision tasks. In practical applications, no-reference (NR) measures are desirable because reference images are not always accessible; however, only limited success has been achieved, and most existing NR quality assessments require that the type of image distortion be known a priori. In this paper, three NR color image attributes, colorfulness, sharpness, and contrast, are quantified by new metrics. Using these metrics, a new Color Quality Measure (CQM), based on a linear combination of the three attributes, is presented. We evaluated several state-of-the-art no-reference measures for comparison. Experimental results demonstrate that the CQM correlates well with evaluations obtained from human observers and that it operates in real time. The results also show that the presented CQM outperforms previous work in ranking image quality among images with the same or different content. Finally, the experimental results demonstrate that the performance of the CQM is independent of distortion type.
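As a hedged sketch of the structure described here, the code below forms a weighted sum of three attribute scores. The sharpness and contrast proxies and the equal weights are illustrative placeholders, not the paper's metrics, and the colorfulness term reuses the opponent-channel sketch above.

```python
import numpy as np

def sharpness_score(gray):
    # Placeholder sharpness proxy: mean gradient magnitude (not the paper's metric).
    gy, gx = np.gradient(gray.astype(float))
    return float(np.hypot(gx, gy).mean())

def contrast_score(gray):
    # Placeholder contrast proxy: global RMS contrast (not the paper's metric).
    return float(gray.astype(float).std())

def cqm_sketch(rgb, weights=(1.0, 1.0, 1.0)):
    """Illustrative linear combination of colorfulness, sharpness, and contrast."""
    gray = rgb.astype(float).mean(axis=-1)
    scores = np.array([
        opponent_colorfulness(rgb),   # colorfulness sketch from the first abstract
        sharpness_score(gray),
        contrast_score(gray),
    ])
    return float(np.dot(np.asarray(weights), scores))
```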
In this paper, we introduce a new spatial-domain color contrast enhancement algorithm based on the three-dimensional alpha-weighted quadratic filter (3DAWQF). The goal of this work is to exploit the characteristics of the nonlinear filter to enhance image contrast while recovering color information. For images with poor illumination, a modified Naka-Rushton function is proposed to adjust underexposed or overexposed intensities. We also present a new image contrast measure, the Root Mean Enhancement (RME), which models Root Mean Square (RMS) contrast in image sub-blocks, and a color RME contrast measure, CRME, based on the RME contrast of RGB color sub-cubes. The new measures help select optimal operating parameters for enhancement algorithms, improving the practicality of quadratic filters in image processing applications. We demonstrate the effectiveness of the proposed methods on a variety of images. Experimental results show that the 3DAWQF enhances image contrast and color efficiently and effectively. Comparisons with existing state-of-the-art algorithms are also presented.
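As an illustration only, the sketch below follows the abstract's description of RME as modeling RMS contrast over image sub-blocks: it computes the RMS contrast of each block and aggregates the blocks with a root-mean. The block size, the aggregation, and the classic Naka-Rushton form I / (I + sigma) are assumptions; the paper's actual RME, CRME, and modified Naka-Rushton definitions are not reproduced here.

```python
import numpy as np

def blockwise_rms_contrast(gray, block=8):
    """Root-mean aggregate of per-block RMS contrast (illustrative, not the paper's RME)."""
    g = gray.astype(float)
    h = (g.shape[0] // block) * block        # crop to whole blocks
    w = (g.shape[1] // block) * block
    tiles = (g[:h, :w]
             .reshape(h // block, block, w // block, block)
             .transpose(0, 2, 1, 3)
             .reshape(-1, block * block))    # one row per sub-block
    rms = tiles.std(axis=1)                  # RMS contrast of each block
    return float(np.sqrt((rms ** 2).mean())) # root of the mean squared block contrast

def naka_rushton(intensity, sigma):
    """Classic Naka-Rushton response I / (I + sigma); the paper uses a modified form."""
    i = np.asarray(intensity, dtype=float)
    return i / (i + sigma)
```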