A correct emittance value is one of the necessary inputs for accurate radiometric temperature measurement. Modern infrared cameras and radiometric software programs typically have default emittance tables built in, allowing the operator simply to select the appropriate material and its corresponding emittance value. Unfortunately, many of these values, while perhaps accurately obtained in a laboratory setting, are typically not appropriate for use in a real-world field environment. The reasons for this are many and include: the deposition of dust, dirt, and grease; the unknown thickness of oxide layers; the use of coatings invisible to the human eye; the unknown nature of the exact material or alloy; an incorrect value in the table itself due to wavelength or test method; and the effects of surface roughness, geometry, cavity radiation, spatial resolution, viewing angle, and temperature. In many situations an incorrectly selected emittance value results in two miscalculations that can magnify the temperature measurement error significantly: the calculation of the surface reflectance value, which in turn determines the amount of background signal to be subtracted from the radiance signal; and the correction of the remaining signal, attributable to radiant spectral exitance, to the blackbody-equivalent signal determined by the camera calibration. This paper will discuss these issues in depth, provide practical considerations for the field use of emittance, and present a simple method to determine measurement errors due to the unknown variance of emittance.
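
The error mechanism described above can be illustrated with a minimal sketch, assuming a simplified single-wavelength radiometric model rather than the paper's own method. The effective wavelength, target temperature, background temperature, and emittance values below are illustrative assumptions, not values from the paper; the sketch only shows how an incorrect emittance setting distorts both the background subtraction and the blackbody-equivalent scaling, and hence the reported temperature.

# Minimal sketch (illustrative, not the paper's method): estimate the temperature
# error caused by an incorrect emittance setting, using a single effective
# wavelength. All numeric values below are assumed for demonstration only.
import math

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance at one wavelength (W / m^2 / sr / m)."""
    c1 = 2.0 * H * C**2 / wavelength_m**5
    c2 = H * C / (wavelength_m * K * temp_k)
    return c1 / math.expm1(c2)

def inverse_planck(wavelength_m, radiance):
    """Temperature (K) whose blackbody radiance equals `radiance` at this wavelength."""
    c1 = 2.0 * H * C**2 / wavelength_m**5
    c2 = H * C / (wavelength_m * K)
    return c2 / math.log(1.0 + c1 / radiance)

def reported_temperature(t_obj_k, t_background_k, true_emittance,
                         set_emittance, wavelength_m=10e-6):
    """Temperature the camera would report when `set_emittance` is entered
    but the surface actually has `true_emittance`."""
    # Signal reaching the camera: emitted term plus reflected background term.
    measured = (true_emittance * planck_radiance(wavelength_m, t_obj_k)
                + (1.0 - true_emittance) * planck_radiance(wavelength_m, t_background_k))
    # Camera's compensation: subtract the reflected background it *thinks* is
    # present, then scale the remainder up to a blackbody-equivalent signal.
    corrected = (measured - (1.0 - set_emittance)
                 * planck_radiance(wavelength_m, t_background_k)) / set_emittance
    return inverse_planck(wavelength_m, corrected)

if __name__ == "__main__":
    t_true, t_bg = 373.15, 293.15           # assumed 100 C target, 20 C background
    for eps_set in (0.80, 0.90, 0.95):      # operator-entered emittance settings
        t_est = reported_temperature(t_true, t_bg, true_emittance=0.90,
                                     set_emittance=eps_set)
        print(f"set e={eps_set:.2f}: reported {t_est - 273.15:6.1f} C "
              f"(error {t_est - t_true:+5.1f} C)")

Sweeping the entered emittance over a plausible range of uncertainty, as in the loop above, gives a simple bracket on the temperature error attributable to an unknown emittance.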