I often joke that were it not for emissivity, you’d not need training to use an imager! In our courses we find emissivity is the #1 most confusing issue for people, whether they are engineers, home inspectors or new thermographers. In fact, we all usually have a good laugh that many cannot even pronounce the word. So if you also feel this way about the term or the concept, you are in good company!
Emissivity quantifies how efficiently a surface radiates energy in a defined waveband at a given temperature. In reality, any surface above absolute zero radiates some energy (more than 0%), and no surface radiates perfectly (100%).
When it comes to infrared radiation, most shiny metals emit inefficiently. This means they don’t tell us the thermal truth about themselves! Most non-metal surfaces—paint, paper and human skin, for example—are much more efficient emitters, so it is easy to make a direct connection between what they radiate and their surface temperature. Remember the hot frying pan?
Emissivity values can be determined or measured by engineers. Be aware, however, they are very specific to the material type, surface condition and, especially for metals, the temperature of the material. We can use the values not only to help us understand how a surface might behave but also, in some cases, to correct our radiometric measurements.
Thinking of these values as percentages may help. Human skin, with a value of 0.98, is 98% efficient at emitting thermal radiation while shiny aluminum, with a value of approximately 0.10, emits only 10% of the energy. When we input these values into our imagers, they automatically correct the raw data that had assumed 100% radiation was emitted based on the surface temperature.
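To see what that correction looks like in principle, here is a minimal sketch in Python. It assumes a simplified total-radiation (Stefan–Boltzmann) model and ignores reflected background radiation and atmospheric effects, so it illustrates the idea rather than reproducing what any particular imager actually computes; the function names are mine, not from any imager’s software.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def apparent_temperature_k(true_temp_k: float, emissivity: float) -> float:
    """Temperature an uncorrected imager (assuming 100% emission)
    would report for a surface at true_temp_k."""
    radiated = emissivity * SIGMA * true_temp_k ** 4  # what the surface actually emits
    return (radiated / SIGMA) ** 0.25                 # back out T as if emissivity were 1.0

def corrected_temperature_k(apparent_temp_k: float, emissivity: float) -> float:
    """Apply the emissivity correction to recover the true surface temperature."""
    return apparent_temp_k / emissivity ** 0.25

# Human skin at 35 C (308.15 K), emissivity 0.98: reads only ~1.5 K low.
skin_apparent = apparent_temperature_k(308.15, 0.98)

# Shiny aluminum at the same 35 C, emissivity 0.10: reads far colder than reality.
aluminum_apparent = apparent_temperature_k(308.15, 0.10)
```

Running the correction on the apparent reading with the right emissivity value recovers the true temperature, which is exactly what your imager does when you input the value.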
As you can imagine, measurements that require extreme corrections, as bare metals do, are often unreliable. That is why we strongly recommend making measurements only on surfaces with values greater than approximately 0.6. On metals, the simplest way is to add a high-emissivity “target” of paint or electrical tape.
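The reason low-emissivity measurements are shaky can be shown numerically. Using the same simplified Stefan–Boltzmann model as before (an illustrative assumption, not how a specific imager is implemented), a small uncertainty in the emissivity value you enter barely moves a high-emissivity reading, but swings a low-emissivity reading wildly:

```python
def corrected_temp_k(apparent_k: float, emissivity: float) -> float:
    """Recover true temperature from an uncorrected (emissivity = 1.0) reading,
    under a simple total-radiation model."""
    return apparent_k / emissivity ** 0.25

apparent = 300.0  # K, the raw reading with emissivity set to 1.0

# High-emissivity surface: a 0.05 uncertainty in emissivity shifts the
# corrected temperature by only a few kelvin.
hi_error = corrected_temp_k(apparent, 0.90) - corrected_temp_k(apparent, 0.95)

# Low-emissivity (bare metal) surface: the same 0.05 uncertainty shifts
# the corrected temperature by tens of kelvin.
lo_error = corrected_temp_k(apparent, 0.10) - corrected_temp_k(apparent, 0.15)
```

The low-emissivity error is roughly an order of magnitude larger, which is why the 0.6 rule of thumb and high-emissivity targets matter.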
When you input a correction, the change applies to the entire image. Because of this, a separate correction must be made for each different point we want to measure. While some models allow these corrections to be made in the imager itself, the good news is they can also be made in the software on a stored image. All changes can be undone or “tweaked” to better match reality.
I’d encourage you to practice making measurements on various surfaces that are at least 10 °C (18 °F) warmer or colder than the surroundings—windows, coffee cups, skin, etc. Compare what you measure on the surface with what you measure on a high-emissivity target (use electrical tape with an emissivity correction of 0.94). Notice, too, what happens as you adjust the emissivity correction for these surfaces—the image doesn’t change, but the corrected temperature values do! Next week we’ll talk about what is being reflected from the surface and how to correct for that.
Don’t expect everything to become crystal clear immediately, but you should quickly find out emissivity is not as confusing as it may have seemed to be. Here are two good guidelines:
- Radiometric temperatures of bright metal surfaces will be unreliable. Use high-emissivity targets whenever possible.
- Radiometric temperatures of nearly all other surfaces in nearly all instances will be quite reliable.
John Snell—The Snell Group, a Fluke Thermal Imaging Blog content partner