Thermal Technology Glossary
Automatic gain control (AGC) is a control algorithm that automatically adjusts the gain and offset to deliver a visually pleasing and stable image that is suitable for video analytics. By deploying different AGC techniques, both rapid and slow scene changes can be handled so that the resulting image is optimised with respect to brightness, contrast and other image-quality properties. A rapid scene change, that is, a rapid change in the incoming signal levels, could, for a thermal camera, be when something cold or hot enters the scene, for instance a hot truck engine. The corresponding scene change for a visual camera could be when the sun disappears behind a cloud.
AGC also controls whether the output mapping from the sensor's 14-bit signal level to the 8-bit image is done linearly or by using a histogram-equalisation curve. Histogram equalisation redistributes the incoming signal levels, resulting in better image contrast. For example, in a scene with a large flat background and one small but very warm object, a linear curve would waste output levels on signal levels that lie between the object and the background. Histogram equalisation ensures that the output levels are spent only on the background and the object, and not on the empty levels in between.
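The effect of an equalisation curve can be sketched as follows, assuming a simple global histogram equalisation; real AGC pipelines typically add plateau limits and temporal damping, which are omitted here, and the frame sizes and signal levels are hypothetical:

```python
import numpy as np

def equalize_14bit_to_8bit(frame):
    """Map a 14-bit thermal frame to an 8-bit image with histogram
    equalisation: output levels follow the cumulative histogram, so
    empty input levels receive no output codes."""
    hist = np.bincount(frame.ravel(), minlength=2**14)
    cdf = np.cumsum(hist).astype(np.float64)
    cdf /= cdf[-1]                                # normalise to [0, 1]
    lut = np.round(cdf * 255).astype(np.uint8)    # 14-bit level -> 8-bit level
    return lut[frame]

# Hypothetical scene: a large flat-ish background and a small hot object,
# separated by a wide band of unused signal levels.
rng = np.random.default_rng(0)
frame = (2000 + rng.integers(0, 64, size=(64, 64))).astype(np.uint16)
frame[30:34, 30:34] = (12000 + rng.integers(0, 64, size=(4, 4))).astype(np.uint16)
out = equalize_14bit_to_8bit(frame)
# The 256 output codes are spent on the two populated ranges; the empty
# levels in between are skipped entirely.
```

With a linear mapping, by contrast, the background's internal variation of 64 signal levels would be squeezed into one or two output codes.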
A primary purpose of a thermal camera is to detect intruders at long distances. When specifying a camera's detection range, Johnson's criteria are used.
During the 1950s, the U.S. military scientist John Johnson developed a method for predicting the performance of sensor systems. He measured the ability of observers to identify scale-model targets under various conditions and came up with criteria, stated in pixels, for the minimum required resolution. With that resolution, there is a 50% probability that an observer can distinguish an object at the specified level: detection, recognition or identification.
For a thermal sensor, the temperature difference between the object and its background needs to be at least 2°C (3.6°F). Objects to distinguish are typically a person, defined with a critical width of 0.75m (2.5ft), or a vehicle, defined with a critical length of 2.3m (7.6ft).
Johnson's criteria were developed under the assumption that visible information is processed by a human observer. If information is instead processed by an application algorithm, there will be specific requirements about the number of pixels needed on the target for reliable operation. All video analytics software algorithms need to work with a certain number of pixels, but the exact number may vary. Even if a human observer can detect the object, the application algorithm often needs a larger number of pixels at a given detection range to work properly.
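The geometry behind such range calculations can be sketched as below. The criteria values, sensor resolution and field of view are illustrative assumptions rather than figures from this document, and the exact Johnson numbers vary between sources:

```python
import math

# One common rendering of Johnson's criteria, in pixels across the
# target's critical dimension (exact figures vary between sources):
CRITERIA_PIXELS = {"detection": 2.0, "recognition": 8.0, "identification": 12.8}

def pixels_on_target(size_m, distance_m, sensor_px, hfov_deg):
    """Pixels subtended by a target of size_m at distance_m, for a sensor
    with sensor_px horizontal pixels and a horizontal field of view hfov_deg."""
    scene_width_m = 2.0 * distance_m * math.tan(math.radians(hfov_deg) / 2.0)
    return size_m * sensor_px / scene_width_m

def max_range_m(size_m, required_px, sensor_px, hfov_deg):
    """Longest distance at which the target still spans required_px pixels,
    i.e. the 50%-probability range for that criterion."""
    return size_m * sensor_px / (
        required_px * 2.0 * math.tan(math.radians(hfov_deg) / 2.0))

# Hypothetical example: a person (critical width 0.75 m) viewed by a
# 640-pixel-wide sensor with a 10-degree horizontal field of view:
detection_range = max_range_m(0.75, CRITERIA_PIXELS["detection"], 640, 10)
```

An analytics algorithm that needs, say, four times as many pixels on target would reach only a quarter of this range, which is the practical consequence of the paragraph above.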
Outside the visible range of light, we find infrared (IR) and ultraviolet (UV) light, which cannot be detected by the human eye. Conventional, or visual, camera sensors can detect some near-infrared (NIR) light, with wavelengths from 700-1,000 nanometres (nm). If such light is not filtered out, it can distort image colour. Therefore, a visual camera is equipped with a filter, a piece of optical glass placed between the lens and the image sensor. This IR-blocking filter is commonly called an IR-cut filter; it filters out NIR light and delivers the same colour interpretations that the human eye produces.
In some visual cameras, the IR-cut filter can be removed. This allows the sensor to use any available NIR light to produce high-quality, grayscale images in low-light or dark scenes. Such cameras are often marketed as day-and-night cameras or IR-sensitive cameras, but they do not deliver infrared images. Infrared images instead require true infrared cameras that are specialised in detecting long-wave infrared (LWIR) light (heat radiation), which radiates from both living and non-living objects. In infrared images, warmer objects (such as people and animals) stand out from typically cooler backgrounds. True infrared cameras are called thermal cameras.
Like all cameras, a thermal or temperature alarm camera collects electromagnetic radiation and transforms it into an image. But while a visual camera works in the range of visible light (approximately 400-700 nm, or 0.4-0.7 micrometres (μm)), a thermal camera is designed to detect radiation with longer wavelengths, typically in either the middle-wave IR (MWIR) domain (approximately 3-5 μm) or in the LWIR domain (approximately 8-14 μm).
All objects with a temperature above absolute zero (0 Kelvin, -273°C or -459°F) emit infrared radiation. Even cold objects, such as ice, emit infrared radiation if their temperature is above -273°C. The hotter an object is, the more thermal radiation it will emit. The greater the temperature difference between an object and its surroundings, the clearer the thermal images will be. However, the contrast of a thermal image does not only depend on the temperature; it also depends on the emissivity of the object.
The emissivity of a material is a measure of its ability to absorb and emit radiant thermal energy. Emissivity can vary between 0 and 1, and is measured as the ratio of the thermal radiation of the material to the thermal radiation of an ideal 'black body' (which absorbs all incident radiation and has an emissivity of 1).
The emissivity is highly dependent on the surface properties. Most materials, such as wood, concrete, stone, human skin, and vegetation, have a high emissivity (0.8 or higher) in the LWIR region. By contrast, most metals have a low emissivity (0.6 or lower) depending on their surface finish; the shinier the surface, the lower the emissivity.
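As a rough quantitative illustration, the power radiated by a grey body follows the Stefan-Boltzmann law scaled by emissivity. The temperatures and emissivity values below are illustrative assumptions:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiant_exitance(temp_c, emissivity):
    """Thermal power emitted per unit area (W/m^2) of a grey body:
    M = emissivity * sigma * T^4."""
    t_kelvin = temp_c + 273.15
    return emissivity * SIGMA * t_kelvin ** 4

# Illustrative values: skin-like high emissivity versus a shiny metal
# surface at the same 33 C temperature:
skin = radiant_exitance(33.0, 0.98)
metal = radiant_exitance(33.0, 0.10)
# Equal temperatures, but the metal emits far less and therefore appears
# much darker (colder) in a thermal image.
```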
Thermal radiation that is not absorbed by a material will be reflected. The higher the reflected energy, the higher the risk of misinterpreted thermography measurement results. To avoid erroneous readings, it is important to select the camera's measurement angle so that reflections are minimised.
If a material generally behaves like a mirror in the visual spectrum, it usually behaves like a mirror in the LWIR region as well. Such a material may be difficult to monitor, as the temperature reading may be influenced by other objects reflected in the monitored object.
Emissivity is often strongly dependent on the viewing angle of the surface. A rule of thumb is to avoid large glancing angles, since emissivity typically decreases as the glancing angle increases. It is also recommended never to set up the camera completely perpendicular to the surface, in order to avoid specular reflections.
An exposure zone in a thermal image is a defined region of interest. The camera will optimise the image in the set exposure zone only, even when that means that other areas will not be visible at all.
Defining the exposure zone correctly in the scene is crucial and has a major effect on the camera's detection performance. Even when the image seems to look good, there is a risk that histogram levels are wasted on uninteresting objects.
A lens (or lens assembly) performs several functions. They include:
- Defining the field of view - how much of a scene should be captured, and at which level of detail
- Controlling the amount of light passing through to the image sensor, so that an image is correctly exposed
Focusing is done either by adjusting elements within the lens assembly or by adjusting the distance between the lens assembly and the image sensor.
Several material properties of a camera system are affected by the thermal conditions of the environment. A change in temperature may therefore cause an optical system to defocus.
Since security cameras are usually deployed in environments with large temperature fluctuations, it is important to use optical systems that are not sensitive to thermal changes, and this is especially critical in the infrared wavelength region. Passive, athermalised optical-system design is therefore a necessity for thermal camera security applications. Depending on the complexity of the optical system, many passive athermalisation designs are possible. One example is to match the material of the lens with the material of the optical housing.
Noise equivalent temperature difference (NETD) defines the noise threshold, or the minimum temperature difference that is required for an object to be discerned from the noise. NETD is the most common measure of classifying the performance of a thermal sensor.
The smaller the NETD, the better the sensor. With a NETD of, for example, 50mK (millikelvin), a sensor can detect only temperature differences larger than 50mK, while smaller differences will disappear in the noise.
However, comparing specified NETD values can be problematic, since they may have been calculated using different methods or under different conditions, for example in different ambient temperatures or with different F-numbers. Specified NETD values also may not include spatial noise. This means that the NETD can be low even though the image is quite noisy due to fixed and quasi-fixed spatial noise.
Actual camera performance is affected by many factors other than the NETD value of its sensor, and the best camera does not necessarily have the smallest NETD. For example, NETD does not take into consideration how well in focus a camera is; a camera out of focus can still have a good NETD value. Thus, one thermal camera should not be chosen over another based only on a comparison of their specified NETD values.
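A minimal simulation of this noise threshold, assuming purely Gaussian temporal noise with a standard deviation equal to the NETD (a simplification that ignores the spatial noise discussed above; all values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(delta_t_mk, netd_mk, shape=(128, 128)):
    """Flat background (0 mK) with a small patch delta_t_mk warmer, imaged
    by a sensor whose temporal noise sigma equals its NETD (in mK)."""
    scene = np.zeros(shape)
    scene[60:68, 60:68] = delta_t_mk
    return scene + rng.normal(0.0, netd_mk, shape)

# A 20 mK step is swamped by 50 mK noise, while a 200 mK step stands out:
faint = simulate(delta_t_mk=20, netd_mk=50)
clear = simulate(delta_t_mk=200, netd_mk=50)
```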
Non-uniformity correction (NUC) is a smoothing algorithm that compensates for unwanted response variations across the sensor. Fabrication variations are usually large in microbolometer sensors, causing different pixels to represent the same temperature information differently. Changes in temperature also induce noise, which manifests as a spatial variation over the sensor in both offset and responsivity. In addition, there are differences due to the optical imaging, for example, different fields of view of the pixels. NUC is used to correct all these differences so that the outgoing signal, corresponding to a homogeneous incoming signal, is as uniform as possible.
Some non-uniformities can be corrected with the help of a movable mechanical shutter, placed between sensor and optics. Depending on the characteristics of the camera system, this shutter is conditionally moved to block the entire field of view, before an image is taken. That image is then included in the NUC algorithm for removing induced noise. The conditions for when a shutter image should be taken vary between algorithms and between camera systems, but are often controlled by an internal temperature sensor or a timer. This image correction is always done at runtime.
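One common NUC scheme, sketched below, is a per-pixel two-point (gain and offset) correction, with the shutter providing a one-point offset update at runtime. This is a generic illustration, not the exact algorithm of any particular camera; the function names and calibration temperatures are hypothetical:

```python
import numpy as np

def calibrate(cold_frames, hot_frames, t_cold, t_hot):
    """Per-pixel gain and offset from two uniform (flat-field) references,
    e.g. blackbody views at t_cold and t_hot. Hypothetical factory step."""
    cold = cold_frames.mean(axis=0)      # average away temporal noise
    hot = hot_frames.mean(axis=0)
    gain = (t_hot - t_cold) / (hot - cold)
    offset = t_cold - gain * cold
    return gain, offset

def apply_nuc(raw, gain, offset):
    """Uniformity-corrected output: corrected = gain * raw + offset."""
    return gain * raw + offset

def shutter_offset_update(gain, offset, shutter_frame, t_shutter):
    """One-point runtime update: a closed shutter presents a uniform scene,
    so the offset can be re-solved while keeping the factory gain."""
    return t_shutter - gain * shutter_frame
```

After calibration, a raw frame of a uniform scene should come out flat; any residual pattern is the fixed and quasi-fixed spatial noise mentioned under NETD.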
Pixel pitch is the distance between two adjacent pixels in the sensor, pixel centre to pixel centre. A smaller pixel pitch generally provides greater resolution. For a constant resolution, however, a decreased pixel pitch means decreased sensor size, which also means that smaller optics can be used. This is especially important for thermal cameras, since their most common lens material, germanium, is very expensive. The drawback of smaller pixel pitch is that each pixel is smaller and thus receives less energy. Ultimately, however, the performance of a sensor depends more on the pixel design than the pixel size.
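The trade-off between pitch and sensor size (and thus optics size) is simple arithmetic; the resolution and pitch values below are illustrative:

```python
def sensor_width_mm(h_pixels, pitch_um):
    """Horizontal sensor size implied by resolution and pixel pitch."""
    return h_pixels * pitch_um / 1000.0

# Same 640-pixel-wide resolution at two hypothetical pitches: the smaller
# pitch shrinks the sensor, and with it the (expensive germanium) optics.
w17 = sensor_width_mm(640, 17)   # 10.88 mm
w12 = sensor_width_mm(640, 12)   # 7.68 mm
```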
There are two main types of thermal sensors: cooled and uncooled.
Cooled sensors are high-end, expensive systems often found in military applications. While their performance is vastly superior to that of uncooled sensors, the price difference is so large that the uncooled sensor is, in practice, the only viable option for the non-military surveillance market. The cooler also needs regular maintenance to sustain its performance over time, which further increases the total cost of ownership for thermal cameras with cooled sensors.
The most common type of uncooled sensor is the microbolometer.
A microbolometer is basically a tiny resistor that changes resistance with temperature. By letting the incoming radiation heat up the microbolometer and then reading out the change in resistance compared to a "blind" microbolometer, a value for the incoming infrared radiation can be derived. Each microbolometer constitutes a pixel, and an image is created using an array of microbolometers.
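The readout principle can be sketched with a linearised resistance model, R = R0 * (1 + TCR * ΔT); the resistance and temperature-coefficient values are illustrative assumptions, not figures from any specific sensor:

```python
def resistance(r0, tcr, delta_t):
    """Linearised bolometer model: R = R0 * (1 + TCR * delta_t), adequate
    for the small temperature excursions seen in operation."""
    return r0 * (1.0 + tcr * delta_t)

# Illustrative values: a 100 kOhm pixel with a TCR of about -2%/K (a
# typical order of magnitude for VOx), heated 50 mK by incoming radiation:
r_active = resistance(100e3, -0.02, 0.05)   # pixel exposed to the scene
r_blind = resistance(100e3, -0.02, 0.0)     # shielded reference pixel
signal = r_blind - r_active                 # resistance difference to read out
```

Differencing against the blind pixel cancels the common drift from the sensor's own temperature, leaving only the scene-induced change.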
Microbolometers can be made from various materials, but vanadium oxide (VOx) and amorphous silicon (a-Si) are the most commonly used in commercial surveillance.
Both materials have their advantages, and the important sensor parameters are primarily determined not by the sensor material, but rather by the microbolometer manufacturer. Also, the final video output quality of an uncooled thermal camera system is more dependent on the quality of the image processing than on the type of sensor material. Therefore, a system should not be chosen solely because of its sensor material.
Most microbolometer sensor suppliers design their detectors to be sun safe. This means that the sensor is protected from solar radiation by means of shortpass filtering, anti-reflective coating, or electronic design. Still, when a thermal bolometer camera is used to image the sun, there may be temporary visible effects, such as a 'ghost' image of the sun.
The temporary effects disappear after a period of time, the length of which depends on the exposure time and a few thermal camera system properties, mainly lens focal length and shutter correction frequency.
Thermography, or thermal imaging, is a method where infrared radiation is converted to, and presented as, an image. Thermography is a very powerful tool for detecting heat differences. If the thermal camera is calibrated, the thermal image can provide information about an object's surface temperature. When measuring the temperature of a specific surface, the camera is influenced by many other parameters such as the surface's absorption, emission, reflection, transmission, and the heat radiated by surrounding objects.
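A simplified emissivity-corrected temperature calculation, assuming a grey body and ignoring atmospheric transmission and the other influences listed above, could look like this (the Stefan-Boltzmann constant cancels out of the balance, so it does not appear):

```python
def object_temp_c(apparent_temp_c, emissivity, reflected_temp_c):
    """Emissivity-corrected surface temperature, from the grey-body balance
    T_app^4 = e * T_obj^4 + (1 - e) * T_refl^4 (atmosphere ignored)."""
    t_app = apparent_temp_c + 273.15
    t_refl = reflected_temp_c + 273.15
    t_obj4 = (t_app ** 4 - (1.0 - emissivity) * t_refl ** 4) / emissivity
    return t_obj4 ** 0.25 - 273.15

# Illustrative reading: an apparent 50 C on a low-emissivity surface in a
# 20 C environment corresponds to a considerably hotter true surface.
true_temp = object_temp_c(50.0, 0.5, 20.0)
```

The lower the emissivity, the more the reading is dominated by reflected radiation, which is why shiny metals are so hard to measure.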