Three-dimensional signal decomposition of infrared image sequences and large-scale non-uniformity analysis
The three-dimensional noise model is a methodology for analysing the noise of a thermal imaging sensor, such as an infrared (IR) camera. It allows us to decompose a noisy signal into components and to quantify properties such as noise equivalent temperature difference (NETD), temporal noise, rain, streaks, and various types of fixed pattern noise. As part of this analysis, it is necessary to identify trends in order to split the data into signal and noise. In this paper we discuss methods to perform this split. We then show that not only the noise but also the trends contain interesting information, and that they can be used to quantify large-scale non-uniformities in calibrated IR images. We apply this analysis to investigate three different effects that may appear in recorded data: how does the uniformity of the background change when we vary the temperature, the distance, or the lens focus? We have performed a series of laboratory measurements on blackbodies to investigate these effects. We find that large-scale non-uniformity may be present even in calibrated images, with magnitudes up to ΔT ≈ 0.6 K.
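The decomposition underlying the three-dimensional noise model can be sketched as follows. This is a minimal NumPy illustration of the standard directional-averaging construction (the paper's own trend-removal methods are not reproduced here); the synthetic data cube and all variable names are assumptions for the example. Averaging the (t, v, h) image cube over subsets of its axes yields a global mean plus seven zero-mean noise components, including temporal noise (n_t), row/column streaks (n_v, n_h), rain (n_tv, n_th), and fixed pattern noise (n_vh).

```python
import numpy as np

# Synthetic calibrated IR sequence: (time, vertical, horizontal), values in kelvin.
rng = np.random.default_rng(0)
cube = rng.normal(25.0, 0.1, size=(50, 64, 64))

S = cube.mean()                                  # global mean (the "signal" level)
n_t = cube.mean(axis=(1, 2)) - S                 # frame-to-frame temporal noise
n_v = cube.mean(axis=(0, 2)) - S                 # row noise ("horizontal streaks")
n_h = cube.mean(axis=(0, 1)) - S                 # column noise ("vertical streaks")
n_tv = cube.mean(axis=2) - S - n_t[:, None] - n_v[None, :]   # temporal row noise ("rain")
n_th = cube.mean(axis=1) - S - n_t[:, None] - n_h[None, :]   # temporal column noise
n_vh = cube.mean(axis=0) - S - n_v[:, None] - n_h[None, :]   # fixed pattern noise
n_tvh = (cube - S                                            # random spatio-temporal residual
         - n_t[:, None, None] - n_v[None, :, None] - n_h[None, None, :]
         - n_tv[:, :, None] - n_th[:, None, :] - n_vh[None, :, :])

# The decomposition is exact by construction: summing S and all seven
# components reconstructs the original cube, and each component is zero-mean.
recon = (S + n_t[:, None, None] + n_v[None, :, None] + n_h[None, None, :]
         + n_tv[:, :, None] + n_th[:, None, :] + n_vh[None, :, :] + n_tvh)
assert np.allclose(recon, cube)
```

In practice the cube would first have its large-scale trends removed (the split discussed in the paper); the component standard deviations then characterise each noise type, and the removed trends themselves carry the large-scale non-uniformity information.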
Thomassen, Jan Brede; van Rheenen, Arthur Dirk. "Three-dimensional signal decomposition of infrared image sequences and large-scale non-uniformity analysis." Proceedings of SPIE, the International Society for Optical Engineering, 2020.