Robust Perceptual Night Vision in Thermal Colorization

Feras Almasri, Olivier Debeir

Abstract

Transforming a thermal infrared image into a perceptually robust colour visual image is an ill-posed problem due to the differences between the two spectral domains and in how objects are represented in each. An object may appear in one spectrum but not in the other, and a single thermal signature may correspond to several different colours in the visual representation. This makes a direct mapping from thermal to visual images impossible and calls for a solution that preserves the texture captured in the thermal spectrum while predicting plausible colours for certain objects. In this work, a deep learning method is proposed that maps the thermal signature from the thermal image’s spectrum to a visual representation in their low-frequency space. A pan-sharpening method then merges the predicted low-frequency representation with the high-frequency representation extracted from the thermal image. The proposed model generates colour values consistent with the visual ground truth when an object’s appearance varies little, and averaged grey values otherwise. Compared with the existing state of the art, the proposed method produces robust perceptual night-vision images that better preserve object appearance and image context.
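The fusion step described above can be sketched in NumPy. This is a minimal illustration, not the paper's implementation: it assumes the network's output is a low-frequency RGB prediction (`pred_lowfreq_rgb`), extracts the high-frequency residual of the thermal image with a simple box blur (the paper's pan-sharpening method may differ), and injects that detail into every colour channel.

```python
import numpy as np

def box_blur(img, k=7):
    """Low-pass a 2-D image with a k x k mean filter (edge-padded)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def pansharpen(pred_lowfreq_rgb, thermal, k=7):
    """Merge predicted low-frequency colour with thermal high-frequency detail.

    pred_lowfreq_rgb: (H, W, 3) float array in [0, 1], the model's prediction.
    thermal:          (H, W)    float array in [0, 1], the source thermal image.
    """
    # High-frequency residual: thermal image minus its low-pass version.
    high_freq = thermal - box_blur(thermal, k)
    # Inject the thermal detail into all three colour channels, then clip.
    fused = pred_lowfreq_rgb + high_freq[..., None]
    return np.clip(fused, 0.0, 1.0)
```

The design point is that colour comes only from the learned low-frequency prediction, while edges and texture come only from the thermal input, so fine structure survives even where the colour mapping is ambiguous.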
