Review

A Review of Modern Thermal Imaging Sensor Technology and Applications for Autonomous Aerial Navigation

1 School of Engineering, University of South Australia, Mawson Lakes 5095, Australia
2 Aerospace Division, Defence Science and Technology Group, Edinburgh 5111, Australia
3 Joint and Operations Analysis Division, Defence Science and Technology Group, Melbourne 3000, Australia
* Author to whom correspondence should be addressed.
Submission received: 15 July 2021 / Revised: 30 September 2021 / Accepted: 9 October 2021 / Published: 19 October 2021
(This article belongs to the Special Issue Formal Verification of Imaging Algorithms for Autonomous System)

Abstract: Limited navigation capabilities of many current robots and UAVs restrict their applications in GPS-denied areas. Large aircraft with complex navigation systems rely on a variety of sensors, including radio frequency aids and high-performance inertial systems, rendering them somewhat resistant to GPS denial. The rapid development of computer vision has seen cameras incorporated into small drones. Vision-based systems, consisting of one or more cameras, can arguably satisfy both the size and weight constraints faced by UAVs. A new generation of thermal sensors is lighter, smaller and more widely available than before. Thermal sensors offer a means of navigation in difficult environments, including low light, dust and smoke. The purpose of this paper is to present a comprehensive literature review of thermal sensors integrated into navigation systems. Furthermore, the physics and characteristics of thermal sensors are presented to provide insight into the challenges of integrating thermal sensors in place of conventional visual-spectrum sensors.

1. Introduction

Research on unmanned aerial vehicles (UAVs) has grown rapidly in the past decade. Initially developed for military purposes [1], UAVs are now widely used in many applications including industrial inspection [2,3], remote sensing for mapping and surveying [4,5], rescue missions [6,7,8,9,10,11], border control [12] and other emerging civil applications.
Reliable navigation for autonomous or semi-autonomous operation is essential for these applications. Currently, UAVs rely heavily on an array of sensors for their navigation. The various navigation techniques can be divided into three groups: inertial navigation, satellite navigation and vision-based navigation [13]. The global positioning system (GPS), inertial measurement units (IMUs) and barometers are primarily used for determining the position, attitude and velocity of the aircraft. However, GPS is known for errors and drop-outs [14] due to signal loss and interference in forests, under tall buildings, in narrow canyons or in remote areas at particular times. IMUs provide only a limited period of accurate positioning after external aiding is lost, as they drift without bound from integrating cumulative errors over time [15].
Vision-based navigation systems are a promising research direction in the field of autonomous navigation. Vision sensors can provide real-time information about a dynamic surrounding environment and are resistant to conventional jamming. Vision sensors detect reflected or radiated photons in specific bands across the electromagnetic spectrum. Optical sensors perform detection in the visible spectrum that humans can see, while thermal sensors detect infrared wavelengths that are invisible to humans.
The predominance of research to date considers optical sensors, which require some form of illumination of the scene. There is a substantial gap in the ability to navigate at night, despite its potential to extend the operational period of vision systems.

2. Navigation Problems with Thermal Sensors

Although thermal cameras have been used in visually degraded conditions before, they were mainly used for purposes other than navigation, including inspection [16,17,18,19,20] and crop monitoring and water management in agriculture [21,22,23,24,25]. The main complication preventing their use for navigation in natural scenes is the limited number of features, such as edges or textures, compared with their visible-band counterparts [13]. Furthermore, early versions of thermal sensors included built-in internal corrections that dynamically changed the contrast in the images before output, violating the requirements of many vision algorithms.
Additionally, early sensors were large, preventing their use on small UAVs. Small yet powerful on-board processing hardware was also not readily available. Due to these constraints, many navigation algorithms were initially designed for unmanned ground vehicles (UGVs) rather than UAVs. Later, thanks to the introduction of smaller thermal sensors and more capable processing hardware, thermal UAV navigation techniques began to attract interest.
The number of research articles published on this topic has increased in recent years due to the availability of thermal sensors and robotics technology combined with navigation challenges in new applications. However, no single review has yet summarised the relevant articles with a focus on the integration of thermal sensors into navigation systems of UAVs.

2.1. Aims and Search Methodology

Considering the observations above, we provide a review focused on the integration of thermal sensors for navigation applications within the last decade, from 2010 to the present. Our paper addresses the hierarchy of issues for a thermal sensor in a navigation system, including the fundamental physics of operation, sensor configurations and computational aspects.
We identified relevant papers by searching Google Scholar and the university library database with the keywords “navigation”, “thermal imaging”, “long wave infrared”, “GPS denied”, “deep learning” and “vision-based techniques”, restricted to papers published between 2010 and 2021. The selected articles were then divided into different categories based on the type of algorithm used. Furthermore, sensor specifications and configuration aspects will be discussed in order to analyse which navigation applications can be achieved and the performance of each system.

2.2. Structure of the Paper

The paper is organised into thirteen sections. Section 3, Section 4 and Section 5 will focus on the development of commercial thermal sensor technologies, the physics concepts and the sensor configurations for different navigation applications.
Section 3 considers the thermal sensor developments in the last 10 years, from the oldest to the most recent studies. Section 4 introduces the fundamental concepts behind the electromagnetic and infrared spectra. After that, Section 5 will highlight some important features of thermal sensor configurations, including sensor calibration and the relevant aspects of built-in correction techniques.
After discussing hardware characteristics of thermal sensors, Section 6 presents the basic concepts of different algorithm types for vision-based systems. Section 7 presents works in Simultaneous Localisation and Mapping (SLAM). Section 8 presents works in optical flow, and Section 9 reviews works in neural networks.
Section 10 discusses the various roles used for thermal sensing in navigation, while Section 11 describes the difference in system requirements for the thermal image in different navigation approaches.
Section 12 and Section 13 present the discussion and our observations about future research directions of the field.

3. Thermal Sensor System Considerations for Navigation Applications

Although the history of thermal sensor technologies is well described in [26], it is worth considering the sensors in the context of navigation sensing in small to medium autonomous vehicles, particularly those that are airborne. This section discusses the specifications of different thermal sensors that are suitable for navigation applications, including cooled and uncooled sensor technologies, dimensions, weight, power consumption, resolution and effective frame rate.
Table 1 shows the specifications of all of the thermal sensors appearing in work we have reviewed from the last 10 years for navigation applications.

3.1. Cooled and Uncooled Sensors

A major practical classification of thermal technologies is cooled vs. uncooled sensors. A cooled thermal sensor has an integral cooling system to lower the sensor to cryogenic temperatures (120 K or −153 °C) in order to achieve a higher signal-to-noise ratio (SNR), thereby allowing higher thermal sensitivity, higher spatial resolution and higher frame rates. However, cryocoolers typically contain mechanical parts, produce additional heat on the other side of the circuit, contribute to larger size and weight, and result in higher power consumption of the imaging device. These characteristics might be tolerable in large vehicles, but such devices are likely to exceed the space, weight and power (SWAP) capacities of smaller multirotor and hand-launched drones.
Uncooled sensors, on the other hand, are smaller in both size and weight at the cost of inferior all-around performance. However, the study in [31], which compared a high-end cooled FLIR Phoenix to the more affordable uncooled Variocam system, showed that the uncooled thermal system could compensate for its lower resolution and sensitivity via further image processing. Furthermore, analysis of SNR data in the same study showed that the uncooled thermal sensor was still suitable for use in UAVs for navigation applications. It has been observed that all of the thermal sensors used for commercial mass-market purposes in the last decade have been uncooled.

3.2. Sensor Specification Constraints for Unmanned Platforms

Earlier thermal sensors were heavy and bulky. The earliest study used a 4.54 kg thermal sensor and was conducted on UGVs, where such size and weight were tolerable. For small and medium commercially available quadrotor drones, the recommended payload limit is usually less than 800 g. For example, the DJI Phantom 3 and Phantom 4 can safely carry payloads of 700 g and 800 g, respectively [32]. The entire payload also includes other components such as onboard computers, batteries and other sensors, so the size and weight of the thermal sensor alone required substantial reduction from early airborne implementations.
Other factors include effective frame rate, resolution and power consumption, which are related to each other. Higher frame rate and resolution require more computational effort from the onboard computer combined with higher power demand from the sensor itself, resulting in a bigger battery and a bulkier system. For highly dynamic systems, however, there is no substitute for frame rate.
In addition to flying platforms, when selecting a suitable sensor, the type of navigation algorithm is also a crucial factor. Map building algorithms such as SLAM require higher frame rate, higher resolution and are more computationally expensive. While the map-less techniques do not require very high resolution and frame rate and are cheap to execute, they tend towards navigation in relative terms rather than absolute. The details of the two approaches will be presented in Section 6.
Recent sensors from FLIR such as the FLIR Tau2, Boson and Lepton are significantly lighter and smaller while still maintaining resolution and frame rate. Furthermore, the prices of recent thermal sensors such as the FLIR Boson and the FLIR Lepton have become more attainable for research compared to the previous generation of sensors. Consequently, more researchers have begun to integrate thermal sensors into UAV navigation systems.

3.3. Platform Considerations

Different UAVs fit different missions, which require different system configurations, sizes and algorithms [33,34]. For example, for a search and rescue or survey mission in a confined area, such as underground in mines or in urban areas, a multirotor is a good choice due to its size and maneuverability. Multirotors tend to be smaller than fixed-wing aircraft and are not used for long-distance transits. They have a major advantage in their ability to stop and hover, which is a very specific navigation state resulting in behaviour focused on maintaining a stationary image and constant height.
On the other hand, for missions such as border control, search or survey over large open areas, a fixed-wing aircraft can be a better choice than a quadrotor due to its long operating time and superior range. In defence applications, fixed-wing aircraft are more often used due to their long range and endurance, high operating ceiling and the ability to carry large payloads of sensors and weapons. Fixed-wing UAVs require more complicated take-off and landing procedures, are more challenging to operate, are less compact and range in size from that of a human hand to a passenger airliner. With their need to stay in forward flight and requirement to fly longer distances, their navigation behaviour is focused on trajectories, wind and forward movement.

4. Physics of Thermal Sensors

Thermal sensors operate differently from optical sensors. Unlike optical sensors, thermal sensors rely on the emission of longer-wave infrared radiation rather than the reflection of shorter visual wavelengths. While thermal infrared wavelengths can reflect off surfaces, and images captured using thermal sensors can be somewhat affected by the surrounding environment, this is generally a minor effect. Given the emission-driven mode of the sensor, it is essential to examine the physics of thermal sensors and the emissivity values of different materials.
This section briefly introduces the fundamental concepts behind infrared sensors, including black body radiation theory, the electromagnetic spectrum and the emissivity value of different materials. All related concepts are well described in [35,36].

4.1. Black Body Radiation

A black body radiator is an object at near thermodynamic equilibrium that absorbs and emits all radiation frequencies [37]. At near thermodynamic equilibrium, the emitted radiation or thermal radiation can be expressed by Planck’s law [38]. Planck’s law expresses the spectral radiation emitted by a black body at thermodynamic equilibrium:
\beta(\nu, T) = \frac{2 h \nu^{3}}{c^{2}} \, \frac{1}{e^{h\nu/(kT)} - 1}
where
  • β(ν, T) is the spectral radiance of the object at temperature T (K) and frequency ν;
  • h is the Planck constant;
  • c is the speed of light in vacuum;
  • k is the Boltzmann constant;
  • ν is the frequency of the electromagnetic radiation;
  • T is the absolute temperature of the object.
When the temperature of a black body is several hundred kelvin, most of the emitted radiation is infrared. At higher temperatures, radiation is emitted at shorter wavelengths, extending into the visible region.
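As a numerical illustration of Planck's law above, the short Python sketch below evaluates the frequency-form spectral radiance at a visible and an LWIR wavelength for a room-temperature scene and a sun-like source. The constants are standard physical constants; the chosen temperatures and wavelengths are illustrative only.

```python
import numpy as np

# Physical constants (SI units)
H = 6.626e-34   # Planck constant (J s)
C = 2.998e8     # speed of light in vacuum (m/s)
K = 1.381e-23   # Boltzmann constant (J/K)

def spectral_radiance(nu, temperature):
    """Planck's law: spectral radiance (W sr^-1 m^-2 Hz^-1) of a black
    body at frequency nu (Hz) and absolute temperature (K)."""
    return (2.0 * H * nu**3 / C**2) / np.expm1(H * nu / (K * temperature))

# Compare a visible wavelength (0.55 um) with an LWIR wavelength (10 um)
# for a ~300 K scene and a ~6000 K sun-like source.
for temperature in (300.0, 6000.0):
    for wavelength in (0.55e-6, 10e-6):
        nu = C / wavelength
        radiance = spectral_radiance(nu, temperature)
        print(f"T = {temperature:6.0f} K, lambda = {wavelength * 1e6:5.2f} um: "
              f"B = {radiance:.3e} W sr^-1 m^-2 Hz^-1")
# At 300 K the visible-band radiance is vanishingly small, so essentially
# all of the emitted energy lies in the infrared, as stated above.
```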

4.2. Electromagnetic Spectrum

Figure 1 shows the different wavelengths in the electromagnetic spectrum. Radio and microwaves lie at the longer end of the spectrum of electromagnetic energy, while gamma rays and X-rays have very short wavelengths. Humans can only see a limited range of the spectrum, from 380 nm to 700 nm [39].
Infrared radiation was discovered in 1800 by William Herschel [41]. The thermal infrared band considered in this paper spans from 8 to 15 μm [42]. Most of the energy in this band is radiated as heat and can be observed both during the day and at night. Since the infrared spectrum has longer wavelengths than visible light, it is less attenuated by denser media such as vapour, dust or smoke [43]. This paper will focus on applications of the infrared spectrum.

4.3. Emissivity

The emissivity of an object is a measure of its ability to emit thermal energy [44]. An emissivity of 0% corresponds to a perfect thermal mirror that reflects all infrared energy, while 100% corresponds to a black body that absorbs and radiates all energy [38].
Table 2 shows the emissivities of some objects, both metal and non-metal, including polished or oxidised/roughened metal. It is clear that polished non-oxidised materials have lower emissivity values compared to oxidised materials. Non-metallic materials such as glass and water have a high emissivity value; thus, infrared wavelengths do not penetrate glass or water.
It is also apparent from Table 2 and the principle of black body radiation that thermal imaging is substantially different from optical imaging. The low emissivities of some manufactured surfaces but relatively high emissivities of natural surfaces show that thermal imaging devices will tend to observe scenes through radiation rather than reflection. Objects radiate energy absorbed from the sun earlier and reflect thermal radiation from other objects and the ground at quite modest levels, unless they are finished metal surfaces. Thermal scenes are usually equivalent to scenes composed of light sources if rendered in the optical domain, which has implications for how and why they are used for navigation.
Figure 2 shows an example from FLIR [46] of different materials, metal and non-metal, emitting different amounts of infrared radiation at the same temperature due to their different emissivity values.
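To make the effect of emissivity concrete, the short sketch below estimates grey-body radiant exitance, M = εσT⁴, for a few materials at the same temperature. The emissivity figures used are representative round numbers for illustration, not the exact entries of Table 2.

```python
# Grey-body radiant exitance M = emissivity * sigma * T^4.
# The emissivity values below are representative round numbers, not the
# exact entries of Table 2.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant (W m^-2 K^-4)

materials = {
    "polished aluminium": 0.05,
    "oxidised steel": 0.80,
    "water": 0.96,
}

TEMPERATURE = 300.0  # kelvin, roughly 27 degrees C
for name, emissivity in materials.items():
    exitance = emissivity * SIGMA * TEMPERATURE**4
    print(f"{name:18s}: {exitance:6.1f} W/m^2 emitted at {TEMPERATURE:.0f} K")
# Objects at the same temperature can therefore appear very different to a
# thermal camera purely because of their emissivity.
```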
In addition to the material of emitting objects, other factors that should be considered include the temperature, humidity, reflective surfaces and convection airflow of the intended operational environments [47]. This has substantial implications for navigation in natural environments. Cloud, rain and dust are likely to affect optical and thermal cameras.
In vision-based navigation algorithms, textures, corners and edges in the image are vital requirements. Hence, a thermal system will not provide much information in a uniform-temperature scene, such as water, snow or sand. Conditions with little temperature variation between day and night and with minimal solar radiation during the day might produce very little thermal contrast in imaged scenes, even if the scenes are not composed of a uniform material.
The study in [48] utilised this concept by applying thermal markers made of thin 5 mm acrylic sheets over objects in smoke-filled environments. Due to the high emissivity of acrylic, at 0.88 [49], the thermal markers are distinct in the thermal frame and could later be used for map-building SLAM techniques.

5. Thermal Sensor Configurations

Sensor configurations have a direct effect on how data are collected and processed. This section details the crucial concepts of thermal sensor calibration and their impact on vision-based navigation. Problematically for autonomous system applications, modern thermal sensors have external and internal correction techniques that are designed for human visualisation and inspection purposes rather than computer vision, and this difference and its implications are discussed in this section.

5.1. Sensor Calibration

Geometric calibration is an essential task when using a combination of cameras in navigation applications. A chessboard pattern is the standard target for optical camera calibration [50,51,52], but translating it to thermal cameras requires heated lamps or materials with different emissivity values, which makes the task not only more complex but also less accurate and more expensive [53,54].
The authors in [55,56] found that calibration results vary significantly between tested thermal cameras. Furthermore, the cameras showed very large decentering distortions and deviations along both image coordinate axes.
A trinocular vision system [57] combining optical and thermal cameras was calibrated with a plastic board containing 25 holes, each fitted with a miniature bulb to emit heat and light. This method successfully obtained the required calibration parameters for the thermal sensor in their application. Vidas et al. proposed that a geometric mask-based approach could obtain an improvement of 78% compared to the conventional method of a heated chessboard [58]. The work described an A2 board built from Medium-Density Fibreboard (MDF) due to its good thermal insulation properties; although it required substantial warming time, the board was laser cut with 40 × 40 mm squares.
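As an illustration of the calibration workflow, the sketch below runs the standard OpenCV chessboard pipeline on thermal frames of a heated or masked target. The file path, board geometry and the contrast-inversion step are assumptions for this example, not details taken from the cited works.

```python
import glob

import cv2
import numpy as np

# Hypothetical heated-chessboard captures; the path and board geometry are
# assumptions for this example.
PATTERN = (9, 6)       # inner corners per row and column
SQUARE_SIZE = 0.04     # 40 mm squares, in metres

# 3D coordinates of the planar target corners (z = 0).
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_SIZE

obj_points, img_points, image_size = [], [], None
for path in glob.glob("thermal_calib/*.png"):    # assumed 8 bit frames
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    image_size = gray.shape[::-1]
    # Thermal targets often show inverted contrast relative to a printed
    # chessboard, so both polarities are tried.
    for img in (gray, cv2.bitwise_not(gray)):
        found, corners = cv2.findChessboardCorners(img, PATTERN)
        if found:
            corners = cv2.cornerSubPix(
                img, corners, (5, 5), (-1, -1),
                (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
            obj_points.append(objp)
            img_points.append(corners)
            break

rms, camera_matrix, distortion, _, _ = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
print(f"RMS reprojection error: {rms:.3f} px")
print("Intrinsics:\n", camera_matrix, "\nDistortion:", distortion.ravel())
```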

5.2. Re-Scaling and Correction Techniques

Thermal infrared sensors typically sample radiometric data at 14 bit resolution [59], while modern electronic display standards and many computer vision libraries require an 8 bit input source. As a result, many previously proposed approaches rely on either Automatic Gain Control (AGC) or normalisation to convert 14 bit raw data to 8 bit.
When converting from 14 bit to 8 bit format, there is a 6 bit loss of information and a reduction in dynamic range, which degrades performance in many circumstances. The study in [60] showed that, with the same algorithm, using full radiometric thermal information as input produced better performance than a re-scaled version. It also showed that using re-scaled data may accumulate errors over time, resulting in thermal data being presented incorrectly.
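A minimal sketch of the re-scaling path is shown below: a raw 14 bit frame is min-max scaled to 8 bit, and a synthetic example demonstrates how a hot object entering the scene shifts the grey levels of an otherwise unchanged background. The frame sizes and raw count values are invented for illustration.

```python
import numpy as np

def rescale_to_8bit(raw_frame):
    """Min-max normalise a raw 14 bit radiometric frame (counts 0-16383)
    to 8 bit. Because the mapping depends on the scene extrema, the same
    raw count can land on different grey levels in consecutive frames."""
    lo, hi = int(raw_frame.min()), int(raw_frame.max())
    scaled = (raw_frame.astype(np.float32) - lo) / max(hi - lo, 1)
    return (scaled * 255.0).astype(np.uint8)

# Synthetic example: a hot object enters an otherwise unchanged scene.
rng = np.random.default_rng(0)
frame_a = rng.integers(7000, 7400, size=(120, 160), dtype=np.uint16)
frame_b = frame_a.copy()
frame_b[40:60, 60:90] = 12000   # hot object appears in frame B only

img_a, img_b = rescale_to_8bit(frame_a), rescale_to_8bit(frame_b)
# The background raw counts are identical in both frames, yet their 8 bit
# values collapse once the hot object stretches the dynamic range.
print("background mean, frame A:", img_a[:20, :20].mean())
print("background mean, frame B:", img_b[:20, :20].mean())
```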

5.2.1. Automatic Gain Control

Many thermal sensors rely on Automatic Gain Control (AGC) to convert raw radiometric data into usable 8 bit input for modern standard displays. The main purpose of AGC is to improve the contrast of an image based on the radiometric range of the observed scene when converting to 8 bit depth. The AGC is usually applied by default when there is a change in scene temperature due to hot or cold objects exiting or entering the field of view. A practical example shows that it comes at a cost: a substantial change in overall brightness between two consecutive frames is observed when a hot cup moves into a scene, as shown in Figure 3. The contrast change is likely to cause problems for many image matching techniques that are fundamental to visually aided navigation, due to the drastic changes in image appearance and the likelihood that some information might be discarded between frames. Although 8 bit processing may seem an arbitrary concern, a vast amount of software is written with 8 bit processing in mind, and hardware acceleration libraries make it a non-trivial consideration.
To solve this problem, some work-around methods have been proposed. The authors in [61] reduced the response time of the AGC so that the brightness does not change rapidly in order to perform matching of the features. However, this approach only reduces the problem and does not completely solve it. It also creates errors in image processing techniques that use spatiotemporal gradient information. Another approach is to set the range for the AGC while operating via some external means or manually [62]. Nevertheless, this requires prior knowledge of the environment or a new set of scene analysis logic, making it less adaptable to unknown environments.
Rosser et al. re-scaled one pair of 14 bit images at a time to work with feature detection for optical flow [63]. The technique works particularly well in OpenCV [64] by using Shi–Tomasi corner detection [65] and the Lucas–Kanade optical flow technique [66]. The technique is based on the maximum and minimum pixel intensities in a pair of images, which will be explained in Section 8.1.
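A minimal sketch of this pair-wise idea is given below, assuming raw 14 bit frames held as NumPy arrays: both frames of a pair are scaled with a shared minimum and maximum so that equal raw intensities map to equal 8 bit values, preserving brightness constancy between them. The function name and structure are illustrative, not the authors' code.

```python
import numpy as np

def rescale_pair(raw_prev, raw_next):
    """Re-scale two consecutive 14 bit frames with a SHARED minimum and
    maximum so that equal raw intensities map to equal 8 bit values in
    both images, preserving the brightness-constancy assumption that
    optical flow relies on."""
    lo = int(min(raw_prev.min(), raw_next.min()))
    hi = int(max(raw_prev.max(), raw_next.max()))
    scale = 255.0 / max(hi - lo, 1)

    def to_8bit(frame):
        return ((frame.astype(np.float32) - lo) * scale).astype(np.uint8)

    return to_8bit(raw_prev), to_8bit(raw_next)
```

The resulting 8 bit pair can then be fed to the corner detection and optical flow calls illustrated in Section 8.1.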

5.3. Flat Field and Non-Uniformity Corrections

A correction technique to fix pattern noise that accumulates over time in thermal sensor systems is Flat Field Correction (FFC) for sensors with a shutter, or Non-Uniformity Correction (NUC) for sensors without a shutter. During operation, the FFC/NUC freezes the sensor output for a short time (0.3–2 s), depending on the camera model, in order to correct for errors. This operation is essential for stationary applications where the sensor captures the same scene for a long time. However, it comes with the downside of occasional data interruption, which is potentially undesirable for navigation applications. Vidas et al. designed a thermal odometry system that performed NUC only when needed, depending on the scene and pose [67].
On the other hand, some recent sensors such as the FLIR Lepton 3.5 [59] have a built-in internal calibration algorithm capable of automatically adjusting for drift effects, which can substitute for FFC/NUC in moving applications. As described in [63], the FFC was not needed since the sensor was mounted on a constantly moving aircraft.

6. Vision-Based Navigation Systems

Vision-based systems rely on one or more visual sensors to acquire information about the environment. Compared to other sensing systems such as GPS, LIDAR, IMUs or conventional sensors, visual sensors obtain much more information, such as the colours and texture of the scene. The available visual navigation techniques can be divided into three categories: map-based, map-building and mapless systems.

6.1. Map Based Systems

Map-based systems rely on knowing the spatial layout of the operating environment in advance. Hence, the utility of this type of system is limited in many practical situations. At the time of writing, no map-based navigation work using thermal cameras has been proposed.

6.2. Map-Building Systems

Map-building systems build a map while operating, and they are becoming more popular with the rapid advancement of SLAM algorithms [68]. Early SLAM systems relied on a system of ultrasonic sensors, LIDAR or radar [69]. However, this type of payload limits their use in small UAVs. Therefore, more researchers have shown interest in single and multiple camera systems for visual SLAM. Related works will be presented in Section 7.

6.3. Mapless Systems

A mapless navigation system can be defined as a system that operates without a map of the environment. The system operates by extracting features from the observed images. The two most common techniques in mapless systems are optical flow and feature extraction techniques. The related works will be presented in Section 8.

7. Simultaneous Localisation and Mapping

Simultaneous Localisation and Mapping (SLAM) is a mapping technique for mobile robots or UAVs to generate maps of their operating environments. The generated map is used to find the relative location of the robot in the environment in order to achieve appropriate path planning (localisation). The first SLAM algorithm was introduced in [70], where the authors implemented the Extended Kalman Filter technique, EKF-SLAM.
In early works, many different types of sensor such as LIDAR, ultrasonic, inertial sensors or GPS were integrated into the SLAM system. Montemerlo et al. [71] proposed a technique named FastSLAM, a hybrid approach utilising both the Particle Filter and Extended Kalman filter techniques. The same team later introduced a more efficient version: FastSLAM2.0 [72]. Dellaert et al. [73] proposed a smoothing technique called Square Root Smoothing and Mapping (SAM) that used the square root smoothing technique to solve the SLAM problem in order to improve the efficiency of the mapping process. Kim et al. [74] proposed a technique based on unscented transformation called Unscented FastSLAM (UFastSLAM), which is more robust and accurate compared to FastSLAM2.0.
Recently, SLAM systems using cameras have been actively explored with the hope of achieving reduced weight and system complexity. When SLAM takes only visual images as input, it is referred to as visual SLAM (vSLAM) [75]. Various low-computation techniques that are suitable for UAVs with limited onboard resources have been proposed in the literature. A typical SLAM application for small UAVs is visual odometry.

7.1. Combined Spectrum Techniques

In early works, some authors tried to utilise both the LWIR and visible spectra to enhance or recover features that were hidden due to external factors such as fog or smoke.
Maddern and Vidas [76] proposed a technique to combine 8 bit thermal and RGB images for UAV navigation. The study showed that there are extreme changes in the visible spectrum between day and night, while the thermal spectrum remains consistent over time but with lower contrast. The combined spectrum produced better results than algorithms that used visual or thermal frames alone. Poujol et al. [77] showed that combining the visual and thermal spectra can greatly improve the performance of classic visual odometry approaches. The study used monochrome threshold-based image fusion [78] together with monocular visual odometry [79]. The data were collected from an electric vehicle moving around a city. The experimental results show that the fused images could provide extra data to achieve more robust solutions.
Brunner et al. [80] presented a preliminary evaluation study combining optical and thermal cameras for localisation of an autonomous ground vehicle (AGV) in an environment filled with smoke or dust. The study showed that relative motion could not be estimated from visual images in that environment, while it could be estimated from thermal images, although with less accuracy. The authors in [81] proposed a technique to combine both the LWIR and visible spectra in order to enhance a VSLAM algorithm by rejecting low-quality images that may have introduced localisation errors. The technique was tested in several adverse conditions, such as smoke, fire, dusk and low light, that have unfavourable effects on both the thermal and visual spectra.
A flexible SLAM network described in [82] utilised both thermal and visual information to build a colour map of the environment in low-illumination conditions. Multispectral stereo odometry from optical and thermal sensors was introduced in [83] for a ground vehicle. Khattak et al. [84] relied on a combination of a radiometric thermal sensor, the FLIR Tau2, and a visual camera to provide navigation capability for a small quadrotor in a dark, dust-filled indoor environment. An Intel NUC-i7 computer (NUC7i7BNH) was installed on the UAV to perform all computation onboard. Thermal frames enabled robust feature selection, combined with an Extended Kalman Filter for odometry estimation by the drone. The study showed that the thermal sensor helped the fusion system to work reliably in low-visibility environments.

7.2. Thermal Spectrum Techniques

This section presents work and algorithms that use thermal sensors as the only source of data, which can be divided into two categories: techniques that use 8 bit re-scaled data and those that make use of higher bit-depth radiometric data.

7.2.1. Re-Scaled Data

Mouats et al. [61] also developed a thermal stereo-odometry system for UAV applications based on localisation solutions from a pair of thermal images. The authors used a pair of re-scaled 8 bit images with AGC applied and the FFC turned off. To compensate for the sudden changes in contrast from the AGC, they employed a technique to reduce the response time of the AGC so that the feature matching algorithms could still function. However, this made the algorithm less adaptive to the environment.
Another study by Khattak et al. [48] used an LWIR sensor alone to detect low thermal conductivity fiducial markers in order to localise in a dark indoor scene. The team attached thermal fiducial markers to fixed objects around the environment in an incremental manner, with each new marker observed at the same time as previously placed ones. The platform poses and coordinates estimated with this method were shown to be on par with the ground truth from an Inertial Measurement Unit (IMU).
The ROVIO [60] algorithm was shown to work well with re-scaled 8 bit images in indoor environments. The algorithm was then modified to work with full-scale radiometric data and named ROTIO, with ground truth provided by a motion capture system. The results show the advantages of using full radiometric data. The FFC was turned off to prevent tracking loss due to data interruption.

7.2.2. Full Radiometric Data

Shin and Kim were the first to propose a thermal-infrared SLAM system, using LIDAR measurements for 6-DOF motion estimation together with full radiometric 14 bit raw data [85]. The experimental results show that the 14 bit system overcame the limitations of the re-scaling process and was more resilient to data loss. Also relying on full radiometric data, Khattak et al. [86] proposed a thermal/inertial system that utilised the full range of radiometric data for odometry estimation. The study showed that using full radiometric images was more resilient against the loss of data caused by sudden changes from the AGC re-scaling process.
Although the previous works show promising outcomes, the SLAM algorithms are computationally demanding and many require high resolution thermal images. Many aforementioned works use high resolution thermal cameras such as the FLIR Tau2, which costs thousands of dollars. Furthermore, a compact yet powerful onboard computer system is also expensive in terms of money as well as space, weight and power. All of these are difficult challenges for integration into small UAVs.

8. Optical Flow

Optical flow is a map-less measurement technique defined as the pattern of apparent movement of brightness across an image [87]. Optical flow can be used in navigation solutions inspired by insects such as the honeybee [88]. The honeybee navigation system relies on optical flow for grazing landings [89,90] and obstacle avoidance [91]. Unlike SLAM, optical flow algorithms require far fewer computational resources and do not require very high resolution input images. Additionally, optical flow algorithms, such as the sparse Lucas–Kanade technique in OpenCV, are known for their efficiency and accuracy in many applications [63,92,93,94,95,96,97]. Hence, optical flow based systems can satisfy both the weight and size constraints for integration into small UAV navigation systems.

8.1. Thermal Flow

The term “Thermal Flow” (TF) applies to LWIR-based flow sensing. Rosser et al. proposed a technique to calculate optical flow from re-scaled 8 bit thermal data [63]. Optical flow estimation operates on several assumptions, including brightness consistency across two images. However, the effect of the AGC when re-scaling to 8 bit violates this crucial requirement. Hence, the authors devised a technique to preserve contrast across each pair of images, as shown in Figure 4.
In their work, thermal flow relied on the pyramidal Lucas–Kanade [66] algorithm in OpenCV [98]. Good tracking points were found using the Shi–Tomasi corner detection algorithm [65]. Table 3 shows the parameter settings for the LK and tracking functions in OpenCV.
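The sketch below shows how such a pipeline can be wired together with OpenCV's Shi–Tomasi detector and pyramidal LK tracker. The parameter values here are common OpenCV choices used for illustration only and are not the exact settings listed in Table 3.

```python
import cv2
import numpy as np

# Parameter values are common OpenCV choices used for illustration only;
# they are not the exact settings listed in Table 3.
feature_params = dict(maxCorners=100, qualityLevel=0.1,
                      minDistance=7, blockSize=7)
lk_params = dict(winSize=(21, 21), maxLevel=3,
                 criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT,
                           30, 0.01))

def sparse_thermal_flow(img_prev, img_next):
    """Sparse flow between two 8 bit thermal frames (e.g. the output of a
    pair-wise re-scaling step): Shi-Tomasi corners tracked with the
    pyramidal Lucas-Kanade algorithm."""
    p0 = cv2.goodFeaturesToTrack(img_prev, mask=None, **feature_params)
    if p0 is None:
        return np.zeros(2)
    p1, status, _err = cv2.calcOpticalFlowPyrLK(img_prev, img_next, p0, None,
                                                **lk_params)
    good = status.ravel() == 1
    if not good.any():
        return np.zeros(2)
    # Median displacement as a robust estimate of bulk image motion.
    return np.median((p1[good] - p0[good]).reshape(-1, 2), axis=0)
```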
They built a low-cost system consisting of a Raspberry Pi with a low-resolution radiometric FLIR Lepton 3.5 sensor. The low-cost system was designed to mimic the operation of the PX4Flow [93], a low-cost and lightweight optical flow sensor system, as shown in Figure 5. That sensor was designed to produce reliable two-dimensional flow vectors for small hovering platforms such as pocket drones [99,100,101], quadrotors [99,102] and small fixed-wing aircraft [92]. Because the PX4Flow sensor operates in the visual spectrum, its operation is heavily compromised in low-light conditions, but it was useful as a ground truth for evaluating the thermal flow sensor performance during the day.
Rosser et al. [63] mounted the payload on a fixed wing aircraft to reduce lateral drift. The aircraft took two flights, one during the day with sufficient illumination and one later in the night after dark. The results showed that the TF system operated equivalently compared to the optical flow sensor during daylight while also being able to operate at night. Figure 6 shows the results from this study.
The results in Figure 6a show that thermal flow performs comparably to optical flow in the X and Y directions. The third plot shows the Manhattan displacement for both, indicating that the signals from OF and TF are closely matched. The fourth plot indicates a strong normalised cross-correlation value of 0.86, showing that the two signals represent the same phenomenon.
The results in Figure 6b show that TF works to some degree at night, whereas OF clearly does not. The Manhattan displacement and the low cross-correlation value of 0.3 indicate that the signals are no longer similar. It can be concluded that TF performs well during the day and still works to some degree during the night.
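For reference, the normalised cross-correlation statistic quoted above (0.86 by day, 0.3 by night) can be computed in a few lines; the sketch below assumes the two displacement signals are available as equal-length NumPy arrays.

```python
import numpy as np

def normalised_cross_correlation(signal_a, signal_b):
    """Zero-lag normalised cross-correlation of two equal-length
    displacement signals: 1.0 for identical shape, near 0 for unrelated."""
    a = (signal_a - signal_a.mean()) / (signal_a.std() + 1e-12)
    b = (signal_b - signal_b.mean()) / (signal_b.std() + 1e-12)
    return float(np.mean(a * b))
```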

9. Deep Learning

Deep learning and neural networks have been the primary choice in many vision tasks since the introduction of the convolutional neural network (CNN) architecture AlexNet [103], which won the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) by a large margin. As a result, many researchers have since applied CNNs to many computer vision tasks with great success.

9.1. Thermal Image Enhancement

Choi et al. proposed a neural network to enhance low-resolution thermal images [104]. The network was inspired by an RGB counterpart from [105] but with a much lower computational demand. The network consisted of three convolutional layers to extract a set of feature maps, followed by a final layer that combines the predictions to reconstruct the high-resolution output.
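A minimal PyTorch sketch of such an architecture is given below; the channel widths, kernel sizes and the choice of framework are assumptions for illustration and are not necessarily those used in [104].

```python
import torch
import torch.nn as nn

class ThermalEnhancementNet(nn.Module):
    """SRCNN-style enhancement network: feature extraction layers followed
    by a reconstruction layer. Channel widths and kernel sizes are
    illustrative only."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 64, kernel_size=9, padding=4), nn.ReLU(inplace=True),
            nn.Conv2d(64, 32, kernel_size=5, padding=2), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        )
        self.reconstruct = nn.Conv2d(32, 1, kernel_size=5, padding=2)

    def forward(self, x):
        # x: an upsampled low-resolution thermal frame, shape (N, 1, H, W),
        # with values normalised to [0, 1].
        return self.reconstruct(self.features(x))

# Example forward pass with a dummy 256 x 256 thermal frame.
model = ThermalEnhancementNet()
dummy = torch.rand(1, 1, 256, 256)
print(model(dummy).shape)   # torch.Size([1, 1, 256, 256])
```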
The study used the RGB training dataset from [106] for the entire training process. During the testing phase, a test dataset consisting of both RGB and thermal images from [107,108,109] was used. The model was evaluated in three different scenarios (pedestrian detection, image registration and visual odometry), which showed that the proposed technique was not only capable of reproducing higher resolution images but did so with lower noise and fewer unwanted artifacts.

9.2. Deep Learning Neural Network Based Odometry

Saputra et al. [110] were the first to propose a DNN-based odometry architecture using thermal images as input. They proposed a DNN-based method for thermal-inertial odometry (DeepTIO) using hallucination networks. They adapted an existing state-of-the-art Visual/Inertial Odometry network (VINet) [111], which takes visual images as input, and combined it with selective fusion [112]. Since radiometric data contain only one channel, two extra channels were duplicated from the first one, giving the three channels expected by the network. A hallucination network was implemented to provide the missing complementary information. The model consisted of a feature encoder, a selective fusion module and a pose regressor.
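The channel-duplication step mentioned above is simple to reproduce; the sketch below normalises a single-channel radiometric frame and repeats it across three channels. It is an illustration only, not the authors' implementation.

```python
import numpy as np

def to_three_channels(radiometric_frame):
    """Normalise a single-channel radiometric thermal frame and repeat it
    across three channels so it matches the RGB-shaped input expected by
    the network (illustrative only)."""
    frame = np.asarray(radiometric_frame, dtype=np.float32)
    span = float(frame.max() - frame.min())
    frame = (frame - frame.min()) / max(span, 1.0)
    return np.repeat(frame[..., np.newaxis], 3, axis=-1)   # H x W x 3
```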
Hand-held thermal data were collected with a FLIR E95 operating at 60 fps with 464 × 348 resolution in five different buildings, some filled with smoke. Furthermore, a FLIR Boson with 640 × 512 resolution was placed on a mobile robot that operated in different testing rooms with various obstacles and lighting conditions. The results show that DeepTIO outperformed VINet in most scenarios. In particular, the performance of VINet decayed when there was insufficient illumination, while DeepTIO could still produce reliable and accurate trajectories. However, the DeepTIO network only worked well at 4–5 frames per second; anything lower or higher resulted in a decrease in accuracy. While this is not explained, it indicates some effect of camera noise and of image changes caused by platform motion.

10. Roles of Thermal Sensors in Navigation Systems and Applications

This section discusses the role of thermal sensors in the literature in order to explore how thermal systems have evolved in the last 10 years.
Early use of thermal sensors in robotics saw sensors mounted on UGVs due to their size and weight. The earliest relevant study was conducted on a UGV [80]. Later, with the introduction of lighter and smaller sensors, UAVs became the primary platform. Ultimately, there are cheaper and higher resolution options for UGVs, such as carrying illumination (headlights) for optical sensors.
Initially, researchers explored the possibility of combining both the visual and infrared spectra in normal conditions where the optical sensor would still operate. The role of thermal sensors in this case was to identify and reject bad batches of data [76,81]. Later, thermal sensor systems were integrated for use in reduced illumination conditions where optical sensor performance was degraded. Thermal sensors played a bigger role in this scenario, with thermal data fused with optical data to construct 3D environments and perform visual odometry [82,83]. Additionally, thermal data were used to correct and compensate for missing optical data due to the lack of illumination in the environment [84].
Later, thermal sensor based navigation systems were tested in total darkness or visually degraded environments, such as rooms filled with smoke and dust, where the optical sensor could not reliably capture the scene. In this case, thermal data were the only input to the navigation algorithms. At first, re-scaled 8 bit thermal data were utilised for stereo odometry [61,83], SLAM [48] and optical flow [63]. The disadvantages of 8 bit re-scaled data have been discussed in Section 5.2.1.
In deep learning, neural network and mapless approaches [63,104,110], only 8 bit data have been used, and no work has yet explored the possibility of using full radiometric data in this field. It is likely that this is partly driven by the availability of appropriate software platforms.
With the introduction of radiometric sensors, some groups have been experimenting with adapting their work to include full-resolution 14 bit thermal data. Results in [60,85,86] have shown that full radiometric 14 bit thermal data can improve the consistency of an algorithm over time and overcome the need for AGC correction techniques.

11. Navigation Approaches with Respect to System Configuration

There are two main navigation approaches found in this survey of the literature: VSLAM and variations on odometry. The former can provide an absolute navigation solution, and the latter provides a dead reckoning solution that is relative to a starting point. The former, when it fails, is likely to result in catastrophic navigation failure with the navigation filter diverging or finding a wrong solution (in the absence of other safeguards), and the latter inexorably drifts over time.
Because map-building techniques need to detect specific features, higher resolution and higher frame rate thermal sensors are more desirable for them. Hence, it has been observed that map-building techniques relied on better thermal sensors than map-less techniques.

11.1. VSLAM

Brunner et al. [80] conducted a study to investigate the use of thermal images to compensate for missing and erroneous information in RGB images due to adverse external lighting effects. The same group later developed a framework to extract matching features from thermal data [81]. Thermal data could compensate for the parts of an RGB image lost to smoke or dust. The results demonstrated that the technique greatly improved SLAM performance. In the study, a low frame rate Raytheon Thermal-Eye 2000B thermal system was mounted on the UGV and used to capture thermal frames at different times of the day. A lower frame rate of 12.5 fps was sufficient in this case due to the slow movement of the platform, and the framework was not intended for real-time use.
Later, the study in [82] developed a faster and more accurate framework for feature extraction and matching SLAM. The framework was shown to be less computationally demanding while achieving better accuracy. The thermal system specifications are not known due to the authors’ reliance on their own dataset.
The authors in [85] integrated LIDAR into the thermal system to introduce depth information into their SLAM system. The team also used a radiometric FLIR A65 sensor with full 14 bit thermal data in their experiment. The SLAM algorithm was modified to work with 14 bit thermal data and demonstrated better overall performance compared to previous 8 bit SLAM techniques.

11.2. Odometry

The team in [83] proposed a combined-spectrum visual odometry system for ground vehicles. The study used several thermal and optical sensors in their system configuration to collect data. Log-Gabor wavelets and the optical LK algorithm were used for feature extraction and matching. The study showed that thermal data can enhance the performance of a stereo-odometry system.
Later, the same team adapted their techniques to UAVs with a lighter and smaller thermal sensor, the FLIR Tau2. The study relied on 8 bit processed thermal frames. To overcome the troublesome AGC correction, the AGC threshold and reaction time were modified to prevent sudden changes in image contrast. This modification was shown to work with their collected dataset but did not completely solve the problem.
Using the same sensor, the same group performed three further studies on odometry in order to estimate the state of UAVs. Firstly, the authors [84] utilised 8 bit thermal data with the extended Kalman filter framework for feature extraction and matching. Later, they experimented with thermal markers [48] with very high emissivity, as described in Section 4.3. The thermal markers were applied throughout the test site as distinctive landmarks, visible in thermal frames, that could be used for odometry estimation.
In their most recent study [86], the team proposed a thermal-inertial odometry estimation framework utilising full 14 bit thermal data. The odometry of the platform was estimated by fusing inertial measurements and jointly minimising the reprojection errors of 3D landmarks and the inertial measurement errors. The 14 bit system was shown to be more resilient to the loss of information caused by the re-scaling process.

11.3. Other Applications

Rosser et al. [63] introduced the concept of thermal flow, which is optical flow estimation from thermal imaging. They introduced a bit-depth conversion process that maintained contrast across a pair of images (Section 8.1). Due to the nature of optical flow algorithms and odometry applications, a lower resolution and frame rate thermal sensor was still sufficient. This is also the only study that used a fixed-wing aircraft.
In the important domain of deep learning, Saputra et al. [110] utilised high-end commercial thermal OEM modules, the FLIR Boson and the FLIR E95, at a frame rate of 60 fps for data collection.
Finally, Table 4 shows a summary of the works reviewed in this article.

12. Discussion

Despite thermal cameras having been available for research use for many years, there appears to have been limited effort to date on thermal imaging for UAV navigation. One of the reasons is the popularity of very low-cost GPS receivers, despite the challenge presented when GPS is unavailable.
The cost of thermal sensors is prohibitively high compared to visible light cameras. For example, the FLIR Tau2 cost almost $5000 as of June 2021 [113]. Hence, it can greatly increase the initial cost of a research project or dramatically increase the cost of a commercial system.
High-performance thermal sensors such as the FLIR Tau2 and the FLIR Boson are under export control by the United States Department of Commerce [114]. For example, all uncooled thermal sensors with frame rates above 9 Hz can only be exported to Strategic Trade Authorized (STA) countries. Some territories are completely barred from importing any thermal systems, while for other countries the sensors are modified to lower frame rates. Lower frame rates are undesirable for aerial applications because many algorithms expect frame rates higher than the dynamics of the system.
Night operations of UAVs for conducting experimentation are also a challenge. Many jurisdictions require the operator to have an appropriate operator license. For example, in Australia, according to the Civil Aviation and Safety Authority (CASA) [115], only trained pilots are allowed to operate the drones at night and must be far from urban areas. This introduces challenges in data collection, trial and testing processes.
Another issue is the resolution limitation of available thermal sensors. The highest resolution radiometric LWIR sensor from FLIR that can satisfy the size and weight constraints of small drone applications is the FLIR Tau2, at 640 × 480 pixels. In comparison, the RGB Picam v2 [116] has ten times the total pixels at less than one hundredth of the price. Additionally, thermal data contain less information to work with, such as texture, colour or edges [117]. For example, objects with colour textures in colour images will usually appear uniform in thermal data, which results in fewer detected features in thermal imagery.
Thermal images change in appearance over time due to the heating and cooling cycles of the scene. Unlike reflected visible light, thermal landmark data are highly influenced by uncontrolled external sources. For example, as shown in [47], thermal data can be influenced by relative humidity, air convection, reflective surfaces and other heat sources such as the sun. As a result, thermal landmarks can be vastly different from early morning to late afternoon, from which it is reasonable to expect problems with feature-based tracking algorithms such as SLAM and optical flow.
Correction and conversion techniques such as AGC and FFC/NUC also have a negative impact on image repeatability in re-scaled 8 bit data. Even though some workaround methods have been proposed to overcome this issue, full radiometric data are enormously favoured for producing consistent results. The downside of using full thermal data is that it may not work with standard computer vision libraries such as OpenCV. As a result, the development phase can take longer and require more effort. Nevertheless, we strongly recommend that future studies use full radiometric raw data.

13. Conclusions

Despite the potential of thermal imaging, limited effort has been applied to visual navigation to date. Many challenges remain to be fully overcome, including the higher price tag of the sensors, difficulty in obtaining sensors in some parts of the world, thermal landmarks changing through diurnal and seasonal cycles, lack of texture and low resolution, and uncontrollable internal image re-scaling and correction.
Thermal sensors can provide valuable information for navigation fusion systems. Thermal sensors can reveal hidden textures behind fog, smoke or darkness, and thermal masses underground, which enhances the overall performance of the fusion system in visually degraded conditions.
With the introduction of more portable radiometric sensors, standalone thermal sensor systems can produce good results in visual odometry, VSLAM, optical flow and deep learning. Furthermore, after presenting problems with associated re-scaled thermal data, we highlighted that using full radiometric data should be selected to bypass the re-scaling processes in order to increase the reliability of the system.
The field is not yet mature. Thermal sensors have not been applied in every way that optical sensors have been used. Their different mode of operation, re-radiation rather than reflection, has not been exploited apart from offering the ability to see in conditions that defeat optical sensors. Thermal data change through the heating and cooling cycles of the day, and this effect, along with seasonal variation, remains to be exploited. While the environments in which optical sensors are compromised are well known to most humans before even running an experiment, thermal sensors require research and calculation in order to establish where success and failure might occur. As a consequence, thermal sensor strengths and weaknesses for navigation have not been comprehensively considered.
To date, there has been limited effort on researching deep learning techniques that show potential at low resolutions. Considering how successful deep learning and convolutional neural networks have been in the computer vision field, more work needs to be performed in order to overcome the challenges of low resolution in thermal imagery.
Similarly, attention should be given to map-less systems due to their low computational demand and robustness. There is also limited research on fixed-wing aircraft and unmanned ground vehicles. The use of cooled thermal sensors should also be considered due to their superior noise performance and resolution, particularly in cold conditions and low thermal contrast situations.
Our future work will focus on low resolution and low cost thermal sensors as an adjunct to existing navigation systems. We believe this area has the largest opportunity to provide enough value in achieving ubiquitous thermal sensor integration in UAV navigation systems.

Author Contributions

T.X.B.N. performed the literature review and wrote the draft manuscript. K.R. and J.C. reviewed the work and contributed with valuable advice. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by an Australian Government Research Training Program (RTP) Scholarship.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
LWIR: Long Wavelength Infrared;
AGC: Automatic Gain Control;
NUC: Non-Uniformity Correction;
UAV: Unmanned Aerial Vehicle;
UGV: Unmanned Ground Vehicle;
SLAM: Simultaneous Localisation and Mapping;
TF: Thermal Flow;
OF: Optical Flow;
LK: Lucas–Kanade.

References

  1. Keane, J.F.; Carr, S.S. A brief history of early unmanned aircraft. Johns Hopkins APL Tech. Dig. 2013, 32, 558–571.
  2. Nikolic, J.; Burri, M.; Rehder, J.; Leutenegger, S.; Huerzeler, C.; Siegwart, R. A UAV system for inspection of industrial facilities. In Proceedings of the 2013 IEEE Aerospace Conference, Big Sky, MT, USA, 2–9 March 2013; pp. 1–8.
  3. Omari, S.; Gohl, P.; Burri, M.; Achtelik, M.; Siegwart, R. Visual industrial inspection using aerial robots. In Proceedings of the IEEE 2014 3rd International Conference on Applied Robotics for the Power Industry, Foz do Iguassu, Brazil, 14–16 October 2014; pp. 1–5.
  4. Hodgson, J.C.; Baylis, S.M.; Mott, R.; Herrod, A.; Clarke, R.H. Precision wildlife monitoring using unmanned aerial vehicles. Sci. Rep. 2016, 6, 1–7.
  5. Gonzalez, L.F.; Montes, G.A.; Puig, E.; Johnson, S.; Mengersen, K.; Gaston, K.J. Unmanned aerial vehicles (UAVs) and artificial intelligence revolutionizing wildlife monitoring and conservation. Sensors 2016, 16, 97.
  6. Scherer, J.; Yahyanejad, S.; Hayat, S.; Yanmaz, E.; Andre, T.; Khan, A.; Vukadinovic, V.; Bettstetter, C.; Hellwagner, H.; Rinner, B. An autonomous multi-UAV system for search and rescue. In Proceedings of the First Workshop on Micro Aerial Vehicle Networks, Systems, and Applications for Civilian Use, Florence, Italy, 18 May 2015; pp. 33–38.
  7. Al-Naji, A.; Gibson, K.; Lee, S.H.; Chahl, J. Monitoring of cardiorespiratory signal: Principles of remote measurements and review of methods. IEEE Access 2017, 5, 15776–15790.
  8. Perera, A.G.; Law, Y.W.; Chahl, J. Drone-action: An outdoor recorded drone video dataset for action recognition. Drones 2019, 3, 82.
  9. Al-Naji, A.; Perera, A.G.; Mohammed, S.L.; Chahl, J. Life signs detector using a drone in disaster zones. Remote Sens. 2019, 11, 2441.
  10. Al-Naji, A.; Perera, A.G.; Chahl, J. Remote monitoring of cardiorespiratory signals from a hovering unmanned aerial vehicle. Biomed. Eng. Online 2017, 16, 1–20.
  11. Molina, P.; Colomina, I.; Victoria, T.; Skaloud, J.; Kornus, W.; Prades, R.; Aguilera, C. Searching lost people with UAVs: The system and results of the close-search project. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, 39, 441–446.
  12. Bein, D.; Bein, W.; Karki, A.; Madan, B.B. Optimizing border patrol operations using unmanned aerial vehicles. In Proceedings of the IEEE 2015 12th International Conference on Information Technology-New Generations, Las Vegas, NV, USA, 13–15 April 2015; pp. 479–484.
  13. Lu, Y.; Xue, Z.; Xia, G.S.; Zhang, L. A survey on vision-based UAV navigation. Geo-Spat. Inf. Sci. 2018, 21, 21–32.
  14. Jiang, Z.; Groves, P.D. NLOS GPS signal detection using a dual-polarisation antenna. GPS Solut. 2014, 18, 15–26.
  15. Van Nee, R.D. Multipath effects on GPS code phase measurements. Navigation 1992, 39, 177–190.
  16. Zefri, Y.; ElKettani, A.; Sebari, I.; Ait Lamallam, S. Thermal infrared and visual inspection of photovoltaic installations by UAV photogrammetry—Application case: Morocco. Drones 2018, 2, 41.
  17. Quater, P.B.; Grimaccia, F.; Leva, S.; Mussetta, M.; Aghaei, M. Light Unmanned Aerial Vehicles (UAVs) for cooperative inspection of PV plants. IEEE J. Photovoltaics 2014, 4, 1107–1113.
  18. Zhang, J.; Jung, J.; Sohn, G.; Cohen, M. Thermal infrared inspection of roof insulation using unmanned aerial vehicles. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 40, 381.
  19. Luque-Vega, L.F.; Castillo-Toledo, B.; Loukianov, A.; Gonzalez-Jimenez, L.E. Power line inspection via an unmanned aerial system based on the quadrotor helicopter. In Proceedings of the IEEE MELECON 2014—2014 17th IEEE Mediterranean Electrotechnical Conference, Beirut, Lebanon, 13–16 April 2014; pp. 393–397.
  20. Martinez-De Dios, J.; Ollero, A. Automatic detection of windows thermal heat losses in buildings using UAVs. In Proceedings of the IEEE 2006 World Automation Congress, Budapest, Hungary, 24–26 July 2006; pp. 1–6.
  21. Rud, R.; Cohen, Y.; Alchanatis, V.; Levi, A.; Brikman, R.; Shenderey, C.; Heuer, B.; Markovitch, T.; Dar, Z.; Rosen, C.; et al. Crop water stress index derived from multi-year ground and aerial thermal images as an indicator of potato water status. Precis. Agric. 2014, 15, 273–289.
  22. Möller, M.; Alchanatis, V.; Cohen, Y.; Meron, M.; Tsipris, J.; Naor, A.; Ostrovsky, V.; Sprintsin, M.; Cohen, S. Use of thermal and visible imagery for estimating crop water status of irrigated grapevine. J. Exp. Bot. 2007, 58, 827–838.
  23. DeJonge, K.C.; Taghvaeian, S.; Trout, T.J.; Comas, L.H. Comparison of canopy temperature-based water stress indices for maize. Agric. Water Manag. 2015, 156, 51–62.
  24. Allen, R.G.; Tasumi, M.; Morse, A.; Trezza, R.; Wright, J.L.; Bastiaanssen, W.; Kramber, W.; Lorite, I.; Robison, C.W. Satellite-based energy balance for mapping evapotranspiration with internalized calibration (METRIC)—Applications. J. Irrig. Drain. Eng. 2007, 133, 395–406.
  25. Gowda, P.H.; Chavez, J.L.; Colaizzi, P.D.; Evett, S.R.; Howell, T.A.; Tolk, J.A. ET mapping for agricultural water management: Present status and challenges. Irrig. Sci. 2008, 26, 223–237.
  26. Corsi, C. History highlights and future trends of infrared sensors. J. Mod. Opt. 2010, 57, 1663–1686.
  27. FLIR TAU 2 640 × 512 50 MM 12.4°HFOV—LWIR Thermal Imaging Camera Core 30 Hz. Available online: https://www.oemcameras.com/flir-tau-2-640-50mm-thermal-imaging-camera-core.h (accessed on 24 August 2021).
  28. Viper FLIR A65 45—FLIR A65 f = 13 mm (45 Degree lens)—Fixed Focus 640 × 512 Pixels Resolution/30 Hz 62613-0101. Available online: https://www.tequipment.net/Viper/FLIR-A65-45/Fixed-Mount-Thermal-Imagers// (accessed on 24 August 2021).
  29. Teledyne FLIR BOSON 640 × 512 8.7 MM 50° HFOV—LWIR Thermal Camera Core. Available online: https://www.oemcameras.com/flir-boson-series.htm/flir-boson-640x480-8mm.htm (accessed on 24 August 2021).
  30. FLIR Lepton 3.5—500-0771-01. Available online: https://store.groupgets.com/products/flir-lepton-3-5 (accessed on 24 August 2021).
  31. Deane, S.; Avdelidis, N.P.; Ibarra-Castanedo, C.; Zhang, H.; Nezhad, H.Y.; Williamson, A.A.; Mackley, T.; Maldague, X.; Tsourdos, A.; Nooralishahi, P. Comparison of cooled and uncooled ir sensors by means of signal-to-noise ratio for ndt diagnostics of aerospace grade composites. Sensors 2020, 20, 3381. [Google Scholar] [CrossRef]
  32. Phantom Series. Available online: https://www.dji.com/au/products/phantom?site=brandsite&from=nav (accessed on 14 September 2021).
  33. Torun, E. UAV Requirements and Design Consideration; Technical Report; Turkish Land Forces Command Ankara: Ankara, Turkey, 2000. [Google Scholar]
  34. Kuiper, E.; Nadjm-Tehrani, S. Mobility models for UAV group reconnaissance applications. In Proceedings of the IEEE 2006 International Conference on Wireless and Mobile Communications (ICWMC’06), Bucharest, Romania, 29–31 July 2006; p. 33. [Google Scholar]
  35. Meola, C. Infrared Thermography Recent Advances and Future Trends; Bentham Science Publishers: Sharjah, United Arab Emirates, 2012. [Google Scholar]
  36. Meola, C.; Boccardi, S.; Carlomagno, G.M. Infrared Thermography in the Evaluation of Aerospace Composite Materials: Infrared Thermography to Composites; Woodhead Publishing: Sawston, UK, 2016. [Google Scholar]
  37. Silk, J. Cosmic black-body radiation and galaxy formation. Astrophys. J. 1968, 151, 459. [Google Scholar] [CrossRef]
  38. Würfel, P.; Finkbeiner, S.; Daub, E. Generalized Planck’s radiation law for luminescence via indirect transitions. Appl. Phys. A 1995, 60, 67–70. [Google Scholar] [CrossRef]
  39. Brainard, G.C.; Sliney, D.; Hanifin, J.P.; Glickman, G.; Byrne, B.; Greeson, J.M.; Jasser, S.; Gerner, E.; Rollag, M.D. Sensitivity of the human circadian system to short-wavelength (420-nm) light. J. Biol. Rhythm. 2008, 23, 379–386. [Google Scholar] [CrossRef] [Green Version]
  40. The Electromagnetic Spectrum. Available online: https://imagine.gsfc.nasa.gov/science/toolbox/emspectrum1.html (accessed on 30 May 2021).
  41. Kingston, R.H. Detection of Optical and Infrared Radiation; Springer: Berlin/Heidelberg, Germany, 2013; Volume 10. [Google Scholar]
  42. What Are Infrared Waves? Available online: https://science.nasa.gov/ems/07_infraredwaves (accessed on 30 May 2021).
  43. Deming, D.; Seager, S.; Richardson, L.J.; Harrington, J. Infrared radiation from an extrasolar planet. Nature 2005, 434, 740–743. [Google Scholar] [CrossRef] [Green Version]
  44. Becker, F.; Li, Z.L. Surface temperature and emissivity at various scales: Definition, measurement and related problems. Remote Sens. Rev. 1995, 12, 225–253. [Google Scholar] [CrossRef]
  45. Emissivity—Metals. Available online: https://www.flukeprocessinstruments.com/en-us/service-and-support/knowledge-center/infrared-technology/emissivity-metals/ (accessed on 30 May 2021).
  46. How Does Emissivity Affect Thermal Imaging? Available online: https://www.flir.com.au/discover/professional-tools/how-does-emissivity-affect-thermal-imaging/ (accessed on 30 May 2021).
  47. Khanam, F.T.Z.; Chahl, L.A.; Chahl, J.S.; Al-Naji, A.; Perera, A.G.; Wang, D.; Lee, Y.; Ogunwa, T.T.; Teague, S.; Nguyen, T.X.B.; et al. Noncontact Sensing of Contagion. J. Imaging 2021, 7, 28. [Google Scholar] [CrossRef]
  48. Khattak, S.; Papachristos, C.; Alexis, K. Marker based thermal-inertial localization for aerial robots in obscurant filled environments. In International Symposium on Visual Computing; Springer: Berlin/Heidelberg, Germany, 2018; pp. 565–575. [Google Scholar]
  49. Belliveau, R.G.; DeJong, S.A.; Boltin, N.D.; Lu, Z.; Cassidy, B.M.; Morgan, S.L.; Myrick, M. Mid-infrared emissivity of nylon, cotton, acrylic, and polyester fabrics as a function of moisture content. Text. Res. J. 2020, 90, 1431–1445. [Google Scholar] [CrossRef]
  50. Fetić, A.; Jurić, D.; Osmanković, D. The procedure of a camera calibration using Camera Calibration Toolbox for MATLAB. In Proceedings of the IEEE 2012 35th International Convention MIPRO, Opatija, Croatia, 21–25 May 2012; pp. 1752–1757. [Google Scholar]
  51. Wang, Y.; Li, Y.; Zheng, J. A camera calibration technique based on OpenCV. In Proceedings of the IEEE 3rd International Conference on Information Sciences and Interaction Sciences, Chengdu, China, 23–25 June 2010; pp. 403–406. [Google Scholar]
  52. Li, B.; Heng, L.; Koser, K.; Pollefeys, M. A multiple-camera system calibration toolbox using a feature descriptor-based calibration pattern. In Proceedings of the IEEE 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; pp. 1301–1307. [Google Scholar]
  53. Usamentiaga, R.; Garcia, D.; Ibarra-Castanedo, C.; Maldague, X. Highly accurate geometric calibration for infrared cameras using inexpensive calibration targets. Measurement 2017, 112, 105–116. [Google Scholar] [CrossRef]
  54. Knyaz, V.; Moshkantsev, P. Joint geometric calibration of color and thermal cameras for synchronized multimodal dataset creating. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, 79–84. [Google Scholar] [CrossRef] [Green Version]
  55. Engström, P.; Larsson, H.; Rydell, J. Geometric calibration of thermal cameras. In Electro-Optical Remote Sensing, Photonic Technologies, and Applications VII; and Military Applications in Hyperspectral Imaging and High Spatial Resolution Sensing; International Society for Optics and Photonics: Bellingham, WA, USA, 2013; Volume 8897, p. 88970C. [Google Scholar]
  56. Luhmann, T.; Piechel, J.; Roelfs, T. Geometric calibration of thermographic cameras. In Thermal Infrared Remote Sensing; Springer: Berlin/Heidelberg, Germany, 2013; pp. 27–42. [Google Scholar]
  57. Yang, R.; Yang, W.; Chen, Y.; Wu, X. Geometric calibration of IR camera using trinocular vision. J. Light. Technol. 2011, 29, 3797–3803. [Google Scholar] [CrossRef]
  58. Vidas, S.; Lakemond, R.; Denman, S.; Fookes, C.; Sridharan, S.; Wark, T. A mask-based approach for the geometric calibration of thermal-infrared cameras. IEEE Trans. Instrum. Meas. 2012, 61, 1625–1635. [Google Scholar] [CrossRef] [Green Version]
  59. FLIR Corp. FLIR Lepton Engineering Data Sheet; FLIR Corp.: Wilsonville, OR, USA, 2014. [Google Scholar]
  60. Bloesch, M.; Burri, M.; Omari, S.; Hutter, M.; Siegwart, R. Iterated extended Kalman filter based visual-inertial odometry using direct photometric feedback. Int. J. Robot. Res. 2017, 36, 1053–1072. [Google Scholar] [CrossRef] [Green Version]
  61. Mouats, T.; Aouf, N.; Chermak, L.; Richardson, M.A. Thermal stereo odometry for UAVs. IEEE Sens. J. 2015, 15, 6335–6347. [Google Scholar] [CrossRef] [Green Version]
  62. Papachristos, C.; Mascarich, F.; Alexis, K. Thermal-inertial localization for autonomous navigation of aerial robots through obscurants. In Proceedings of the IEEE 2018 International Conference on Unmanned Aircraft Systems (ICUAS), Dallas, TX, USA, 12–15 June 2018; pp. 394–399. [Google Scholar]
  63. Rosser, K.; Nguyen, T.X.B.; Moss, P.; Chahl, J. Low complexity visual UAV track navigation using long-wavelength infrared. J. Field Robot. 2021. [Google Scholar] [CrossRef]
  64. Bradski, G.; Kaehler, A. OpenCV. Dr. Dobb’s J. Softw. Tools 2000, 3, 20–245. [Google Scholar]
  65. Shi, J. Good features to track. In Proceedings of the 1994 IEEE Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 21–23 June 1994; pp. 593–600. [Google Scholar]
  66. Lucas, B.D.; Kanade, T. An Iterative Image Registration Technique With an Application to Stereo Vision. In Proceedings of the 7th International Joint Conference on Artificial Intelligence (IJCAI), Vancouver, BC, Canada, 24–28 August 1981. [Google Scholar]
  67. Borges, P.V.K.; Vidas, S. Practical infrared visual odometry. IEEE Trans. Intell. Transp. Syst. 2016, 17, 2205–2213. [Google Scholar] [CrossRef]
  68. Durrant-Whyte, H.; Dissanayake, M.; Gibbens, P. Toward deployment of large scale simultaneous localisation and map building (SLAM) systems. In Robotics Research; Springer: Berlin/Heidelberg, Germany, 2000; pp. 161–167. [Google Scholar]
  69. Mur-Artal, R.; Montiel, J.M.M.; Tardos, J.D. ORB-SLAM: A versatile and accurate monocular SLAM system. IEEE Trans. Robot. 2015, 31, 1147–1163. [Google Scholar] [CrossRef] [Green Version]
  70. Leonard, J.J.; Durrant-Whyte, H.F. Mobile robot localization by tracking geometric beacons. IEEE Trans. Robot. Autom. 1991, 7, 376–382. [Google Scholar] [CrossRef]
  71. Montemerlo, M.; Thrun, S.; Koller, D.; Wegbreit, B. FastSLAM: A factored solution to the simultaneous localization and mapping problem. In Proceedings of the AAAI/IAAI 2002, Edmonton, AB, Canada, 28 July–1 August 2002; pp. 593–598. [Google Scholar]
  72. Montemerlo, M.; Thrun, S.; Koller, D.; Wegbreit, B. FastSLAM 2.0: An improved particle filtering algorithm for simultaneous localization and mapping that provably converges. In Proceedings of the IJCAI, Acapulco, Mexico, 9–15 August 2003; Volume 3, pp. 1151–1156. [Google Scholar]
  73. Dellaert, F.; Kaess, M. Square Root SAM: Simultaneous localization and mapping via square root information smoothing. Int. J. Robot. Res. 2006, 25, 1181–1203. [Google Scholar] [CrossRef] [Green Version]
  74. Kim, C.; Sakthivel, R.; Chung, W.K. Unscented FastSLAM: A robust algorithm for the simultaneous localization and mapping problem. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Roma, Italy, 10–14 April 2007; pp. 2439–2445. [Google Scholar]
  75. Taketomi, T.; Uchiyama, H.; Ikeda, S. Visual SLAM algorithms: A survey from 2010 to 2016. IPSJ Trans. Comput. Vis. Appl. 2017, 9, 1–11. [Google Scholar] [CrossRef]
  76. Maddern, W.; Vidas, S. Towards robust night and day place recognition using visible and thermal imaging. In Proceedings of the RSS 2012 Workshop: Beyond Laser and Vision: Alternative Sensing Techniques for Robotic Perception; University of Sydney: Sydney, Australia, 2012; pp. 1–6. [Google Scholar]
  77. Poujol, J.; Aguilera, C.A.; Danos, E.; Vintimilla, B.X.; Toledo, R.; Sappa, A.D. A visible-thermal fusion based monocular visual odometry. In Robot 2015: Second Iberian Robotics Conference; Springer: Berlin/Heidelberg, Germany, 2016; pp. 517–528. [Google Scholar]
  78. Rasmussen, N.D.; Morse, B.S.; Goodrich, M.A.; Eggett, D. Fused visible and infrared video for use in wilderness search and rescue. In Proceedings of the IEEE 2009 Workshop on Applications of Computer Vision (WACV), Snowbird, UT, USA, 7–8 December 2009; pp. 1–8. [Google Scholar]
  79. Geiger, A.; Ziegler, J.; Stiller, C. Stereoscan: Dense 3d reconstruction in real-time. In Proceedings of the 2011 IEEE Intelligent Vehicles Symposium (IV), Baden-Baden, Germany, 5–9 June 2011; pp. 963–968. [Google Scholar]
  80. Brunner, C.; Peynot, T. Visual metrics for the evaluation of sensor data quality in outdoor perception. In Proceedings of the 10th Performance Metrics for Intelligent Systems Workshop, Baltimore, MD, USA, 28–30 September 2010; pp. 1–8. [Google Scholar]
  81. Brunner, C.; Peynot, T.; Vidal-Calleja, T.; Underwood, J. Selective combination of visual and thermal imaging for resilient localization in adverse conditions: Day and night, smoke and fire. J. Field Robot. 2013, 30, 641–666. [Google Scholar] [CrossRef] [Green Version]
  82. Chen, L.; Sun, L.; Yang, T.; Fan, L.; Huang, K.; Xuanyuan, Z. RGB-T SLAM: A flexible SLAM framework by combining appearance and thermal information. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 5682–5687. [Google Scholar]
  83. Mouats, T.; Aouf, N.; Sappa, A.D.; Aguilera, C.; Toledo, R. Multispectral stereo odometry. IEEE Trans. Intell. Transp. Syst. 2014, 16, 1210–1224. [Google Scholar] [CrossRef]
  84. Khattak, S.; Papachristos, C.; Alexis, K. Visual-thermal landmarks and inertial fusion for navigation in degraded visual environments. In Proceedings of the 2019 IEEE Aerospace Conference, Big Sky, MT, USA, 2–9 March 2019; pp. 1–9. [Google Scholar]
  85. Shin, Y.S.; Kim, A. Sparse depth enhanced direct thermal-infrared slam beyond the visible spectrum. IEEE Robot. Autom. Lett. 2019, 4, 2918–2925. [Google Scholar] [CrossRef] [Green Version]
  86. Khattak, S.; Papachristos, C.; Alexis, K. Keyframe-based thermal–inertial odometry. J. Field Robot. 2020, 37, 552–579. [Google Scholar] [CrossRef]
  87. Horn, B.K.; Schunck, B.G. Determining optical flow. Artif. Intell. 1981, 17, 185–203. [Google Scholar] [CrossRef] [Green Version]
  88. Chahl, J.; Mizutani, A. Biomimetic attitude and orientation sensors. IEEE Sens. J. 2010, 12, 289–297. [Google Scholar] [CrossRef]
  89. Chahl, J.S.; Srinivasan, M.V.; Zhang, S.W. Landing strategies in honeybees and applications to uninhabited airborne vehicles. Int. J. Robot. Res. 2004, 23, 101–110. [Google Scholar] [CrossRef]
  90. Srinivasan, M.V.; Zhang, S.; Altwein, M.; Tautz, J. Honeybee Navigation: Nature and Calibration of the “Odometer”. Science 2000, 287, 851–853. [Google Scholar] [CrossRef]
  91. Srinivasan, M.V. Honey bees as a model for vision, perception, and cognition. Annu. Rev. Entomol. 2010, 55, 267–284. [Google Scholar] [CrossRef]
  92. Rosser, K.; Chahl, J. Reducing the complexity of visual navigation: Optical track controller for long-range unmanned aerial vehicles. J. Field Robot. 2019, 36, 1118–1140. [Google Scholar] [CrossRef]
  93. Honegger, D.; Meier, L.; Tanskanen, P.; Pollefeys, M. An open source and open hardware embedded metric optical flow cmos camera for indoor and outdoor applications. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; pp. 1736–1741. [Google Scholar]
  94. Yoo, D.W.; Won, D.Y.; Tahk, M.J. Optical flow based collision avoidance of multi-rotor uavs in urban environments. Int. J. Aeronaut. Space Sci. 2011, 12, 252–259. [Google Scholar] [CrossRef] [Green Version]
  95. Grabe, V.; Bülthoff, H.H.; Giordano, P.R. On-board velocity estimation and closed-loop control of a quadrotor UAV based on optical flow. In Proceedings of the 2012 IEEE International Conference on Robotics and Automation, St Paul, MN, USA, 14–18 May 2012; pp. 491–497. [Google Scholar]
  96. Miller, A.; Miller, B.; Popov, A.; Stepanyan, K. UAV landing based on the optical flow videonavigation. Sensors 2019, 19, 1351. [Google Scholar] [CrossRef] [Green Version]
  97. Gageik, N.; Strohmeier, M.; Montenegro, S. An autonomous UAV with an optical flow sensor for positioning and navigation. Int. J. Adv. Robot. Syst. 2013, 10, 341. [Google Scholar] [CrossRef]
  98. Bradski, G.; Kaehler, A. Learning OpenCV: Computer Vision with the OpenCV Library; O’Reilly Media, Inc.: Newton, MA, USA, 2008. [Google Scholar]
  99. Smolyanskiy, N.; Kamenev, A.; Smith, J.; Birchfield, S. Toward low-flying autonomous MAV trail navigation using deep neural networks for environmental awareness. In Proceedings of the 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017; pp. 4241–4247. [Google Scholar]
  100. McGuire, K.; De Croon, G.; De Wagter, C.; Tuyls, K.; Kappen, H. Efficient optical flow and stereo vision for velocity estimation and obstacle avoidance on an autonomous pocket drone. IEEE Robot. Autom. Lett. 2017, 2, 1070–1076. [Google Scholar] [CrossRef] [Green Version]
  101. Heng, L.; Honegger, D.; Lee, G.H.; Meier, L.; Tanskanen, P.; Fraundorfer, F.; Pollefeys, M. Autonomous visual mapping and exploration with a micro aerial vehicle. J. Field Robot. 2014, 31, 654–675. [Google Scholar] [CrossRef]
  102. Zhang, X.; Xian, B.; Zhao, B.; Zhang, Y. Autonomous flight control of a nano quadrotor helicopter in a GPS-denied environment using on-board vision. IEEE Trans. Ind. Electron. 2015, 62, 6392–6403. [Google Scholar] [CrossRef]
  103. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. Imagenet classification with deep convolutional neural networks. Adv. Neural Inf. Process. Syst. 2012, 25, 1097–1105. [Google Scholar] [CrossRef]
  104. Choi, Y.; Kim, N.; Hwang, S.; Kweon, I.S. Thermal image enhancement using convolutional neural network. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016; pp. 223–230. [Google Scholar]
  105. Dong, C.; Loy, C.C.; He, K.; Tang, X. Image super-resolution using deep convolutional networks. IEEE Trans. Pattern Anal. Mach. Intell. 2015, 38, 295–307. [Google Scholar] [CrossRef] [Green Version]
  106. Yang, J.; Wright, J.; Huang, T.S.; Ma, Y. Image super-resolution via sparse representation. IEEE Trans. Image Process. 2010, 19, 2861–2873. [Google Scholar] [CrossRef]
  107. Choi, Y.; Kim, N.; Hwang, S.; Park, K.; Yoon, J.S.; An, K.; Kweon, I.S. KAIST multi-spectral day/night data set for autonomous and assisted driving. IEEE Trans. Intell. Transp. Syst. 2018, 19, 934–948. [Google Scholar] [CrossRef]
  108. Kristoffersen, M.S.; Dueholm, J.V.; Gade, R.; Moeslund, T.B. Pedestrian counting with occlusion handling using stereo thermal cameras. Sensors 2016, 16, 62. [Google Scholar] [CrossRef] [Green Version]
  109. Barrera, F.; Lumbreras, F.; Sappa, A.D. Multispectral piecewise planar stereo using Manhattan-world assumption. Pattern Recognit. Lett. 2013, 34, 52–61. [Google Scholar] [CrossRef]
  110. Saputra, M.R.U.; de Gusmao, P.P.; Lu, C.X.; Almalioglu, Y.; Rosa, S.; Chen, C.; Wahlström, J.; Wang, W.; Markham, A.; Trigoni, N. Deeptio: A deep thermal-inertial odometry with visual hallucination. IEEE Robot. Autom. Lett. 2020, 5, 1672–1679. [Google Scholar] [CrossRef] [Green Version]
  111. Clark, R.; Wang, S.; Wen, H.; Markham, A.; Trigoni, N. Vinet: Visual-inertial odometry as a sequence-to-sequence learning problem. In Proceedings of the AAAI Conference on Artificial Intelligence, San Francisco, CA, USA, 4–9 February 2017; Volume 31. [Google Scholar]
  112. Chen, C.; Rosa, S.; Miao, Y.; Lu, C.X.; Wu, W.; Markham, A.; Trigoni, N. Selective sensor fusion for neural visual-inertial odometry. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 16–20 June 2019; pp. 10542–10551. [Google Scholar]
  113. Longwave Infrared Thermal Camera Core Tau™ 2. Available online: https://www.flir.com.au/products/tau-2/ (accessed on 24 June 2021).
  114. FLIR Exportation Conditions. Available online: https://www.flircameras.com/export_conditions (accessed on 24 June 2021).
  115. Flying at Night. Available online: https://www.casa.gov.au/drones/reoc/flying-at-night/ (accessed on 24 June 2021).
  116. Pi Camera Module v2. Available online: https://www.raspberrypi.org/documentation/hardware/camera/ (accessed on 24 June 2021).
  117. Treible, W.; Saponaro, P.; Sorensen, S.; Kolagunda, A.; O’Neal, M.; Phelan, B.; Sherbondy, K.; Kambhamettu, C. Cats: A color and thermal stereo benchmark. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 2961–2969. [Google Scholar]
Figure 1. Comparison of wavelength, frequency and energy in the electromagnetic spectrum [40] (credit: NASA).
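For reference, the quantities compared in Figure 1 are linked by the standard wavelength–frequency–energy relations below; this is textbook physics rather than material from [40], and the LWIR numbers are included only as an illustrative check.

```latex
% Relations between wavelength, frequency and photon energy
% (c \approx 3 \times 10^{8}\,\mathrm{m\,s^{-1}},\; h \approx 6.626 \times 10^{-34}\,\mathrm{J\,s}).
\begin{align}
  \lambda &= \frac{c}{\nu}, & E &= h\nu = \frac{hc}{\lambda}.
\end{align}
% Example: LWIR at \lambda = 10\,\mu\mathrm{m} gives \nu \approx 3 \times 10^{13}\,\mathrm{Hz}
% and E \approx 0.124\,\mathrm{eV}.
```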
Figure 2. Sample of infrared radiation of different materials [46].
Figure 3. Change of contrast caused by AGC when a hot object moves into a scene.
Figure 4. A technique to re-scale 14-bit images to 8-bit images based on max and min pixel intensities [63].
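The re-scaling in Figure 4 amounts to a per-frame min–max linear stretch. A minimal sketch is given below, assuming NumPy and a 14-bit frame stored in a 16-bit array; it follows the general idea of [63] but is not the authors' exact implementation.

```python
import numpy as np

def rescale_14bit_to_8bit(frame_14bit: np.ndarray) -> np.ndarray:
    """Linearly stretch a 14-bit thermal frame to 8 bits using its min/max intensities."""
    lo = float(frame_14bit.min())
    hi = float(frame_14bit.max())
    span = max(hi - lo, 1.0)  # guard against division by zero on a flat frame
    scaled = (frame_14bit.astype(np.float32) - lo) * (255.0 / span)
    return scaled.astype(np.uint8)
```

Because the stretch is recomputed every frame, a very hot or very cold object entering the scene changes the mapping applied to all other pixels, which is the contrast shift illustrated in Figure 3.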
Figure 5. The PX4FLOW optical flow sensor with mounted lens on the left and ultrasonic sensor on the right [93].
Figure 6. Optical flow (OF) and thermal flow (TF) comparison during (a) day and (b) night flights. Each of (a,b) includes four plots; from top to bottom: OF and TF X-axis angular displacement (radians/second); OF and TF Y-axis angular displacement (radians/second); OF and TF Manhattan distance angular displacement; and normalized cross-correlation of the OF and TF Manhattan distance displacement signals (dimensionless). Results for the day flight show high correlation between optical and thermal values, while the night flight shows low correlation as visible spectrum OF is degraded in low light [63].
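A minimal sketch of the comparison in Figure 6 is given below: the per-axis angular rates are combined into a Manhattan-distance displacement, and the OF and TF signals are compared with a sliding-window, zero-lag normalized cross-correlation. The window length and function names are illustrative assumptions, not values from [63], and the two signals are assumed to be resampled onto a common time base.

```python
import numpy as np

def manhattan_displacement(x_rate: np.ndarray, y_rate: np.ndarray) -> np.ndarray:
    """Per-frame Manhattan distance of the two angular displacement components."""
    return np.abs(x_rate) + np.abs(y_rate)

def windowed_ncc(a: np.ndarray, b: np.ndarray, win: int = 50) -> np.ndarray:
    """Sliding-window, zero-lag normalized cross-correlation of two 1-D signals."""
    out = np.zeros(len(a) - win + 1)
    for i in range(len(out)):
        wa = a[i:i + win] - a[i:i + win].mean()
        wb = b[i:i + win] - b[i:i + win].mean()
        denom = np.linalg.norm(wa) * np.linalg.norm(wb)
        out[i] = wa @ wb / denom if denom > 0 else 0.0
    return out
```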
Table 1. Specifications of thermal sensors presented in this study.
Sensor | Dimension | Weight | Resolution | FPS | Radiometric | Power | Platform | Cost | Released
Thermal-Eye 2000B | 282 × 279 × 290 mm | 4.54 kg | 320 × 240 | 12.5 | No | 28 W | UGV | discontinued | n/a
Gobi-640-GigE | 49 × 49 × 79 mm | 263 g | 640 × 480 | 50 | No | 4.5 W | UGV | discontinued | 2008
Miricle 307 K | 45 × 52 × 48 mm | 95 g | 640 × 480 | 15 | No | 3.3 W | UAV | discontinued | 2006
FLIR Tau2 | 44.5 × 44.5 × 30 mm | <70 g | 640 × 480 | 60 | Yes | 1 W | UAV | $6500 [27] | 2015
FLIR A65 | 120 × 125 × 280 mm | 200 g | 640 × 512 | 30 | Yes | 3.5 W | UAV | $7895 [28] | 2016
FLIR Boson | 21 × 21 × 11 mm | 7.5 g | 640 × 512 | 60 | Yes | 0.5 W | UAV | $3520 [29] | 2020
FLIR Lepton 3.5 | 10.5 × 12.7 × 7.14 mm | 0.9 g | 160 × 120 | 8.7 | Yes | 0.15 W | UAV | $199 [30] | 2018
Table 2. Emissivity values for some materials [45].
Material | Emissivity Value
Metal: Aluminium (oxidised) | 0.4
Metal: Aluminium (polished) | 0.05
Metal: Brass (oxidised) | 0.6
Metal: Brass (polished) | 0.02
Metal: Copper (oxidised) | 0.71
Metal: Copper (polished) | 0.03
Non-metal: Clay | 0.95
Non-metal: Ice | 0.98
Non-metal: Rubber | 0.95
Non-metal: Water | 0.93
Non-metal: Glass | 0.98
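To indicate how strongly these values affect the signal reaching a thermal sensor, the worked example below applies the Stefan–Boltzmann law for a grey body, M = εσT⁴; this is standard radiometry rather than material from [45].

```latex
% Radiant exitance of a grey body: M = \epsilon \sigma T^{4},
% with \sigma = 5.67 \times 10^{-8}\,\mathrm{W\,m^{-2}\,K^{-4}}.
% At T = 300\,\mathrm{K}, \sigma T^{4} \approx 459\,\mathrm{W\,m^{-2}}, so
\begin{align}
  M_{\text{water}}\;(\epsilon = 0.93) &\approx 0.93 \times 459 \approx 427\ \mathrm{W\,m^{-2}},\\
  M_{\text{polished aluminium}}\;(\epsilon = 0.05) &\approx 0.05 \times 459 \approx 23\ \mathrm{W\,m^{-2}}.
\end{align}
```

The same scene temperature can therefore produce an order-of-magnitude difference in emitted radiance, which is why polished metal surfaces often read far from their true temperature in thermal imagery.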
Table 3. Parameter settings for the LK technique and the Shi–Tomasi corner detection algorithm [63].
Feature detection settings:
  Maximum corners: 1000
  Quality level: 0.02
  Minimum distance: 5
  Block size: 5
LK settings:
  Window size: (15, 15)
  Maximum pyramid level: 2
  Search termination count: 10
  Search termination ϵ: 0.03
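The settings in Table 3 map directly onto OpenCV's Shi–Tomasi detector and pyramidal Lucas–Kanade tracker. The sketch below follows the standard OpenCV usage pattern with the Table 3 values; the function name track_features is illustrative and not taken from [63].

```python
import cv2
import numpy as np

# Shi–Tomasi corner detection settings (Table 3)
feature_params = dict(maxCorners=1000, qualityLevel=0.02, minDistance=5, blockSize=5)

# Pyramidal Lucas–Kanade settings (Table 3); the termination criteria pair
# corresponds to the search termination count and epsilon rows.
lk_params = dict(
    winSize=(15, 15),
    maxLevel=2,
    criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 0.03),
)

def track_features(prev_gray: np.ndarray, next_gray: np.ndarray):
    """Detect corners in the previous frame and track them into the next frame."""
    p0 = cv2.goodFeaturesToTrack(prev_gray, mask=None, **feature_params)
    if p0 is None:
        return None, None
    p1, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, p0, None, **lk_params)
    good = status.flatten() == 1
    return p0[good], p1[good]
```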
Table 4. Summary of presented works.
Work | Sensor Configuration | 8-Bit/14-Bit | Sensor Name | Resolution | FPS | Navigation System | Navigation Task
Maddern and Vidas [76] | Combined | 8 | Thermoteknix Miricle 307K | 640 × 480 | 15 | Map-building | Mapping
Brunner et al. [81] | Combined | 8 | Raytheon Thermal-Eye 2000B | 480 × 576 | 12.5 | Map-building | Visual-SLAM
Shin et al. [85] | Thermal only | 14 | FLIR A65 | 640 × 512 | 30 | Map-building | Visual-SLAM
Chen et al. [82] | Combined | n/a | n/a | n/a | n/a | Map-building | Visual-SLAM
Mouats et al. [83] | Combined | 8 | Gobi-640-GigE from Xenics | 640 × 480 | 50 | Map-building | Stereo odometry
Mouats et al. [61] | Thermal only | 8 | FLIR Tau2 | 640 × 480 | 30 | Map-building | Stereo odometry
Poujol et al. [77] | Combined | 8 | Gobi-640-GigE from Xenics | 640 × 480 | 50 | Map-building | Odometry
Khattak et al. [84] | Combined | 8 | FLIR Tau2 | 640 × 480 | 30 | Map-building | Odometry
Khattak et al. [48] | Thermal only | 8 | FLIR Tau2 | 640 × 480 | 30 | Map-building | Odometry
Khattak et al. [86] | Thermal only | 14 | FLIR Tau2 | 640 × 480 | 30 | Map-building | Odometry
Rosser et al. [63] | Thermal only | 8 | FLIR Lepton 3.5 | 160 × 120 | 8.7 | Mapless | Odometry
Choi et al. [104] | Thermal only | 8 | n/a | n/a | n/a | Deep learning | Thermal image enhancement
Saputra et al. [110] | Thermal only | 8 | FLIR Boson/FLIR E95 | 640 × 512/464 × 348 | 60/60 | Deep learning | Odometry
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
