Article

Integration of Thermal and RGB Data Obtained by Means of a Drone for Interdisciplinary Inventory

1 WIDEBRAND Patryk Paziewski, Tarnowska 36/8, 43-300 Bielsko-Biala, Poland
2 Department of Photogrammetry, Remote Sensing, and Spatial Engineering/Faculty of Geo-Data Science, Geodesy and Environmental Engineering, AGH University of Science and Technology, al. A. Mickiewicza 30, 30-059 Krakow, Poland
* Author to whom correspondence should be addressed.
Submission received: 23 May 2022 / Revised: 20 June 2022 / Accepted: 6 July 2022 / Published: 7 July 2022
(This article belongs to the Special Issue Advances in Energy-Efficient Buildings)

Abstract

Thermal infrared imagery is gaining importance in the diagnosis of energy losses in cultural heritage, as it is a non-destructive measurement method. Because it is an innovative and, above all, safe solution, it makes it possible to assess the condition of a building, locate places exposed to heat escape, and plan actions to improve the condition of the facility. The presented work is devoted to the technology of creating a dense point cloud and a 3D model based on data obtained from a UAV. It is shown that a 3D point model can be built from thermograms with a specified accuracy by using thermal measurement marks and the dense matching method. The results achieved in this way were compared and, as the outcome of this work, the model obtained from color photos was integrated with the point cloud created on the basis of the thermal images. The discussed approach uses measurement data obtained with three independent devices: a Matrice 300 RTK drone (courtesy of NaviGate), a Phantom 4 PRO drone, and a KT-165 thermal imaging camera. A stone church located in southern Poland was chosen as the measured object.


1. Introduction

The research purpose was to investigate the integration of thermal and visual data for a heritage object in order to determine areas of energy losses. The discussed approach uses measurement data obtained with three independent data acquisition platforms: a Matrice 300 RTK drone, a Phantom 4 PRO drone, and a KT-165 thermal imaging camera. The parish church of the Birth of the Blessed Virgin Mary in the town of Porąbka was selected as the subject of the study. The facility was built of stone, has two towers, and its history dates back 115 years. The first and most interesting sensor used for the measurement was a Zenmuse H20T thermal imaging camera mounted on a DJI Matrice 300 RTK unmanned aerial vehicle (UAV), used courtesy of NaviGate. The RGB photos were captured by two cameras on the Matrice 300 RTK and by a standard 20 Mpix camera on the second UAV used in the project, the Phantom 4 PRO. Additionally, a KT-165 hand-held thermal imaging camera was used, on the basis of which a thermal analysis of the entire object was performed. As the final products, four independent point clouds and four independent models were obtained and further analyzed. The first point cloud was from the RGB camera on the Phantom 4 PRO, the second from the RGB “wide” camera on the Matrice 300 RTK, the third from the RGB “wide” and “zoom” cameras on the Matrice 300 RTK, and the fourth from the thermal images taken by the Zenmuse H20T camera on the Matrice 300 RTK. The obtained products were assessed not only in terms of quality but also in terms of usability and development difficulty. Thanks to the thermal analysis, it was possible to determine the places of heat escape. The thermal point cloud deserved the most attention, followed by the model obtained on the basis of the thermal images from the Matrice 300 RTK. The final stage of the work was data integration, i.e., coloring the 3D model obtained from the Phantom 4 PRO drone with the thermal point cloud from the Matrice 300 RTK.

1.1. Thermography

Thermography is a broad remote-sensing data acquisition and processing technology, often applying photogrammetric methods of measurement and result presentation. Thermal sensors are used at different scales and from different distances: at close range [1,2], including sensors mounted on robots and integrated with other sensors [3], and on UAVs [4,5], aircraft, or satellites [6]. Thermal information can be captured by different means: by passive methods such as typical IRT (infrared thermography) [7,8,9], and by active methods such as pulse IRT [10,11] and lock-in thermography [12,13]. The thermal acquisition methods can also be divided by sensor configuration: single thermal sensors, multispectral cameras [5,14,15], or multisensor systems [4,16]. Thermography is used for a variety of objects and purposes. The most common are building thermography [17,18,19], thermal diagnostics of buildings [13,18], humanitarian mine action (HMA) [20,21,22], photovoltaic inspection [23], heat and air-conditioning loss detection [24], and, on a larger scale, thermal monitoring of urban areas [6]. The largest application area is the sensing, controlling, and monitoring of heritage objects for preservation purposes, especially heritage buildings [9], fresco detachment sizing [1,8], moisture detection [7,19], archaeological site sensing [25], and the analysis of other cultural heritage artefacts [10]. Thermography is also applied to crime scene analyses [16], wildlife [26], vineyard condition sensing [27], tree species classification [15], and even flying UAV detection [28], not to mention military applications.

1.2. UAV Thermography Application

The development of UAV technology for different remote sensing and mapping applications also extends to thermography. Nowadays, UAVs are able to carry different sensors used for thermal acquisition: near-infrared cameras [29], thermal cameras [25,30,31,32,33,34,35,36], multispectral cameras [14,27,37,38], and multimodal platforms with RGB, thermal, and depth cameras [39]. In addition, thanks to UAV flight flexibility, thermal sensing can be carried out on complex objects with limited access; thermal data can be acquired from sides that are not sufficiently visible from the ground or from aircraft operating heights. New applications of UAV thermal sensing are presented in several areas: archaeological sites [25,39], heritage buildings [30,33], and modern buildings [31,34], including photovoltaic inspections [23]. UAV thermography is also applied to thermographic DTM generation [25,32] and crop field sensing [37]. In particular, it is used for precision farming [14], soil moisture monitoring [5], leaf condition monitoring [29], and other earth observations: permafrost sensing [35], volcano activity monitoring [36], and even robust person detection [40].

1.3. Thermographic and Other Mapping Data Integration

Integration of thermography with other mapping technologies increases its possibilities. There are many positive research results in the scientific literature regarding the fusion of methods and of data in which one of the fused sources is a thermogram. To systematize the integration solutions, we propose two criteria for dividing the methods.
The first criterion is the source of the data being integrated with the thermal data. There are three basic groups of integration methods:
(1)
Thermographic and LiDAR data integration is the first important group of methods, presented in [41,42,43,44] and, for mobile laser scanning (MLS), in [45]. Usually, the thermographic point cloud is obtained by colorizing the original LiDAR data with thermogram RGB values [46]. Another method is the integration of two point clouds: a LiDAR point cloud and a thermographic point cloud resulting from dense matching [47].
(2)
Thermographic and photogrammetric data integration is the second group, in particular using RGB sensors [40,47,48,49,50,51,52], as well as smartphone sensors [53], depth cameras [54,55], RGB visual odometry data [56], multispectral [2,4,38], and hyperspectral data [16]. The presented research is another example of this group.
(3)
The third group, according to this criterion, is thermographic data integration within more complex platforms combining different sensors [57,58,59,60,61,62], some of them mounted on robots [3].
The second criterion for dividing the approaches to thermal and other mapping data integration is the method of co-registration of the two datasets in a common coordinate system (i.e., their external orientation). Most of the published approaches can be divided into four groups of solutions: calibrated fixed sensors, co-registration by control points and features, point cloud superimposing, and other methods.
(1)
Calibrated fixed sensors, such as thermal and RGB cameras, have known spatial relations between both sets of images thanks to sensor calibration. The RGB images have a higher resolution, and they are used for 3D model generation [48]. The accuracy of thermal texture projection onto the terrestrial laser scanning (TLS) point cloud is then higher [57]. There are successful examples of multi-sensor platforms in which RGB, infrared (IR), and TLS sensors were calibrated together and a thermally colorized TLS point cloud was obtained [58]. There is even an example of a special robot, designed and developed with a laser scanner and thermal and RGB cameras, for automated inspection of buildings [63].
(2)
Another solution is the co-registration of visual or LiDAR data with thermal data by specially selected or automatically found natural or marked points and other features. Points marked with coins can be used for thermal and RGB data fusion, as well as for multispectral and RGB data [2]. Another method of marking thermal control points is to use circles of aluminum foil [64]. Between thermal imagery and LiDAR point clouds, natural points can be used, selected a priori and traditionally measured with a tachymeter [59]. Control points are also used for two point clouds resulting from dense matching of RGB and IR images [65,66]. Apart from points, linear features are used for co-registration. Feature extraction from visual and thermal images can be done with the Features from Accelerated Segment Test (FAST) operator, and the two datasets can be matched by correlation [46].
(3)
The third solution is RGB and thermal point cloud superimposing. The fast global registration (FGR) algorithm can be used for coarse registration of two point clouds before the refinement of the intrinsic and external orientation parameters of the thermal images, i.e., before precise thermal texture generation [47]. The iterative closest point (ICP) algorithm is also used for visual and IR point cloud co-registration [67] (see the sketch after this list). Another solution for IR data registration is based on the superposition of a Structure from Motion (SfM) point cloud with TLS data.
(4)
The last group comprises more unique approaches. For instance, a sequence of IR images is matched and GPS/INS data are refined for more precise thermal texturing of a building model [68]. A different example of registering thermal data with 3D data is RANSAC-based Efficient Perspective-n-Point (EPnP) registration, which can be applied to the fusion of range camera 3D data and 2D thermal data [69].
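As an illustration of the superimposing approach in point (3), the following minimal sketch registers a thermal point cloud to an RGB point cloud with point-to-point ICP, assuming the Open3D library; the file names, voxel size, and distance threshold are hypothetical, and in practice the initial transform would come from control points or a coarse method such as FGR rather than the identity used here.

```python
import numpy as np
import open3d as o3d

# Load the two clouds to be co-registered (hypothetical file names)
thermal_cloud = o3d.io.read_point_cloud("thermal_cloud.ply")
rgb_cloud = o3d.io.read_point_cloud("rgb_cloud.ply")

# Downsample for speed; 5 cm is an assumed voxel size for a building-scale cloud
source = thermal_cloud.voxel_down_sample(0.05)
target = rgb_cloud.voxel_down_sample(0.05)

# Point-to-point ICP refinement from an initial guess
result = o3d.pipelines.registration.registration_icp(
    source, target,
    max_correspondence_distance=0.10,  # 10 cm search radius, assumed
    init=np.eye(4),
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())

# Apply the estimated rigid transform to the full-resolution thermal cloud
thermal_cloud.transform(result.transformation)
o3d.io.write_point_cloud("thermal_cloud_registered.ply", thermal_cloud)
```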

1.4. 2D and 3D Application of Thermal Data

Thermal data are applied as a thermal image, as a thermal texture of a 3D model, or as a thermal point cloud from Structure from Motion (SfM) or LiDAR. The data are presented and used either as planar features or as spatial virtual models.
Planar features are thermograms rectified by projective transformation, usually for further application: as images for the automatic detection of thermal bridges in buildings [70] or for the detection of pathologies in monuments [9]. Another way to obtain planar thermal images is to generate thermal orthoimages from thermally textured 3D models of the building [41,71].
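As a minimal sketch of such rectification, assuming OpenCV and four manually picked correspondences between façade corners in the thermogram and their positions on a metric façade plane (all file names and coordinates below are illustrative):

```python
import cv2
import numpy as np

thermogram = cv2.imread("facade_ir.png")  # hypothetical input image

# Four image positions of the facade corners (illustrative pixel values)
src = np.float32([[102, 80], [590, 95], [575, 410], [95, 400]])
# Their target positions on the rectified facade plane, in pixels
dst = np.float32([[0, 0], [500, 0], [500, 330], [0, 330]])

H = cv2.getPerspectiveTransform(src, dst)            # 3x3 homography
rectified = cv2.warpPerspective(thermogram, H, (500, 330))
cv2.imwrite("facade_ir_rectified.png", rectified)
```

The same projective model underlies thermal orthoimage generation, except that there the projection is derived from a textured 3D model rather than from four picked points.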
Spatial virtual models can be either objects of analysis or products, i.e., virtual models presenting the results of thermal imaging.
(1)
The objects of analysis are stereo models acquired by stereo thermal cameras, for instance, for pedestrian counting [72] or for integration with robot displacement data to refine the robot (or sensor platform) position and calculate a precise 3D point cloud [73].
(2)
A thermal virtual model can be the result of different ways of processing thermal data. RGB data can be matched, and a 3D triangulated irregular network (TIN) model can be generated and thermally textured with the registered images of an RGB and IRT camera [74]. Another approach is based on the matching and adjustment of thermal and visual UAV images; the resulting point cloud can be a source of thermal orthophotos [25,71], and artificial targets are used to register a dense surface model of matched thermograms. A 3D model can also be generated from high-resolution laser scanning and low-resolution thermal imaging, which can be successfully integrated by convolutional neural networks [75]. Other 3D thermographic objects are the results of thermal texturing of existing 3D models [76] or of 3D mapping of multispectral images onto a TLS point cloud [57].
Dense matching of thermal imagery is becoming more common. Thermal point clouds at the photogrammetric level have been applied to city thermal analysis and geometrically compared with LiDAR data [77]. On a larger scale, mobile mapping infrared data were used for building mapping and dense point cloud generation [78]. UAV thermal and RGB fusion leading to thermal map modeling is an important research subject [79,80]. Methods of dense matching of thermal imagery are applied to deliver information to BIM about the thermal conditions of building interiors [60]. Stereo matching of thermal images is used even for small objects and their dynamic thermal measurement [81]. In this context, our experiments and approach are very timely. We present thermal and RGB data fusion by thermal 3D mapping of a monumental object. The use of combined RGB and thermal data for complex heritage objects, together with a novel method of marking control points with active thermal targets, is our proposal to develop these methods and fill a gap in the existing knowledge.

2. Materials and Methods

2.1. Measuring Equipment

One of the newest and best flying platforms from the DJI Enterprise (Shenzhen, China) series, the Matrice 300 RTK (Figure 1a), was used to perform the measurement. The device was loaned for the duration of the work, courtesy of NaviGate (Krakow, Poland). The drone is equipped with a Zenmuse H20T thermal imaging camera, a so-called hybrid, which combines a thermal imaging camera, two daylight cameras, and a laser rangefinder in one housing. The thermal imaging camera has two important functionalities: it allows a specific point to be targeted and its temperature determined in real time, and it can also determine the temperature difference across the tested object. The elements of the environment are presented to the user with an appropriate color scale, depending on their temperature [82]. The temperature range of the camera is from −40 to 150 degrees Celsius (High Gain) and from −40 to 550 degrees Celsius (Low Gain). The H20T camera has a 23× optical zoom and a maximum optical + digital zoom of 200×. The device allows photos and recordings from the three cameras to be saved simultaneously to three separate files [82]. An additional device used for the measurement was the KT-165 (Swidnica, Poland) hand-held thermal imaging camera (Figure 1c), with a spectral range of 8–14 μm. The accuracy of the reading is 2 degrees Celsius or 2% of the reading for an ambient temperature between 15 and 35 degrees Celsius and an object temperature above 0 degrees Celsius [83]. In order to compare and generate another RGB model, the DJI Phantom 4 PRO unmanned aerial vehicle (Figure 1b) was used, one of the most frequently selected devices and optimal in terms of value for money. This unmanned aerial vehicle has a built-in camera with a one-inch CMOS sensor with a resolution of 20 Mpix. It was the first camera in the series to use a mechanical shutter system [84].

2.2. Measurement Marks

Disposable, chemical, non-toxic, and non-flammable hand warmers were used as markers for the thermal images. The vendor specifies about 8 to 10 h of heat release at an average temperature of 57 degrees Celsius. The maximum temperature of the heaters is circa 69 degrees Celsius under appropriate environmental conditions. Before the measurement, a check was performed to confirm how the product would behave when left in an open space and in close contact with a material that absorbs moisture and conducts heat away (in this case, stone). The results turned out to be satisfactory: despite being exposed to cold, the hand warmer was still clearly visible on the cold surface. After this stage, the preparation of the photogrammetric control network, which also served as a thermovision control network, began. The bags with the heating substance were shaken to start the heating process through interaction with the air. In order to keep the heat longer, each hand warmer was attached to a small wooden board with a staple gun and thus insulated from the stone. A measuring disc in the form of a checkerboard was glued to each element prepared in this way (Figure 2), enabling precise targeting of the total station at its center. The prepared measuring points were fixed with tape to the church buttresses (Figure 3). In the end, 16 points were placed on the buttresses, 5 on the ground, and 2 on the church tower, giving 23 points of the photogrammetric control network over the entire surface of the church, acting as thermal targets. As an additional aid, 5 more heaters were scattered on the ground around the church. Their coordinates were determined, but they were used only as check points.
The next stage was the layout of the tacheometric stations around the church. The task required high precision, because it was necessary both to maintain lines of sight between the stations and to measure each point of the photogrammetric (and thermovision) control network from at least two stations. Due to the use of Real Time Network (RTN) technology, obscuration of the horizon could not be ignored. Six tacheometric stations were established, from which the measurement was made (Figure 4). In addition, characteristic points of the church, such as wall edges, cornices, and the tower, were measured as pickets. This was of great importance for georeferencing the created model, especially along the Z axis.
The whole control network was adjusted; the errors were about ±0.003 m for the XY coordinates and ±0.005 m for the Z coordinate.

2.3. Data Acquisition

The most important prerequisite, without which the measurement would not be possible, was to make sure that the church was heated during the winter season. Obtaining correct measurement results is possible only with a significant temperature difference between the object and its surroundings. The data acquisition process started from inside the church. The thermometer showed 13 degrees Celsius inside the building, while −5 degrees Celsius was recorded outside. Initially, the weather was favorable for the measurement: it was cloudy, and only around noon did the first rays of sunlight break through. The measurement of the interior started from the attic and then moved lower and lower, through the choir, to the part intended for the faithful and the church service: the main aisle and the sacristy. In the attic, the temperature of the walls and ceiling varied between 2 and 3 degrees Celsius, so it was initially concluded that the attic was relatively well insulated. Initial analysis of the photos showed that the greatest heat escape was taking place through the doors and windows. Then, photogrammetric flights were carried out with two DJI drones: the Matrice 300 RTK with the Zenmuse H20T camera and the Phantom 4 PRO. The unmanned aerial vehicles took pictures from the lowest story up to the tower itself, moving around the church and climbing a certain distance after each “round” (Figure 5). At each position, the Matrice took three photos, one thermal and two RGB (“wide” and “zoom”), and saved them to three separate files.

2.4. RGB Image Analysis and Preparation

The photos taken during the measurement with the unmanned aerial vehicles contain information on all camera parameters at the time of exposure: aperture, exposure time, focal length, and flash mode. In addition, data such as the file size, GPS coordinates, and altitude were also recorded. Before starting the development of the 3D model, it was necessary to review the photos in terms of sharpness and exposure. Overexposed photos and photos with flare (halation) were removed. The images obtained from the cameras were neither cropped nor geometrically transformed.
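This screening can be partly automated. The sketch below, assuming OpenCV, flags photos that are likely blurred (low variance of the Laplacian) or overexposed (a large share of saturated pixels) for manual review; the folder name and thresholds are hypothetical.

```python
import cv2
import numpy as np
from pathlib import Path

BLUR_THRESHOLD = 100.0     # variance of Laplacian below this => likely soft
SATURATED_FRACTION = 0.05  # >5% near-white pixels => likely overexposed/flare

for path in sorted(Path("rgb_photos").glob("*.JPG")):
    gray = cv2.imread(str(path), cv2.IMREAD_GRAYSCALE)
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    saturated = float(np.mean(gray >= 250))
    if sharpness < BLUR_THRESHOLD or saturated > SATURATED_FRACTION:
        print(f"review manually: {path.name} "
              f"(sharpness={sharpness:.1f}, saturated={saturated:.1%})")
```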

2.5. Analysis of Images from the KT-165 Camera

The analysis of the photos showed that the average temperature of the walls and ceiling in the church attic was around 3 degrees Celsius. Inside the church, specifically in the nave, the temperature of the walls was close to 18 degrees Celsius, which was probably largely related to the heaters under the side pews (with a temperature of around 50 degrees Celsius) (Figure 6). The church windows facing the sun had a temperature of about 16 degrees Celsius, and those facing north were 2 degrees Celsius cooler. The temperature of the entrance door was about 11 degrees Celsius (Figure 7), and that of the sacristy door as much as 18 degrees Celsius (due to sunlight from the outside).

2.6. Model Generation

During the measurement work, four independent datasets were obtained: RGB photos (“wide” and “zoom”) and thermal photos gathered during the Matrice 300 RTK flights, photos from the Phantom 4 PRO, and images from the KT-165 hand-held thermal imaging camera intended for monitoring thermal changes in the building. In order to generate 3D models from each set of images, dense matching software was required; Agisoft Metashape was chosen. The point cloud calculation proceeded in several stages: thermal image conversion, image import, photo alignment, and determination of the interior and exterior orientation parameters. In addition, the control points were measured on the photos. There are five accuracy levels to choose from: highest, high, medium, low, and lowest. The size of the search area and the size of the resulting image depend on the chosen accuracy parameter. Another important parameter is the reference preselection method: source, estimated, or sequential. A scripted version of this pipeline is sketched below.
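The following minimal sketch shows these stages, assuming the Agisoft Metashape 1.x Python API (a licensed module shipped with the application); the folder and project names are hypothetical. In this API, the five accuracy levels correspond to the downscale factor of matchPhotos (highest = 0, high = 1, ..., lowest = 8).

```python
import Metashape
from pathlib import Path

doc = Metashape.Document()
chunk = doc.addChunk()
chunk.addPhotos([str(p) for p in Path("thermal_images").glob("*.jpg")])

# Alignment at "high" accuracy with reference preselection set to "source"
chunk.matchPhotos(downscale=1,
                  generic_preselection=True,
                  reference_preselection=True,
                  reference_preselection_mode=Metashape.ReferencePreselectionSource)
chunk.alignCameras()

# Dense point cloud from depth maps, using the estimated orientations
chunk.buildDepthMaps(downscale=2, filter_mode=Metashape.MildFiltering)
chunk.buildDenseCloud()
doc.save("church_thermal.psx")
```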
The data were divided into four separate projects for model generation. All processes were repeated with all combinations of the accuracy and reference preselection parameters. The photos obtained from the DJI Phantom 4 PRO unmanned aerial vehicle (678 photos) were aligned, and the alignment results (number of tie points and reprojection errors) are given in Table 1. In the other three projects, materials from the DJI Matrice 300 RTK drone were processed: in the first, thermal images; in the second, wide camera photos (1819 photos); and in the third, wide camera photos densified with zoom photos (3642 photos). The alignment variants for the wide camera are shown in Table 2 and for wide + zoom in Table 3, analogously to Table 1. The variants chosen for dense matching are marked in the tables.
The most difficult task turned out to be the alignment of the thermal images (1670 photos). Trying to fit all the photos at once did not bring the expected results, and marking the points manually did not help either. In this situation, it was decided to divide the photos into parts according to the height of the flight: the lowest story near the ground, the height of the windows, the height of the cornices, the roof, and the church towers. Table 4 shows the alignment results for all five parts, analogously to Table 1, Table 2 and Table 3. Each group was fitted individually, and the missed photos were adjusted manually. In the end, all the parts were combined into one coherent whole using the control points. The cloud was then cleared of unnecessary points, and on the data prepared in this way, the construction of a dense point cloud based on the interior and exterior orientation elements was started.
The entire process of creating a dense point cloud is based on depth maps. A well-prepared point cloud is the basis for generating a mesh model; the model resolution corresponds to the ground resolution of the developed object models. Special attention should be paid to ensuring that the cloud is properly cleaned, prepared, and oriented before the model generation starts. Without this step, the model will turn out to be useless after generation, as all imperfections of the cloud will be transferred to it. With a well-prepared point cloud, generating the model comes down to simply running the algorithm. Several refinements can improve the result. One of them is “hole filling”, which is used to ensure the continuity of the model despite the lack of data for specific fragments. In this project, it was important mainly for the thermal imaging model, which had many imperfections after generation, and also for the small church tower in RGB. Gaps and discontinuities in the model structure are removed by interpolating the color and surface of the model from the surrounding area. Holes in the model may also appear during its cleaning, because the standard algorithm tries to keep the created product continuous, so each removal of model fragments results in a hole. Agisoft Metashape lacks a tool to select an area of interest before photo alignment and cut out only one specific part without disturbing the structure of the others. Table 5 presents the parameters of the individual dense point clouds: the GSD, the RMSE on the control points, and the number of points.

3. Results and Discussion

3.1. Comparison of Point Clouds and Models

In order to compare the dense point clouds, they had to be exported in .obj format and loaded into CloudCompare. The two clouds that played the greatest role in the project were selected for analysis: the thermal cloud and the one obtained with the DJI Phantom 4 PRO drone, which was the most detailed and accurate. At first sight, it can be seen that the thermal imaging cloud is much sparser, because the resolution of the thermal camera sensor is lower, which naturally results in a reduced density and number of points. It is characterized by isolated discontinuities resulting from an insufficient number of matched points. A comparative analysis of both products, in terms of accuracy and precision, confirms the hypothesis that a highly accurate RGB cloud is needed. Despite the lack of precise positioning, the spatial orientation of the color point cloud is definitely better; to be precise, the distance between the clouds is within 10 cm (Figure 8). The analysis of the errors on the ground control points leads to the conclusion that this is not a uniform error of both clouds but mainly stems from the thermal imaging cloud. The much lower resolution caused difficulties in the photo-matching algorithm, which increased the mapping errors and, consequently, the errors of the entire study. Nevertheless, it was possible to create a coherent thermographic model of the church building in Porąbka, which was ultimately used only to color the model created from the RGB photos.
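The same cloud-to-cloud comparison can be scripted. A minimal sketch, assuming Open3D instead of the CloudCompare GUI used here, with hypothetical file names:

```python
import numpy as np
import open3d as o3d

thermal = o3d.io.read_point_cloud("thermal_dense.ply")
rgb = o3d.io.read_point_cloud("phantom_rgb_dense.ply")

# For every thermal point, the distance to its nearest neighbor in the RGB cloud
distances = np.asarray(thermal.compute_point_cloud_distance(rgb))
print(f"mean: {distances.mean():.3f} m, "
      f"95th percentile: {np.percentile(distances, 95):.3f} m")
```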
Taking advantage of the fact that the DJI Matrice 300 RTK drone is equipped with the H20T camera, each captured thermal frame has its color equivalent in two formats. One of them is a JPEG image from the fixed-focus color camera; the second was created by the third camera, which has a variable zoom. At the post-processing stage, models were created that used either only the wide-angle photos or the set of all photos (from both the “zoom” and “wide” cameras). The purpose of the second comparison was to check whether increasing the number of photos of the same object would cause a large change in the accuracy and precision of the developed clouds. The biggest difference is visible at first glance: the number of generated points and the final file size, which, for the point cloud built from the alignment of over 3500 high-resolution photos, exceeded 100 GB. This is a significant increase compared to the second product and is considered a disadvantage. The comparative analysis shows that there are practically no deviations in the horizontal position between the two products: on one side of the church the deviation is one centimeter towards the sparser cloud, and on the other, 1.5 cm in the opposite direction (Figure 9).
Each of the compared models was obtained from a different camera. They differed primarily in resolution, but their quality also depended on the prevailing weather conditions, the surface of the photographed object, the stabilization of the unmanned aerial vehicle, and the skills of its operator. Aligning the RGB photos (from both the Phantom 4 PRO and the Matrice 300 RTK) and generating a dense cloud and then a mesh on their basis was not problematic at all compared to the thermal imagery. The RGB mesh models presented the entire structure very well, practically fully reflecting what was built by human hands 115 years earlier. The models preserved the smallest details, such as the portal above the entrance door, the stained-glass windows, and the clock face on the large tower. All mesh models were generated with Agisoft Metashape. The model obtained from the photos taken with the Phantom 4 PRO drone presented the best quality (Figure 10). Despite the fact that it was built under the most difficult weather conditions (the sun broke through the clouds during the work), almost all elements of the building were preserved; a difference is only visible between the south and north sides of the church. The elements that still need improvement are the crosses on both towers. Due to the air turbulence between the towers, it was difficult to operate a light and small machine, and an attempt to photograph the crosses from a closer distance could have failed and risked damaging the object. In the case of the model obtained from the “wide” photos from the Matrice 300 RTK and the “wide” model densified with “zoom” photos, a much better representation of the towers and the crosses placed on them was observed; the stained-glass windows also looked very good. More photos certainly had a positive effect on refining details, but it was not the optimal solution. Compared to the RGB models, the church’s thermal imaging model fared much worse. Significant problems with matching the photos caused numerous shortcomings in the form of “holes” in the object, which had to be filled with an additional tool in the program. Nevertheless, the obtained model is sufficient for the thermal analysis of the entire object, although neither church tower was fully reconstructed in this way. An important advantage of the result is the properly preserved structure, colors, and texture coverage (Figure 11).

3.2. Data Integration

The initial scenario assumed the use of a model made from RGB photos as the structure, which would then be colored according to the information from the thermal imaging. This stage of the project began with selecting the best RGB dataset. During the post-processing of the data, a significant number of point clouds and mesh models in various formats were generated in order to perform the integration. The main problem, however, was defining the procedure that would combine the positive aspects of each of the datasets. Solutions were sought using Agisoft Metashape, CloudCompare, and MeshLab; ultimately, the task was accomplished with the last one. The first problem was that the objects loaded into the program were not in the same place, although both were supposed to have compatible coordinate systems. It turned out that the program cannot handle coordinates with 7 digits before the decimal point (Polish PL-2000 system, zone 6).
The processed data were exported (again in all formats and types) from Agisoft Metashape, taking into account the shift of the origin of the coordinate system to the point X: 6587700, Y: 5521100, and Z: 300. This operation reduced the coordinates of all points and the entire model to two- or three-digit values. As a result of this operation, the first element of integration was achieved in MeshLab. It was a set of two models of the same object with different contents, but compatible in terms of orientation and position.
Then the RGB model was stripped of its color information, and an attempt was made to migrate the textures from one object to the other. After many unsuccessful attempts (including creating a texture in Agisoft Metashape), it was decided to use the thermal imaging point cloud. After a few transformations and preparation steps, the point attributes were transferred from the cloud to the vertices of the RGB model. Interpolation was necessary: the RGB model was several dozen times more detailed, so a 1:1 projection was not possible, as there were no exactly corresponding points on the two objects. The operation was successful, and the vertices of the RGB model were colored according to the thermal imaging values (Figure 12); a sketch of this transfer is given below. In this way, the thermal information of the point cloud was integrated with the three-dimensional RGB model of the church building in Porąbka.
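A minimal sketch of this nearest-neighbor attribute transfer, assuming NumPy and SciPy; the array files are hypothetical, and the coordinates are assumed to have already been reduced by the common origin shift (X − 6587700, Y − 5521100, Z − 300) described above:

```python
import numpy as np
from scipy.spatial import cKDTree

thermal_xyz = np.load("thermal_xyz.npy")      # (N, 3) thermal cloud points
thermal_rgb = np.load("thermal_rgb.npy")      # (N, 3) their thermal colors
mesh_vertices = np.load("mesh_vertices.npy")  # (M, 3) RGB-model vertices, M >> N

# For each model vertex, find the closest thermal point and inherit its color;
# a 1:1 mapping is impossible because the model is far denser than the cloud.
tree = cKDTree(thermal_xyz)
_, nearest = tree.query(mesh_vertices, k=1)
vertex_colors = thermal_rgb[nearest]

np.save("vertex_colors.npy", vertex_colors)
```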

4. Conclusions

The integration of the RGB data with the thermal imaging allows much greater use of the potential of the model created from the thermograms. The thermal imaging camera, by design, has a relatively low resolution compared to the color imaging sensor, whose resolution is as much as 64 times higher. This means that the target point cloud (and model) is much more accurate, and the GSD (pixel size in the field) is about 10 times smaller. Therefore, the combination of both datasets gives very good results: it allows a more accurate and detailed thermal analysis of the facility and the determination of the heat escape routes from the building. Unfortunately, the solution is not cheap and requires a lot of work. First of all, such complex calculations require a very powerful computing unit: a 12-core (24-thread) AMD Ryzen 9 3900X processor clocked at 4.4 GHz was used for the data processing; in short, a device designed to perform a large number of heavy parallel operations. The computer was equipped with a graphics card based on the NVIDIA RTX 2080 Super chipset, with 12 GB of built-in memory and the GDDR6 transmission standard, as well as 32 GB of RAM (random-access memory). In addition, very expensive measuring equipment is required: both the DJI Matrice 300 RTK drone and the hand-held thermal imager are quite expensive. The Matrice 300 RTK unmanned aerial vehicle with a thermal imaging camera also requires exceptional flying skills from the operator, as it is a very large machine that cannot be easily operated in the airspace. Processing the data is not a simple task either: during post-processing, it was repeatedly analyzed whether it was possible to create a dense point cloud and a model from the acquired photos. Nonetheless, the original goal of the project was achieved. As a result of the conducted research, it has been shown that the integration of thermovision and photogrammetric data is not only possible but also very useful. The model made of thermal images alone leaves much to be desired; only thanks to integration can the data be put to full use. The advantages of such a procedure are very promising: a significant increase in the resolution, accuracy, and precision of the created model, while retaining the thermal information of the measured object, is the main benefit of using such a huge amount of data and implementing the integration.
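As a rough arithmetic check of these resolution figures (assuming the H20T thermal sensor delivers 640 × 512 px frames and the 20 Mpix RGB sensor 5472 × 3648 px; the exact frame sizes are our assumption, not stated above):

$$\frac{5472 \cdot 3648}{640 \cdot 512} \approx 61 \approx 64, \qquad \sqrt{61} \approx 7.8,$$

so the pixel-count ratio matches the quoted factor of 64, and the linear resolution ratio is about 8; the roughly 10 times smaller GSD additionally reflects the different focal lengths and fields of view of the two cameras.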
The novelty of the proposed approach is the integration of thermal and RGB data and the generation of a 3D thermal model of a very complex and tall object. Innovative active thermal targets for the photogrammetric control network were prepared and applied. In addition, acquiring data with a large drone, such as the Matrice 300 RTK, is very difficult for historic, valuable, and complicated objects. The solution is dedicated to the thermal monitoring of historic buildings. Additional hand-held thermal imaging was used to perform a thermal analysis of the object. The following conclusions could thus be drawn: no significant heat escape was observed in the attic or the church roof, which is probably related to the recent renovation of the roof covering and its additional insulation. Windows, especially such large ones, can never be guaranteed to be completely insulated, as glass is generally more susceptible to sunlight and transmits heat to the interior of the object.
The presented approach is a valuable source of information on the energy characteristics of buildings. In this paper, the method of thermal and visual data integration is presented on the example of a heritage object, yet the method is universal and can be applied to any kind of object in which areas of energy loss have to be determined. Each technology has its advantages and disadvantages, and the real art is the ability to use the positive aspects and combine them with each other to create even more advanced and complex imagery.

Author Contributions

Conceptualization, J.P. and A.R.; methodology, J.P. and A.R.; software, J.P.; validation, J.P. and A.R.; formal analysis, J.P.; investigation, J.P.; resources, J.P.; data curation, A.R.; writing—original draft preparation, J.P. and A.R.; writing—review and editing, J.P. and A.R.; visualization, J.P.; supervision, A.R.; project administration, J.P. and A.R.; funding acquisition, J.P. and A.R. All authors have read and agreed to the published version of the manuscript.

Funding

This study was carried out with the financial support of AGH University of Science and Technology grant No.: 16.16.150.545: “Spatial engineering, photogrammetry and remote sensing for the needs of science, economy and administration”.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

The authors are thankful to the NaviGate sp. z o.o. company and Andrzej Gabryś Mapping Services for their support in obtaining the data for this research.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Grinzato, E. IR Thermography Applied to the Cultural Heritage Conservation. In Proceedings of the 18th World Conference on Nondestructive Testing, Durban, South Africa, 16–20 April 2012; pp. 2–5. [Google Scholar]
  2. Akçay, Ö. Photogrammetric Analysis of Multispectral and Thermal Close-Range Images. Mersin Photogramm. J. 2021, 3, 29–36. [Google Scholar] [CrossRef]
  3. Borrmann, D.; Elseberg, J.; Nüchter, A. Thermal 3D Mapping of Building Façades. Adv. Intell. Syst. Comput. 2013, 193, 173–182. [Google Scholar] [CrossRef]
  4. Erenoglu, R.C.; Akcay, O.; Erenoglu, O. An UAS-Assisted Multi-Sensor Approach for 3D Modeling and Reconstruction of Cultural Heritage Site. J. Cult. Herit. 2017, 26, 79–90. [Google Scholar] [CrossRef]
  5. Zumr, D.; David, V.; Jeřábek, J.; Noreika, N.; Krása, J. Monitoring of the Soil Moisture Regime of an Earth-Filled Dam by Means of Electrical Resistance Tomography, Close Range Photogrammetry, and Thermal Imaging. Environ. Earth Sci. 2020, 79, 299. [Google Scholar] [CrossRef]
  6. Gulbe, L.; Caune, V.; Korats, G. Urban Area Thermal Monitoring: Liepaja Case Study Using Satellite and Aerial Thermal Data. Int. J. Appl. Earth Obs. Geoinf. 2017, 63, 45–54. [Google Scholar] [CrossRef]
  7. Rosina, E.; Spodek, J. Using Infrared Thermography to Detect Moisture in Historic Masonry: A Case Study in Indiana. APT Bull. 2003, 34, 11. [Google Scholar] [CrossRef]
  8. Poksinska, M.; Cupa, A.; Socha-Bystron, S. Thermography in the Investigation of Gilding on Historical Wall Paintings. In Proceedings of the 9th International Conference on Quantitative InfraRed Thermography, Krakow, Poland, 2–5 July 2008; pp. 2–7. [Google Scholar] [CrossRef]
  9. Sidiropoulou-Velidou, D.; Georgopoulos, A.; Lerma, J.L. Exploitation of Thermal Imagery for the Detection of Pathologies in Monuments; Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Berlin/Heidelberg, Germany, 2012; Volume 7616, pp. 97–108. [Google Scholar] [CrossRef]
  10. Mercuri, F.; Orazi, N.; Paoloni, S.; Cicero, C.; Zammit, U. Pulsed Thermography Applied to the Study of Cultural Heritage. Appl. Sci. 2017, 7, 1010. [Google Scholar] [CrossRef]
  11. Tang, Q.J.; Liu, J.Y.; Wang, Y. Theoretical and Experimental Study on Nondestructive Pulse Phase Infrared Thermography Testing Technology. Adv. Mater. Res. 2011, 314-316, 1483–1486. [Google Scholar] [CrossRef]
  12. Wu, D.; Busse, G. Lock-in Thermography for Nondestructive Evaluation of Materials. Rev. Gen. Therm. 1998, 37, 693–703. [Google Scholar] [CrossRef]
  13. Peřina, Z.; Fabian, R.; Wolfová, M.; Valíček, P.; Panovec, V. Non-Destructive Defectoscopy of Building Structures by Lock-In Thermography. Adv. Mater. Res. 2015, 1122, 173–176. [Google Scholar] [CrossRef]
  14. Candiago, S.; Remondino, F.; De Giglio, M.; Dubbini, M.; Gattelli, M. Evaluating Multispectral Images and Vegetation Indices for Precision Farming Applications from UAV Images. Remote Sens. 2015, 7, 4026–4047. [Google Scholar] [CrossRef] [Green Version]
  15. Xu, Z.; Shen, X.; Cao, L.; Coops, N.C.; Goodbody, T.R.H.; Zhong, T.; Zhao, W.; Sun, Q.; Ba, S.; Zhang, Z.; et al. Tree Species Classification Using UAS-Based Digital Aerial Photogrammetry Point Clouds and Multispectral Imageries in Subtropical Natural Forests. Int. J. Appl. Earth Obs. Geoinf. 2020, 92, 102173. [Google Scholar] [CrossRef]
  16. Edelman, G.J.; Aalders, M.C. Photogrammetry Using Visible, Infrared, Hyperspectral and Thermal Imaging of Crime Scenes. Forensic Sci. Int. 2018, 292, 181–189. [Google Scholar] [CrossRef] [PubMed]
  17. Balaras, C.A.; Argiriou, A.A. Infrared Thermography for Building Diagnostics. Energy Build. 2002, 34, 171–183. [Google Scholar] [CrossRef]
  18. Grinzato, E.; Vavilov, V.; Kauppinen, T. Quantitative Infrared Thermography in Buildings. Energy Build. 1998, 29, 1–9. [Google Scholar] [CrossRef]
  19. Lerma, J.L.; Cabrelles, M.; Portalés, C. Multitemporal Thermal Analysis to Detect Moisture on a Building Façade. Constr. Build. Mater. 2011, 25, 2190–2197. [Google Scholar] [CrossRef]
  20. De Smet, T.; Nikulin, A.; Frazer, W.; Baur, J.; Abramowitz, J.; Finan, D.; Denara, S.; Aglietti, N.; Campos, G. Drones and “Butterflies”: A Low-Cost UAV System for Rapid Detection and Identification of Unconventional Minefields. J. Conv. Weapons Destr. 2018, 22, 50–58. [Google Scholar]
  21. Nikulin, A.; de Smet, T.S.; Baur, J.; Frazer, W.D.; Abramowitz, J.C. Detection and Identification of Remnant PFM-1 “Butterfly Mines” with a UAV-Based Thermal-Imaging Protocol. Remote Sens. 2018, 10, 1672. [Google Scholar] [CrossRef] [Green Version]
  22. Jebens, B.M.; Sawada, H.; Shen, J.; Tollefsen, E. To What Extent Could the Development of an Airborne Thermal Imaging Detection System Contribute to Enhance Detection? J. Conv. Weapons Destr. 2020, 24, 63–67. [Google Scholar]
  23. Zefri, Y.; Elkettani, A.; Sebari, I.; Lamallam, S.A. Thermal Infrared and Visual Inspection of Photovoltaic Installations by Uav Photogrammetry—Application Case: Morocco. Drones 2018, 2, 41. [Google Scholar] [CrossRef] [Green Version]
  24. Borrmann, D.; Nüchter, A.; Dakulović, M.; Maurović, I.; Petrović, I.; Osmanković, D.; Velagić, J. A Mobile Robot Based System for Fully Automated Thermal 3D Mapping. Adv. Eng. Inform. 2014, 28, 425–440. [Google Scholar] [CrossRef]
  25. Brumana, R.; Oreni, D.; Van Hecke, L.; Barazzetti, L.; Previtali, M.; Roncoroni, F.; Valente, R. Combined Geometric and Thermal Analysis from UAV Platforms for Archaeological Heritage Documentation. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2013, 2, 55–60. [Google Scholar] [CrossRef] [Green Version]
  26. Christiansen, P.; Steen, K.A.; Jørgensen, R.N.; Karstoft, H. Automated Detection and Recognition of Wildlife Using Thermal Cameras. Sensors 2014, 14, 13778–13793. [Google Scholar] [CrossRef] [PubMed]
  27. Matese, A.; Di Gennaro, S.F. Practical Applications of a Multisensor UAV Platform Based on Multispectral, Thermal and RGB High Resolution Images in Precision Viticulture. Agriculture 2018, 8, 116. [Google Scholar] [CrossRef] [Green Version]
  28. Svanström, F.; Alonso-Fernandez, F.; Englund, C. A Dataset for Multi-Sensor Drone Detection. Data Brief 2021, 39, 107521. [Google Scholar] [CrossRef]
  29. Stobbelaar, P. Prediction of Leaf Area Index Using the Integration of the Thermal Infrared with Visible and Near-Infrared Data Acquired with an UAV for a Mixed Forest. Master’s Thesis, University of Twente, Enschede, The Netherlands, 2021. [Google Scholar]
  30. Ermenyi, T.; Enegbuma, W.I.; Isaacs, N.; Potangaroa, R. Unmanned Aerial Vehicle Sustainability Assessment of Heritage Buildings. In Proceedings of the 54th International Conference of the Architectural Science Association (ANZAScA), Auckland, New Zealand, 26–27 November 2020; pp. 1323–1330. [Google Scholar]
  31. Krawczyk, J.; Mazur, A.; Sasin, T.; Stokłosa, A. Infrared Building Inspection with Unmanned Aerial Vehicles. Trans. Inst. Aviat. 2015, 240, 32–48. [Google Scholar] [CrossRef]
  32. Lagüela, S.; Díaz-Vilarino, L.; Roca, D.; Lorenzo, H. Aerial Thermography from Low-Cost UAV for the Generation of Thermographic Digital Terrain Models. Opto-Electron. Rev. 2015, 23, 76–82. [Google Scholar] [CrossRef] [Green Version]
  33. Patrucco, G.; Cortese, G.; Giulio Tonolo, F.; Spanò, A. Thermal and Optical Data Fusion Supporting Built Heritage Analyses. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci.—ISPRS Arch. 2020, 43, 619–626. [Google Scholar] [CrossRef]
  34. Rakha, T.; Gorodetsky, A. Review of Unmanned Aerial System (UAS) Applications in the Built Environment: Towards Automated Building Inspection Procedures Using Drones. Autom. Constr. 2018, 93, 252–264. [Google Scholar] [CrossRef]
  35. Van der Sluijs, J.; Kokelj, S.V.; Fraser, R.H.; Tunnicliffe, J.; Lacelle, D. Permafrost Terrain Dynamics and Infrastructure Impacts Revealed by UAV Photogrammetry and Thermal Imaging. Remote Sens. 2018, 10, 1734. [Google Scholar] [CrossRef] [Green Version]
  36. Wakeford, Z.E.; Chmielewska, M.; Hole, M.J.; Howell, J.A.; Jerram, D.A. Combining Thermal Imaging with Photogrammetry of an Active Volcano Using UAV: An Example from Stromboli, Italy. Photogramm. Rec. 2019, 34, 445–466. [Google Scholar] [CrossRef]
  37. Raeva, P.L.; Šedina, J.; Dlesk, A. Monitoring of Crop Fields Using Multispectral and Thermal Imagery from UAV. Eur. J. Remote Sens. 2019, 52, 192–201. [Google Scholar] [CrossRef] [Green Version]
  38. Mader, D.; Blaskow, R.; Westfeld, P.; Weller, C. Potential of UAV-Based Laser Scanner and Multispectral Camera Data in Building Inspection. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci.—ISPRS Arch. 2016, 41, 1135–1142. [Google Scholar] [CrossRef] [Green Version]
  39. Khelifi, A.; Ciccone, G.; Altaweel, M.; Basmaji, T. Autonomous Service Drones for Multimodal Detection and Monitoring of Archaeological Sites. Appl. Sci. 2021, 11, 10424. [Google Scholar] [CrossRef]
  40. Thi, T.; Takahashi, H.; Toriu, T.; Ham, H. Fusion of Infrared and Visible Images for Robust Person Detection. In Image Fusion; Ukimura, O., Ed.; InTech: Rijeka, Croatia, 2011; pp. 239–264. [Google Scholar]
  41. González-Aguilera, D.; Rodriguez-Gonzalvez, P.; Armesto, J.; Lagüela, S. Novel Approach to 3D Thermography and Energy Efficiency Evaluation. Energy Build. 2012, 54, 436–443. [Google Scholar] [CrossRef]
  42. Hoegner, L.; Abmayr, T.; Tosic, D.; Turzer, S.; Stilla, U. Fusion of 3D Point Clouds with TIR Images for Indoor Scene Reconstruction. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci.—ISPRS Arch. 2018, 42, 189–194. [Google Scholar] [CrossRef] [Green Version]
  43. Wang, C.; Cho, Y.K.; Gai, M. As-Is 3D Thermal Modeling for Existing Building Envelopes Using a Hybrid LIDAR System. J. Comput. Civ. Eng. 2013, 27, 645–656. [Google Scholar] [CrossRef]
  44. Narváez, F.J.Y.; del Pedregal, J.S.; Prieto, P.A.; Torres-Torriti, M.; Cheein, F.A.A. LiDAR and Thermal Images Fusion for Ground-Based 3D Characterisation of Fruit Trees. Biosyst. Eng. 2016, 151, 479–494. [Google Scholar] [CrossRef]
  45. Zhu, J.; Xu, Y.; Ye, Z.; Hoegner, L.; Stilla, U. Fusion of Urban 3D Point Clouds with Thermal Attributes Using MLS Data and TIR Image Sequences. Infrared Phys. Technol. 2021, 113, 103622. [Google Scholar] [CrossRef]
  46. Lagüela, S.; Díaz-Vilariño, L.; Martínez, J.; Armesto, J. Automatic Thermographic and RGB Texture of As-Built BIM for Energy Rehabilitation Purposes. Autom. Constr. 2013, 31, 230–240. [Google Scholar] [CrossRef]
  47. Lin, D.; Jarzabek-Rychard, M.; Tong, X.; Maas, H.G. Fusion of Thermal Imagery with Point Clouds for Building Façade Thermal Attribute Mapping. ISPRS J. Photogramm. Remote Sens. 2019, 151, 162–175. [Google Scholar] [CrossRef]
  48. Dino, I.G.; Sari, A.E.; Iseri, O.K.; Akin, S.; Kalfaoglu, E.; Erdogan, B.; Kalkan, S.; Alatan, A.A. Image-Based Construction of Building Energy Models Using Computer Vision. Autom. Constr. 2020, 116, 103231. [Google Scholar] [CrossRef]
  49. Lagüela, S.; Armesto, J.; Arias, P.; Herráez, J. Automation of Thermographic 3D Modelling through Image Fusion and Image Matching Techniques. Autom. Constr. 2012, 27, 24–31. [Google Scholar] [CrossRef]
  50. Ribarić, S.; Marčetić, D.; Vedrina, D.S. A Knowledge-Based System for the Non-Destructive Diagnostics of Façade Isolation Using the Information Fusion of Visual and IR Images. Expert Syst. Appl. 2009, 36, 3812–3823. [Google Scholar] [CrossRef]
  51. Wang, X.; Witz, J.F.; El Bartali, A.; Jiang, C. Infrared Thermography Coupled with Digital Image Correlation in Studying Plastic Deformation on the Mesoscale Level. Opt. Lasers Eng. 2016, 86, 264–274. [Google Scholar] [CrossRef]
  52. Biass, S.; Orr, T.R.; Houghton, B.F.; Patrick, M.R.; James, M.R.; Turner, N. Insights Into Pāhoehoe Lava Emplacement Using Visible and Thermal Structure-From-Motion Photogrammetry. J. Geophys. Res. Solid Earth 2019, 124, 5678–5695. [Google Scholar] [CrossRef] [Green Version]
  53. Yang, M.D.; Su, T.C.; Lin, H.Y. Fusion of Infrared Thermal Image and Visible Image for 3D Thermal Model Reconstruction Using Smartphone Sensors. Sensors 2018, 18, 2003. [Google Scholar] [CrossRef] [Green Version]
  54. Schramm, S.; Rangel, J.; Kroll, A. Data Fusion for 3D Thermal Imaging Using Depth and Stereo Camera for Robust Self-Localization. In Proceedings of the 2018 IEEE Sensors Applications Symposium (SAS), Seoul, Korea, 12–14 March 2018; pp. 1–6. [Google Scholar] [CrossRef]
  55. Vidas, S.; Moghadam, P.; Bosse, M. 3D Thermal Mapping of Building Interiors Using an RGB-D and Thermal Camera. In Proceedings of the IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; pp. 2311–2318. [Google Scholar] [CrossRef] [Green Version]
  56. Yamaguchi, M.; Truong, T.P.; Mori, S.; Nozick, V.; Saito, H.; Yachida, S.; Sato, H. Superimposing Thermal-Infrared Data on 3D Structure Reconstructed by RGB Visual Odometry. IEICE Trans. Inf. Syst. 2018, 101, 1296–1307. [Google Scholar] [CrossRef] [Green Version]
  57. Alba, M.I.; Barazzetti, L.; Scaioni, M.; Rosina, E.; Previtali, M. Mapping Infrared Data on Terrestrial Laser Scanning 3D Models of Buildings. Remote Sens. 2011, 3, 1847–1870. [Google Scholar] [CrossRef] [Green Version]
  58. Adan, A.; Prado, T.; Prieto, S.A.; Quintana, B. Fusion of Thermal Imagery and LiDAR Data for Generating TBIM Models. In Proceedings of the IEEE Sensors, Glasgow, UK, 29 October–1 November 2017; pp. 1–3. [Google Scholar] [CrossRef]
  59. Cabrelles, M.; Galcerá, S.; Navarro, S.; Lerma, J.L.; Akasheh, T.; Haddad, N. Integration of 3D Laser Scanning, Photogrammetry and Thermography to Record Architectural Monuments. In Proceedings of the 22nd International CIPA Symposium, Kyoto, Japan, 11–15 October 2009; Volume 9, pp. 3–8. [Google Scholar]
  60. Macher, H.; Boudhaim, M.; Grussenmeyer, P.; Siroux, M.; Landes, T. Combination of thermal and geometric information for bim enrichment. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci.—ISPRS Arch. 2019, 42, 719–725. [Google Scholar] [CrossRef] [Green Version]
  61. Merchán, P.; Adán, A.; Salamanca, S.; Domínguez, V.; Chacón, R. Geometric and Colour Data Fusion for Outdoor 3D Models. Sensors 2012, 12, 6893–6919. [Google Scholar] [CrossRef] [PubMed]
  62. Merchán, P.; Merchán, M.J.; Salamanca, S.; Adán, A. Application of Multisensory Technology for Resolution of Problems in the Field of Research and Preservation of Cultural Heritage; Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Cham, Switzerland, 2018; Volume 10754, pp. 32–47. [Google Scholar] [CrossRef]
  63. Rea, P.; Ottaviano, E. Design and Development of an Inspection Robotic System for Indoor Applications. Robot. Comput.-Integr. Manuf. 2018, 49, 143–151. [Google Scholar] [CrossRef]
  64. Dlesk, A.; Vach, K.; Holubec, P. Usage of Photogrammetric Processing of Thermal Images for Civil Engineers. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci.—ISPRS Arch. 2018, 42, 99–103. [Google Scholar] [CrossRef] [Green Version]
  65. Scaioni, M.; Rosina, E.; L’erario, A.; Díaz-Vilariño, L. Integration of Infrared Thermography & Photogrammetric Surveying of Built Landscape. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci.—ISPRS Arch. 2017, 42, 153–160. [Google Scholar] [CrossRef] [Green Version]
  66. Ham, Y.; Golparvar-Fard, M. An Automated Vision-Based Method for Rapid 3D Energy Performance Modeling of Existing Buildings Using Thermal and Digital Imagery. Adv. Eng. Inform. 2013, 27, 395–409. [Google Scholar] [CrossRef]
  67. Truong, T.P.; Yamaguchi, M.; Mori, S.; Nozick, V.; Saito, H. Registration of RGB and Thermal Point Clouds Generated by Structure from Motion. In Proceedings of the 2017 IEEE International Conference on Computer Vision Workshops, ICCVW, Venice, Italy, 22–29 October 2017; pp. 419–427. [Google Scholar] [CrossRef] [Green Version]
  68. Hoegner, L.; Stilla, U. Building Facade Object Detection from Terrestrial Thermal Infrared Image Sequences Combining Different Views. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 2, 55–62. [Google Scholar] [CrossRef] [Green Version]
  69. Weinmann, M.; Leitloff, J.; Hoegner, L.; Jutzi, B.; Stilla, U.; Hinz, S. Thermal 3D Mapping for Object Detection in Dynamic Scenes. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, 2, 53–60. [Google Scholar] [CrossRef] [Green Version]
  70. Garrido, I.; Lagüela, S.; Arias, P.; Balado, J. Thermal-Based Analysis for the Automatic Detection and Characterization of Thermal Bridges in Buildings. Energy Build. 2018, 158, 1358–1367. [Google Scholar] [CrossRef]
  71. González-Aguilera, D.; Lagüela, S.; Rodríguez-Gonzálvez, P.; Hernández-López, D. Image-Based Thermographic Modeling for Assessing Energy Efficiency of Buildings Façades. Energy Build. 2013, 65, 29–36. [Google Scholar] [CrossRef]
  72. Kristoffersen, M.S.; Dueholm, J.V.; Gade, R.; Moeslund, T.B. Pedestrian Counting with Occlusion Handling Using Stereo Thermal Cameras. Sensors 2016, 16, 62. [Google Scholar] [CrossRef] [Green Version]
  73. Sentenac, T.; Bugarin, F.; Ducarouge, B.; Devy, M. Automated Thermal 3D Reconstruction Based on a Robot Equipped with Uncalibrated Infrared Stereovision Cameras. Adv. Eng. Inform. 2018, 38, 203–215. [Google Scholar] [CrossRef]
  74. Adamopoulos, E.; Volinia, M.; Girotto, M.; Rinaudo, F. Three-Dimensional Thermal Mapping from IRT Images for Rapid Architectural Heritage NDT. Buildings 2020, 10, 187. [Google Scholar] [CrossRef]
  75. Son, H.; Kim, C.; Choi, H. High-Quality as-Is 3D Thermal Modeling in MEP Systems Using a Deep Convolutional Network. Adv. Eng. Inform. 2019, 42, 100999. [Google Scholar] [CrossRef]
  76. Iwaszczuk, D.; Stilla, U. Camera Pose Refinement by Matching Uncertain 3D Building Models with Thermal Infrared Image Sequences for High Quality Texture Extraction. ISPRS J. Photogramm. Remote Sens. 2017, 132, 33–47. [Google Scholar] [CrossRef]
  77. Conte, P.; Girelli, V.A.; Mandanici, E. Structure from Motion for Aerial Thermal Imagery at City Scale: Pre-Processing, Camera Calibration, Accuracy Assessment. ISPRS J. Photogramm. Remote Sens. 2018, 146, 320–333. [Google Scholar] [CrossRef]
  78. Hoegner, L.; Stilla, U. Mobile Thermal Mapping for Matching of Infrared Images with 3D Building Models and 3D Point Clouds. Quant. InfraRed Thermogr. J. 2018, 15, 252–270. [Google Scholar] [CrossRef]
  79. Hou, Y.; Chen, M.; Volk, R.; Soibelman, L. Investigation on Performance of RGB Point Cloud and Thermal Information Data Fusion for 3D Building Thermal Map Modeling Using Aerial Images under Different Experimental Conditions. J. Build. Eng. 2022, 45, 103380. [Google Scholar] [CrossRef]
  80. Jeong, J.H.; Jae, J.Y.; Wang, L.J.; Hong, O.J. Dense Thermal 3d Point Cloud Generation of Building Envelope by Drone-Based Photogrammetry. J. Korean Soc. Surv. Geod. Photogramm. Cartogr. 2021, 39, 73–79. [Google Scholar] [CrossRef]
  81. Hu, Y.; Liang, Y.; Tao, T.; Feng, S.; Zuo, C.; Zhang, Y.; Chen, Q. Dynamic 3D Measurement of Thermal Deformation Based on Geometric-Constrained Stereo-Matching with a Stereo Microscopic System. Meas. Sci. Technol. 2019, 30, 125007. [Google Scholar] [CrossRef]
  82. DJI. Matrice 300 RTK User Manual; DJI: Shenzhen, China, 2020. [Google Scholar]
  83. Sonel, S.A. User Manual. Thermal Imaging Cameras KT-165, KT-250 and KT-320; Sonel SA: Swidnica, Poland, 2018. [Google Scholar]
  84. DJI. Phantom 4 PRO/PRO+; DJI: Shenzhen, China, 2016. [Google Scholar]
Figure 1. (a) Matrice 300 RTK, (b) Phantom 4 PRO, (c) KT-165.
Figure 2. Preparation of the thermovision control network targets.
Figure 3. Fixing the control network points to the church buttresses.
Figure 4. Distribution plan of the points of the tacheometric and photogrammetric control networks, which also serve as the thermovision control network.
Figure 5. Diagram of the drone flight.
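Although the flight-planning formulas are not restated here, the waypoint spacing in a diagram such as Figure 5 conventionally follows the standard photogrammetric relations between flying height, sensor geometry, and image overlap; a sketch of those relations (standard formulas, not taken from this study):

```latex
% Standard photogrammetric flight-planning relations (illustrative only):
% ground footprint D of one image, forward base B, and strip spacing A,
% for flying height H, focal length f, sensor dimension s,
% forward overlap p and side overlap q (both as fractions of D).
\[
  D = \frac{s \cdot H}{f}, \qquad
  B = D\,(1 - p), \qquad
  A = D\,(1 - q)
\]
```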
Figure 6. (a) Thermal image from the KT-165 camera showing the side door of the church; (b) RGB image of the same side door.
Figure 7. Heaters under the side benches, captured with the KT-165 camera: (a) thermal image; (b) RGB image.
Figure 8. Distance between the thermal point cloud and the point cloud obtained from the Phantom 4 PRO, shown for the clock tower.
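The software used to compute the inter-cloud distances of Figures 8 and 9 is not specified; a minimal sketch of an equivalent nearest-neighbour cloud-to-cloud comparison, assuming the Open3D library and hypothetical file names:

```python
# Minimal nearest-neighbour cloud-to-cloud comparison (cf. Figure 8).
# Assumes Open3D; the file names are hypothetical placeholders for
# georeferenced point clouds in the same coordinate system.
import numpy as np
import open3d as o3d

thermal = o3d.io.read_point_cloud("thermal_cloud.ply")        # cloud from thermal images
reference = o3d.io.read_point_cloud("phantom_rgb_cloud.ply")  # cloud from Phantom 4 PRO

# For every thermal point, the distance to its nearest neighbour in the reference cloud.
distances = np.asarray(thermal.compute_point_cloud_distance(reference))
print(f"mean: {distances.mean():.3f} m, "
      f"95th percentile: {np.percentile(distances, 95):.3f} m")
```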
Figure 9. Distance between the “wide” point cloud and the “wide + zoom” point cloud, shown for a buttress at the entrance to the church.
Figure 10. RGB model of the church.
Figure 11. Thermal imaging model.
Figure 12. Integration of the RGB model from the Phantom 4 PRO with the thermal point cloud from the Matrice 300 RTK.
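The integration shown in Figure 12 was performed in photogrammetric software; the underlying idea, transferring the colour of the nearest thermal point onto each model vertex, can be sketched as follows, assuming NumPy/SciPy and hypothetical array names:

```python
# Sketch of colouring model vertices with the nearest thermal point (cf. Figure 12).
# Assumes NumPy/SciPy; `model_xyz`, `thermal_xyz`, `thermal_rgb` are hypothetical
# (n, 3) arrays in the same georeferenced coordinate system.
import numpy as np
from scipy.spatial import cKDTree

def colour_model_with_thermal(model_xyz, thermal_xyz, thermal_rgb, max_dist=0.05):
    """Assign each model vertex the colour of its nearest thermal point.

    Vertices farther than `max_dist` (metres) from any thermal point keep a
    neutral grey, so gaps in the thermal cloud are not falsely coloured.
    """
    tree = cKDTree(thermal_xyz)
    dist, idx = tree.query(model_xyz, k=1)
    colours = thermal_rgb[idx].astype(float)
    colours[dist > max_dist] = 128.0  # neutral grey for uncovered vertices
    return colours
```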
Table 1. DJI Phantom 4 PRO photo alignment results.

| Accuracy | Reference Preselection | No. of Tie Points | Reprojection Error [pix] |
|---|---|---|---|
| Lowest | Estimated | 30,195 | 4.50 |
| Lowest | Sequential | 4106 | 3.66 |
| Lowest | Source | 22,010 | 4.16 |
| Low | Estimated | 300,904 | 2.09 |
| Low | Sequential | 74,742 | 2.31 |
| Low | Source | 260,431 | 1.96 |
| Medium | Estimated | 612,148 | 0.93 |
| Medium | Sequential | 171,235 | 0.80 |
| Medium | Source | 591,379 | 0.90 |
| High | Estimated | 636,723 | 0.61 * |
| High | Sequential | 180,557 | 0.46 |
| High | Source | 662,534 | 0.59 |
| Highest | Estimated | 660,190 | 0.40 |
| Highest | Sequential | 165,045 | 0.35 |
| Highest | Source | 667,900 | 0.39 |

* The variant chosen for dense matching.
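For readers unfamiliar with the metric, the reprojection error reported in Tables 1–4 is conventionally the root-mean-square image-space residual over the tie-point observations (a standard definition, not restated by the authors):

```latex
% Conventional RMS reprojection error over N tie-point observations:
% x_i is the measured image position of a tie point; \hat{x}_i is the
% reprojection of its triangulated 3D point through the estimated camera model.
\[
  e_{\mathrm{RMS}} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}
    \bigl\lVert \mathbf{x}_i - \hat{\mathbf{x}}_i \bigr\rVert^{2}}
\]
```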
Table 2. DJI Matrice 300 RTK “wide” photo alignment results.

| Accuracy | Reference Preselection | No. of Tie Points | Reprojection Error [pix] |
|---|---|---|---|
| Lowest | Estimated | 3930 | 6.49 |
| Lowest | Sequential | 9752 | 7.47 |
| Lowest | Source | 12,840 | 7.01 |
| Low | Estimated | 373,994 | big error |
| Low | Sequential | 151,782 | 2.73 |
| Low | Source | 385,430 | big error |
| Medium | Estimated | 669,967 | big error |
| Medium | Sequential | 309,531 | 1.4 |
| Medium | Source | 754,741 | 33.9 |
| High | Estimated | 546,063 | 1.11 * |
| High | Sequential | 282,059 | 36.32 |
| High | Source | 741,908 | big error |

* The variant chosen for dense matching.
Table 3. DJI Matrice 300 RTK “wide + zoom” photo alignment results.

| Accuracy | Reference Preselection | No. of Tie Points | Reprojection Error [pix] |
|---|---|---|---|
| Lowest | Estimated | 10,702 | 3.07 |
| Lowest | Sequential | 25,480 | big error |
| Lowest | Source | 71,725 | 4.28 |
| Low | Estimated | 508,348 | 2.48 |
| Low | Sequential | 625,091 | big error |
| Low | Source | 1,151,574 | big error |
| Medium | Estimated | no result | no result |
| Medium | Sequential | 619,052 | 1.72 |
| Medium | Source | 897,047 | 1.76 |
| High | Estimated | no result | no result |
| High | Sequential | 989,969 | 1.52 * |
| High | Source | 618,178 | big error |

* The variant chosen for dense matching.
Table 4. DJI Matrice 300 RTK thermal photo alignment results.

| Part | Accuracy | Reference Preselection | No. of Tie Points | Reprojection Error [pix] |
|---|---|---|---|---|
| 1 | High | Estimated | 7459 | 0.56 |
| | High | Sequential | 8069 | 0.58 |
| | High | Source | 8725 | 0.61 |
| | Highest | Estimated | 16,384 | 0.47 * |
| | Highest | Sequential | 20,137 | 0.51 |
| | Highest | Source | 8414 | 0.50 |
| 2 | High | Estimated | 41,380 | 0.63 |
| | High | Sequential | 32,882 | 0.59 |
| | High | Source | 27,107 | 0.55 |
| | Highest | Estimated | 110,462 | 0.50 * |
| | Highest | Sequential | 101,471 | 0.49 |
| | Highest | Source | 24,027 | 0.40 |
| 3 | High | Estimated | 10,304 | 0.49 |
| | High | Sequential | 24,031 | 0.48 |
| | High | Source | 21,868 | 0.50 |
| | Highest | Estimated | 39,684 | 0.45 * |
| | Highest | Sequential | 56,546 | 0.39 |
| | Highest | Source | 28,875 | 0.39 |
| 4 | High | Estimated | 52,008 | 0.58 |
| | High | Sequential | 56,125 | 0.60 |
| | High | Source | 42,149 | 0.71 |
| | Highest | Estimated | 78,286 | 0.47 |
| | Highest | Sequential | 45,541 | 1.84 * |
| | Highest | Source | 31,384 | 0.48 |
| 5 | High | Estimated | 821 | 1.18 |
| | High | Sequential | 736 | 1.34 |
| | High | Source | 814 | 1.30 |
| | Highest | Estimated | 1352 | 0.78 |
| | Highest | Sequential | 1933 | 0.86 * |
| | Highest | Source | 4263 | 0.46 |

* The variants chosen for dense matching and merging.
Table 5. Dense cloud parameters.

| Model | GSD [mm/pix] | RMSE X [m] | RMSE Y [m] | RMSE Z [m] | No. of Points |
|---|---|---|---|---|---|
| Phantom 4 PRO | 3.12 | 0.003 | 0.004 | 0.006 | 487,900 |
| “Wide” photos | 3.91 | 0.003 | 0.004 | 0.005 | 546,063 |
| “Zoom + wide” photos | 3.67 | 0.005 | 0.005 | 0.006 | 712,692 |
| Thermovision part 1 | – | 0.007 | 0.011 | 0.013 | 16,384 |
| Thermovision part 2 | – | 0.044 | 0.041 | 0.041 | 101,471 |
| Thermovision part 3 | 9.64 | 0.006 | 0.007 | 0.013 | 56,546 |
| Thermovision part 4 | – | 0.010 | 0.012 | 0.014 | 78,286 |
| Thermovision part 5 | – | 0.028 | 0.099 | 0.182 | 40,071 |
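For completeness, the per-axis RMSE values in Table 5 follow the usual check-point definition, and the GSD column follows the standard sensor relation (both conventional formulas, not restated in the paper):

```latex
% Conventional check-point RMSE per axis (shown for X; Y and Z are analogous)
% over n check points, and the standard ground sampling distance for
% pixel pitch p, flying height H, and focal length f.
\[
  \mathrm{RMSE}_X = \sqrt{\frac{1}{n}\sum_{j=1}^{n}
    \bigl(X_j^{\mathrm{model}} - X_j^{\mathrm{ref}}\bigr)^{2}},
  \qquad
  \mathrm{GSD} = \frac{p \cdot H}{f}
\]
```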
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
