Article

Respiration Detection of Ground Injured Human Target Using UWB Radar Mounted on a Hovering UAV

1 Department of Military Biomedical Engineering, Air Force Medical University, Xi’an 710032, China
2 School of Basic Medicine, Air Force Medical University, Xi’an 710032, China
3 Drug and Instrument Supervisory & Test Station of PLA Xining Joint Logistics Support Centre, Lanzhou 730050, China
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Submission received: 3 August 2022 / Revised: 31 August 2022 / Accepted: 1 September 2022 / Published: 3 September 2022
(This article belongs to the Special Issue Conceptual Design, Modeling, and Control Strategies of Drones-II)

Abstract

Unmanned aerial vehicles (UAVs) are an important and basic platform for remote life sensing, but their own motion may hide the vital signals of an injured human. In this work, a novel method to remove the platform motion and accurately extract human respiration is proposed. We used a hovering UAV as the platform for an ultra-wideband (UWB) radar to capture human respiration. To remove interference from the moving UAV platform, the delay calculated from the correlation between frames of UWB radar data was used to compensate for range migration. Then, the echo signals from the human target were extracted as the observed signals of multiple range channels. Because these signals satisfy the assumptions of independent component analysis (ICA), ICA was adopted to estimate the respiratory signal. The results of respiration detection experiments conducted in two different outdoor scenarios show that the proposed method can accurately separate the respiration of a ground human target without any additional sensor or prior knowledge; this physiological information is essential for search and rescue (SAR) missions.

1. Introduction

In search and rescue (SAR) missions for post-disaster survivors and injured people outdoors, the survival rate drops rapidly as search time increases [1,2]. Hence, quick and efficient life detection technology has long been a hot research topic in emergency rescue medicine. In extreme search environments, potential risks may prevent rescuers from staying at the disaster site for long, or seriously damaged traffic routes may prevent them from arriving at the scene immediately [3,4], which delays the golden window for SAR.
Air-to-ground life detection technology is an effective solution for searching for human targets in extreme environments and enhances SAR efficiency. Owing to their simple operation, flexible movement and absence of casualty risk [5,6,7], unmanned aerial vehicles (UAVs) carrying sensors such as high-definition cameras [8,9], thermal imaging cameras [10,11] and multi-spectral cameras [12] can quickly reach a disaster site to search for and locate injured people. However, these sensors rely on optical and morphological features and can hardly acquire physiological information. They therefore cannot infer the life state of victims or provide a reliable basis for formulating a rescue plan and allocating resources. Remote acquisition of injured people's vital signals is thus of great significance for SAR missions.
To this end, a mission-chain-driven UAV swarm was first proposed in our previous study [13]. First, UAVs equipped with high-definition cameras searched for human targets and obtained their locations. Then, a bio-radar carried by another UAV was released at each location to detect the respiratory signal of the human target. However, this throwing method may damage the bio-radar and reduce efficiency, wasting SAR resources. Hence, in this paper, we use an airborne radar sensor to observe the vital signal, namely the weak displacement of the human chest caused by respiratory activity. This requires a platform with good temporal stability, but a new problem emerges: owing to wind and limited communication quality in practical applications, the UAV platform may be unstable even in hovering mode, which changes the location of the human target in the radar echo data. Moreover, the micro-motion signal of the human is submerged in the platform motion. Therefore, eliminating the interference of platform motion is essential for effective vital signal detection with airborne radar.
To the best of our knowledge, several studies have addressed the issue of UAV platform motion compensation. Islam et al. proposed a dual-radar system: the first radar points at the human target to sense respiratory motion, and the second radar points at a stationary wall (ceiling) to measure the platform motion [14]. The signals collected by the two radars are then used as the input signal and the reference signal of an adaptive filter, respectively, to eliminate platform motion interference, and the respiratory signal was successfully recovered. However, the experiment was carried out on an indoor mechanical vibration simulator and still needs to be verified on real UAVs, and the system requires a stationary wall (ceiling) in the environment, so its application scenarios are very limited. Rong et al. proposed a background residual method, in which the echo of the static background (the ground) is regarded as the platform motion signal and the vital signal is obtained from the phase residual between the echo of the human target and the echo of the static ground; the effectiveness of the method was verified on a real UAV platform. Nevertheless, during data processing it is difficult to automatically select the range bin where the static background is located, and when other interference is present in the ground range, that interference is introduced into the detection results [15]. Rohman et al. also used a UAV as a moving radar platform, but their study did not focus on cancelling platform motion interference [16]. In general, vital signal detection using airborne radar has been studied less and is still at the laboratory stage; the interference of an unstable UAV platform remains an enormous challenge.
In this paper, we built a UAV-mounted ultra-wideband (UWB) radar system that hovers at low altitude to collect data from a ground injured human. The problem of respiration detection can then be described as separating the source signals (respiration and platform motion) from the mixed observation signals of multiple range channels. To solve this problem, we propose a signal processing method based on independent component analysis (ICA) to separate the respiratory signal. It correctly separates the respiration of a ground human target from the noisy signals without any supervision or prior information about the platform motion, and it has strong noise tolerance and environmental adaptability.
The rest of this paper is organized as follows: Section 2 describes the UAV-mounted UWB radar system; Section 3 derives the respiratory signal model with platform motion interference; Section 4 introduces the proposed signal processing method; Section 5 presents and discusses the experimental results; Section 6 draws conclusions.

2. UAV-mounted UWB Radar System

In order to obtain the vital signal of a ground injured human target, this study builds a data collection system based on a UAV carrying a UWB radar sensor, as shown in Figure 1. It mainly includes four modules: (1) a quad-copter UAV as the carrying and moving platform; (2) a Xethru X4m200 UWB radar sensor (Novelda, Norway) for acquiring the vital signal of an injured human target, with the key radar parameters presented in Table 1; (3) a communication module for GPS location and radar data transmission, composed of an STM32 micro-controller and a LoRa chip; (4) a power module supplying power to the radar and the communication module.
The workflow in one mission execution of this system is shown in Figure 2. After receiving the GPS location of the suspected target from a search UAV based on high-definition camera [13], this system equipped with UWB radar flies to the position and hovers at a low altitude to collect vital signals and transmit these data to the ground station. Then, the ground station processes the data using the proposed respiratory signal extraction method in order to re-identify and distinguish the life state of the suspected target.

3. Signal Model

UWB radar detects the range of a human subject by transmitting pulses and observing the time delay of the echo signal. The time delay depends on the distance from the radar to the subject, which changes with respiration and heartbeat and can be expressed as [17]
d(t) = d_0 + A_r sin(2πf_r t) + A_h sin(2πf_h t)    (1)
where d_0 is the distance from the antenna to the chest, A_r and f_r are the amplitude and frequency of respiration, and A_h and f_h are the amplitude and frequency of the heartbeat.
For vital signal detection with a UAV-mounted UWB radar, the heartbeat component is much smaller than the respiration and the interference, so we only take the respiratory signal as the basis for distinguishing the life state of the wounded subject. Additionally, the random motion of the UAV platform introduces an additional interference component. The changing distance can then be expressed as
d(t) = d_0 + X(t) + d_UAV(t) = d_0 + A_r sin(2πf_r t) + d_UAV(t)    (2)
where X(t) is the respiratory signal and d_UAV(t) is the platform movement.
With a single human subject and stationary background environment, the radar system response is [18]
h(τ, t) = a_r δ(τ − τ_r(t)) + Σ_i a_i δ(τ − τ_i)    (3)
where a_r δ(τ − τ_r(t)) is the response of the human target and Σ_i a_i δ(τ − τ_i) is the response of other static clutter.
In the scenario of outdoor respiration detection of injured people using UAV-mounted radar, the static object is the ground. Owing to the platform movement, the static object no longer behaves as static; thus, the radar response can be further expressed as
h(τ, t) = a_r δ(τ − τ_r(t)) + a_g δ(τ − τ_g(t))    (4)
Time delay τ_r(t) is given by
τ_r(t) = 2d(t)/c    (5)
and time delay τ_g(t) is given by
τ_g(t) = 2d_1(t)/c    (6)
where c = 3 × 10^8 m/s is the speed of light and d_1(t) = d_0 + d_UAV(t) is the changing distance from the antenna to the ground. a_r δ(τ − τ_r(t)) is the response of the human target, and a_g δ(τ − τ_g(t)) is the response of the static ground, which appears as dynamic clutter changing with time in the radar echo signal and reflects the platform motion.
The received signal can be denoted by [19]
R(τ, t) = s(τ) ∗ h(τ, t) = a_r s(τ − τ_r(t)) + a_g s(τ − τ_g(t))    (7)
where s(τ) is the transmitted pulse. The two-dimensional raw echo data in discrete form are denoted by
R[m, n] = a_r s(mδ_T − τ_r(nT_s)) + a_g s(mδ_T − τ_g(nT_s)) = r[m, n] + g[m, n]    (8)
where m = 1, 2, …, M indexes the fast-time samples (range points) and n = 1, 2, …, N indexes the slow-time samples (time points). r[m, n] represents the respiratory signal corrupted by interference and g[m, n] represents the echo signal of the ground.
In real scenario detection, the received raw data can be expressed as
R[m, n] = r[m, n] + g[m, n] + z[m, n]    (9)
where z[m, n] is unavoidable noise interference.
Adopting a complex exponential to represent the transmitted signal, the echo signal that carries the respiration component in Equation (7) can then be expressed as [20]
r(t) = A_r e^{j(2πf_c t + θ(t) + φ_0)}    (10)
where f_c is the central operating frequency, φ_0 is the initial phase and θ(t) is the phase shift caused by human respiration and platform motion. The Doppler shift is defined as
f_d(t) = 2V(t)/λ = (2/λ)·d d(t)/dt = (1/2π)·dθ(t)/dt    (11)
where λ is the wavelength of the radar transmitted signal. The respiratory signal of a human subject with platform motion can then be approximately expressed as
θ(t) = (4π/λ)(X(t) + d_UAV(t)) = (4π/λ)X(t) + (4π/λ)d_UAV(t)    (12)
Equation (12) indicates that the platform movement hides the human respiratory signal of interest, destroys the periodicity of respiration and decreases the detectability of the human target, which is why traditional detection methods fail. Equations (7), (8) and (12) show that the echo signals of the multiple range channels from the human target to the ground are a linear mixture of the respiratory signal and the platform movement signal. Our task in this study is to accurately extract the respiration, i.e., the first term in Equation (12), from the mixed observed signals.
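To make the masking effect in Equation (12) concrete, the following minimal numerical sketch simulates a respiration-like displacement together with a random platform drift and compares their contributions to the phase. All parameter values (chest displacement amplitude, respiratory rate, drift magnitude) are illustrative assumptions, not measurements from this study.

```python
import numpy as np

# Minimal sketch of Equations (2) and (12): respiration plus random UAV drift.
# All parameter values below are illustrative assumptions.
fs = 17.0                                  # slow-time frame rate (Hz), as in Table 1
t = np.arange(0, 30, 1 / fs)               # 30 s observation window

wavelength = 3e8 / 7.29e9                  # centre frequency 7.29 GHz
A_r, f_r = 0.005, 0.34                     # ~5 mm chest motion at ~0.34 Hz (assumed)
respiration = A_r * np.sin(2 * np.pi * f_r * t)              # X(t)

rng = np.random.default_rng(0)
d_uav = np.cumsum(rng.normal(0.0, 0.002, t.size))            # d_UAV(t): random drift (m)

theta = 4 * np.pi / wavelength * (respiration + d_uav)       # phase of Equation (12)

# The platform term dominates the phase, which is why the raw slow-time
# signal shows no clear respiratory periodicity.
print("respiration phase swing (rad):", 4 * np.pi / wavelength * 2 * A_r)
print("platform phase std (rad):     ", np.std(4 * np.pi / wavelength * d_uav))
```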

4. Signal Processing

Major steps of the proposed signal processing approach are shown in Figure 3, which illustrates the process of human respiration extraction in the outdoor injured people scenario using UWB radar. The signal processing approach is divided into five blocks, including range migration compensation, observed signals extraction, pre-processing, ICA and respiratory signal extraction.

4.1. Range Migration Compensation

Due to GPS communication quality or other external environmental interference, the UAV platform inevitably generates relative motion even in hovering mode. In this study, platform movement smaller than the range resolution of the radar is defined as tiny platform motion, and platform movement larger than the range resolution is defined as great platform motion. Great platform motion causes range migration (illustrated in Figure 4a), meaning that at two different time points the target is located in different range bins of the radar echo data. In order to accurately select the range bin where the human target is located, it is necessary to correct the range migration caused by great platform motion.
To remove range migration, the cross correlation between the first frame and each subsequent frame of the raw two-dimensional matrix is calculated [21]. A difference between two frames means that the human subject is located in different range bins at these two time points, and the position of the maximum cross-correlation value indicates the lead or lag in range units. Each frame is then circularly shifted according to the maximum cross-correlation position to compensate for the range migration [22]. As shown in Figure 4b, after range alignment the target is located in the same range bin at every time point.
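A minimal sketch of this alignment step is given below, assuming the radar data are arranged as a slow-time by fast-time matrix; the use of SciPy's correlate function and a circular shift via np.roll are implementation choices of this sketch rather than details specified in the paper.

```python
import numpy as np
from scipy.signal import correlate

def align_range_profiles(R):
    """Range-align a (slow-time x fast-time) numpy matrix R by cross-correlating
    every frame with the first frame and circularly shifting it by the lag of
    the correlation peak (a sketch of the compensation step described above)."""
    ref = R[0]
    R_aligned = R.copy()
    for n in range(1, R.shape[0]):
        xc = correlate(R[n], ref, mode="full")
        lag = int(np.argmax(xc)) - (ref.size - 1)   # lead (+) or lag (-) in range bins
        R_aligned[n] = np.roll(R[n], -lag)          # shift the frame back onto the reference
    return R_aligned
```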

4.2. Observed Signals Extraction

According to Equation (8), the radar sensing data can be expressed as a two-dimensional matrix R; the aim of this step is to select the multichannel range bins [y_1(t), y_2(t), …, y_a(t)]^T in which the human target and the ground are located, where a is the number of extracted signals. Assuming there is only one subject in the detection zone and the subject keeps static, the range bin with the maximum energy is the location of the person. The multiple range bins are selected using a range sampler, which takes the maximum-energy range bin along slow time as the centre [23]. In this study, the range length of interest for the range sampler was set to 25 cm, so that the selected signals cover the range bins from the entire thorax of the human target to the ground. The observed signals selected by the range sampler are the input for ICA.
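The sketch below illustrates one way such a range sampler could be implemented, using the 0.0514 m range resolution from Table 1 and the 25 cm window stated above; the exact bin-selection details are assumptions of this sketch.

```python
import numpy as np

def range_sampler(R_aligned, range_res=0.0514, window_m=0.25):
    """Select the observed multichannel signals from a (slow-time x range)
    matrix: the maximum-energy range bin along slow time is the centre, and
    neighbouring bins covering `window_m` metres are returned
    (sketch only; edge handling is simplified)."""
    energy = np.sum(R_aligned ** 2, axis=0)          # energy of each range bin
    centre = int(np.argmax(energy))                  # assumed target range bin
    half = int(round(window_m / range_res / 2))      # bins on each side of the centre
    lo = max(centre - half, 0)
    hi = min(centre + half + 1, R_aligned.shape[1])
    return R_aligned[:, lo:hi].T                     # a x N observed signals y_1..y_a
```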

4.3. Pre-Processing

The extracted observed signals contain noise interference; in order to separate the clear and true respiration, pre-processing is necessary. The pre-processing step includes direct current (DC) removal, low-pass (LP) filtering and baseline removal [24].
The first step is to remove the DC component, and it can be implemented by
R_DC(m, a) = Raw(m, a) − (1/a) Σ_{n=1}^{a} Raw(m, n)    (13)
where R_DC(m, a) is the matrix after DC removal, m is the number of fast-time samples and a is the number of observed signals.
Then a Butterworth LP filter with a cut-off frequency of 3 Hz is applied to remove the noise caused by the echo of the UAV blades at about 3.5 Hz and other high-frequency noise interference. The LP filtering is given by
R_LP(m, a) = R_DC(m, a) ∗ h(t)    (14)
where R_LP(m, a) is the matrix after the 3 Hz LP filter.
The last step is to remove the baseline noise by applying the BEADS algorithm, which is described in [25]. The input signals for ICA are the observed signals after pre-processing.
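A compact sketch of the first two pre-processing steps is shown below; the filter order is an assumption of this example, and the BEADS baseline removal [25] is only indicated by a comment because it is not part of the standard scientific Python stack.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess(Y, fs=17.0, cutoff=3.0, order=4):
    """Pre-process the a x N observed signals: per-channel mean (DC) removal
    followed by a 3 Hz Butterworth low-pass filter to suppress the rotor
    blade echo (~3.5 Hz) and other high-frequency noise.
    Baseline removal with BEADS [25] would follow and is omitted here."""
    Y_dc = Y - Y.mean(axis=1, keepdims=True)              # remove per-channel mean
    b, a = butter(order, cutoff / (fs / 2), btype="low")  # 3 Hz Butterworth LP filter
    return filtfilt(b, a, Y_dc, axis=1)                   # zero-phase filtering
```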

4.4. Independent Component Analysis

4.4.1. ICA Compliance

ICA is a powerful method for blindly separating latent components and source signals from mixed observation signals, and it has been widely and successfully used in many fields, such as separating brain activity from artefacts, finding hidden factors in stock data and denoising natural images [26]. In the problem of vital signal detection using a UAV-mounted radar, the source signals are the respiration and the platform motion.
To apply ICA, three assumptions must hold: the components are independent of each other; the independent components have non-Gaussian distributions; and the number of mixed observed signals is greater than or equal to the number of independent components [27]. In this section, we show that the respiratory signal extraction problem in this study satisfies these three assumptions.
The corresponding continuous-time signal of the interference-corrupted respiratory component in Equation (9) is
r(t) = (4π/λ)X(t) + (4π/λ)d_UAV(t)    (15)
The corresponding continuous-time signal of the ground echo signal component in Equation (9) is
g(t) = (4π/λ)d_UAV(t)    (16)
From Equations (15) and (16), it follows that the observed signals are a linear combination of the respiration X(t) and the UAV platform motion d_UAV(t). Information on the value of X(t) gives no information on the value of d_UAV(t), and vice versa, so the two variables are independent [28]. Moreover, the respiratory signal is non-Gaussian, since in the signal model the respiration follows a sinusoidal signal. The motion of the UAV platform depends on many random factors, such as the quality of satellite communication and wind in the external environment, so it is usually a non-Gaussian signal; in fact, ICA remains achievable even if one of the independent components is Gaussian [28]. Finally, with the range sampler described in Section 4.2, the number of observed signals is greater than or equal to the number of independent components. Therefore, the problem in this study satisfies the three conditions for applying ICA.
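As a quick, purely illustrative check of the non-Gaussianity assumption, the excess kurtosis of a sinusoidal (respiration-like) signal is clearly non-zero, whereas a Gaussian sequence stays near zero; the signals below are synthetic examples, not experimental data.

```python
import numpy as np
from scipy.stats import kurtosis

t = np.arange(0, 30, 1 / 17.0)                       # 30 s at the radar frame rate
resp_like = np.sin(2 * np.pi * 0.34 * t)             # sinusoidal, respiration-like signal
gauss_like = np.random.default_rng(1).normal(size=t.size)

print("sinusoid excess kurtosis:", kurtosis(resp_like))    # about -1.5: non-Gaussian
print("Gaussian excess kurtosis:", kurtosis(gauss_like))   # close to 0
```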

4.4.2. Process of ICA

The aim of this step is to reduce the dimension of the data and separate the respiratory signal using blind source separation. In the ICA model, the observed signals are described as a linear combination of the source signals, namely the respiratory signal and the platform motion signal. Therefore, the observation of each extracted range bin can be written as [29]:
y_1(t) = a_11 s_1(t) + a_12 s_2(t) + … + a_1n s_n(t)
y_2(t) = a_21 s_1(t) + a_22 s_2(t) + … + a_2n s_n(t)
⋮
y_a(t) = a_a1 s_1(t) + a_a2 s_2(t) + … + a_an s_n(t)    (17)
where y_1(t), y_2(t), …, y_a(t) are the mixed observed signals, s_1(t), s_2(t), …, s_n(t) are the unknown source signals (i.e., the independent components) and a_ij is the weight of each source signal. The matrix representation of Equation (17) is [30]:
Y = AS + E    (18)
where A ∈ R^{a×n} is the mixing matrix, Y ∈ R^{a×m} is the matrix of the observed signals, S ∈ R^{n×m} is the independent component matrix and E is the residual matrix. These matrices can be expressed as:
A = [ a_11 a_12 … a_1n ; a_21 a_22 … a_2n ; ⋮ ; a_a1 a_a2 … a_an ],  Y = [ y_1(t), y_2(t), …, y_a(t) ]^T,  S = [ s_1(t), s_2(t), …, s_n(t) ]^T    (19)
Since the matrices A and S are unknown and only Y can be observed, we need to use Y to estimate A and S. The fundamental idea of ICA is to estimate the mixing matrix A so that the reconstructed components are as independent as possible. Denoting the inverse of A by W, the independent components can be calculated as
S = WY    (20)
There are several existing ICA algorithms for estimating the de-mixing matrix W. In this paper, we use FastICA for blind signal separation on account of its very fast convergence and ease of operation. FastICA is based on a fixed-point algorithm, and the first pre-processing step is to centre the input data, which simplifies the ICA algorithm; this is achieved by subtracting the mean vector [28]:
Ŷ = Y − E(Y)    (21)
The second pre-processing step is to whiten the centred data matrix Ŷ and obtain a new whitened matrix Ỹ, which eliminates the correlation among the observed signals, so that Ỹ satisfies E(ỸỸ^T) = I. Here, eigenvalue decomposition (EVD) is used to whiten the data; the covariance matrix of Ŷ is decomposed as
E(ŶŶ^T) = UΛU^T    (22)
where U is the eigenvector matrix and Λ is the diagonal matrix of eigenvalues. The detailed process is summarized in Algorithm 1, where the nonlinearity g(·) used in the algorithm is [31]
g(u) = u·exp(−u²/2)    (23)
Algorithm 1 FastICA
1: Input the observed signals.
2: Centre the data to give Ŷ.
3: Whiten the data to give Ỹ.
4: Choose the number of independent components m.
5: For p = 1 to m
6:  Initialize the weight vector w_p.
7:  w_p ← E{Ỹ g(w_p^T Ỹ)} − E{g′(w_p^T Ỹ)} w_p
8:  w_p ← w_p − Σ_{i=1}^{p−1} (w_p^T w_i) w_i
9:  w_p ← w_p / ‖w_p‖
10: If w_p has not converged, go back to step 7.
11: End for
12: W = [w_1, w_2, …, w_m]
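In practice, Algorithm 1 is available in standard libraries; the sketch below applies scikit-learn's FastICA implementation to the pre-processed observed signals instead of a hand-written loop, which is a convenience choice of this example. Setting two components (respiration and platform motion) is an assumption; in noisier scenes more components may be needed.

```python
import numpy as np
from sklearn.decomposition import FastICA

def separate_components(Y_pre, n_components=2, seed=0):
    """Blind source separation of the pre-processed observed signals
    (a x N matrix). Returns the independent components as rows."""
    ica = FastICA(n_components=n_components, random_state=seed)
    S = ica.fit_transform(Y_pre.T)    # scikit-learn expects samples in rows
    return S.T                        # n_components x N independent components
```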

4.5. Respiratory Signal Extraction

Auto-correlation describes the correlation strength of a signal at different time lags. It is defined as
R(τ) = ∫ r(t) r(t + τ) dt    (24)
where r(t) denotes the respiratory signal and τ denotes the time delay. For periodic signals, the auto-correlation coefficient also changes periodically [32]. The respiratory signal of a human is a regular periodic signal with good temporal correlation, whereas the other independent components (ICs), which describe the movement of the radar and the interference, are random signals. Therefore, the respiratory signal extraction method compares the auto-correlation of the ICs, and the wanted IC that represents the vital signal of the injured person can be effectively extracted, as sketched below.
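Scoring each component by its strongest autocorrelation peak beyond a minimum lag is one simple realisation of this comparison; the minimum-lag value is an assumption of this sketch.

```python
import numpy as np

def select_respiratory_ic(S, fs=17.0, min_lag_s=1.0):
    """Pick the independent component (row of S) whose normalized
    autocorrelation has the highest peak beyond `min_lag_s` seconds;
    periodic respiration scores higher than random platform motion."""
    min_lag = int(min_lag_s * fs)
    best_idx, best_score = 0, -np.inf
    for i, s in enumerate(S):
        s = (s - s.mean()) / (s.std() + 1e-12)
        ac = np.correlate(s, s, mode="full")[s.size - 1:]   # lags >= 0
        ac = ac / ac[0]                                     # normalize by the lag-0 value
        score = ac[min_lag:].max()                          # strongest non-trivial peak
        if score > best_score:
            best_idx, best_score = i, score
    return S[best_idx]
```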

5. Experiments

5.1. Experimental Setup

The setup of the UAV-mounted UWB radar system is shown in Figure 1. The Z410 quad-rotor UAV carries the X4m200 UWB radar for human vital signal detection and life state discrimination. In the 30-s detection process, the subject lies supine in the detection area and remains stationary all the time to simulate the injured human target, and the detection system hovers directly above the subject and remains at an altitude of 2 m. Meanwhile, an ErgoLAB wireless respiration belt is connected to the subject to collect the reference respiratory signal.
In this study, two typical scenarios (shown in Figure 5) were considered separately to evaluate the reliability and the noise tolerance of the proposed method for respiration detection. The first experiment was conducted in an outdoor scenario of relatively smooth ground as shown in Figure 5a with one female (subject 1: 168 cm, 55 kg) and two male (subject 2: 180 cm, 60 kg and subject 3: 175 cm, 70 kg) subjects. In scenario 1, the interference was mainly from UAV platform motion. The second experiment was conducted in an outdoor scenario of grassland as shown in Figure 5b with the same three subjects. Compared with scenario 1, scenario 2 adds the interference of grass movement, which is more complex and closer to the real SAR environment.

5.2. Results and Discussion

In this section, the results of the major signal processing steps, including observed signal extraction and respiration detection in scenario 1 and scenario 2, are presented, and the performance of the proposed method is compared with the background residual (BGR) method, an existing effective method [15]. Two indicators are defined to evaluate the accuracy of respiratory rate (RR) estimation and the noise tolerance of these methods: accuracy and signal-to-noise ratio (SNR). The accuracy of RR is defined as
accuracy = (1 − |f_estimated − f_reference| / f_reference) × 100%    (25)
Considering that we focus only on the RR, which usually lies within 0–2 Hz, the SNR within 0–2 Hz is defined as
SNR = P_R / P_n = |S(k_r)|² / [ (1/(N_k − 1)) ( Σ_{k=0}^{k_r−1} |S(k)|² + Σ_{k=k_r+1}^{N_k} |S(k)|² ) ]    (26)
where P_R and P_n are the respiration power and the noise power, N_k is the number of sampling points within 0–2 Hz, S(k) is the frequency component and k_r is the sample index of the respiratory signal.
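The two metrics can be computed as in the sketch below; taking the respiratory bin k_r as the spectral maximum within 0–2 Hz is an assumption of this example.

```python
import numpy as np

def rr_accuracy(f_est, f_ref):
    """Respiratory-rate accuracy (%) as defined in Equation (25)."""
    return (1.0 - abs(f_est - f_ref) / f_ref) * 100.0

def respiration_snr_db(x, fs=17.0, f_max=2.0):
    """SNR within 0-2 Hz (Equation (26)): power of the respiratory peak over
    the mean power of the remaining bins in the band. The peak bin k_r is
    taken as the band maximum, assuming respiration dominates the band."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    band = spec[freqs <= f_max]
    k_r = int(np.argmax(band))
    noise = (band.sum() - band[k_r]) / (band.size - 1)
    return 10.0 * np.log10(band[k_r] / noise)
```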

5.2.1. Observed Signals Extraction

Raw radar data are first aligned along the range points, and then the observed signal extraction step is performed. In order to display the effect of range migration compensation more intuitively, we selected radar data with a detection time of 5 min; the result is shown in Figure 6. Figure 6a shows the raw radar data, with the data collection system about 2 m from the subject, from which we can see that during the detection process the target appears in different range bins. Figure 6b shows the radar data after range migration compensation, in which the target remains in the same range bin. The result shows that, while retaining the complete signal of interest, this method can effectively eliminate the movement of the UAV platform across range bins and ensure the accuracy of the subsequent observed signal extraction.
The observed signals extracted by the range sampler from the range-aligned radar data are shown in Figure 7; they are mixed signals of chest movement and radar movement from multiple range channels. After pre-processing, the observed signals are taken as the input of ICA to blindly estimate the respiration of the human target.

5.2.2. Respiration Detection in Scenario 1

The first experiment was conducted with three subjects; the results of subject 1 are visualized in Figure 8 and Figure 9, and the results of all three subjects are listed in Table 2. In the experiment, the signal measured by the respiratory belt was considered the ground truth. Figure 8b and Figure 9b show the time domain and frequency domain of the reference respiration, from which the reference RR is found to be about 0.3418 Hz. Using the maximum energy method, the raw slow-time echo signal of the human target is obtained, as shown in Figure 8a, with the corresponding frequency spectrum in Figure 9a. No clear respiratory component can be observed because it is submerged by the interference from UAV motion; thus, the platform motion must be compensated to achieve accurate results.
After applying each step of the method described in Section 4, the clear respiratory signal can be automatically extracted from the mixed signal, as shown in Figure 8c. In its frequency spectrum, shown in Figure 9c, the maximum peak appears at 0.3652 Hz, which is the detected RR of the subject. The results of the background residual method are shown in Figure 8d and Figure 9d, and the detected RR is also 0.3652 Hz. Table 2 compares the metrics of the reference method and the proposed method. It shows that in a detection area with relatively smooth ground, both methods can separate the respiratory signal from the platform motion interference, but the proposed method performs better in terms of preserving details and SNR.

5.2.3. Respiration Detection in Scenario 2

The results of the second experiment, conducted in scenario 2, are shown in Figure 10, Figure 11 and Table 3. The reference RR and the RR extracted using our method are 0.4329 Hz and 0.4482 Hz, respectively. Figure 10d and Figure 11d show that the background residual method loses efficacy in a scenario with external environmental interference, since it relies on the assumption that the platform motion can be recovered from the signal returned by the static ground in the range profile. When there is grass movement caused by the UAV rotor downwash, the echo signal of the ground no longer represents the platform motion; instead, the grass motion is introduced into the detection results. This is why the background residual method fails in scenario 2. Our method estimates the respiratory signal based on the non-Gaussianity and independence of the source signals, and its results do not depend on the quality of the obtained platform motion signal, so it remains effective in scenarios with external environmental interference.
According to visual inspection and quantitative evaluation, the proposed method performs better in terms of preserving details and SNR, making it more suitable for practical application. In terms of operability, our method does not need to manually extract the range bin where the ground echo signal is located, so it can better meet the requirements of real-time respiration detection of post-disaster ground injured people.

6. Conclusions

To address the issue of re-identifying suspected targets and distinguishing the life state of injured people, a vital signal detection technology based on a UAV-mounted UWB radar was developed. According to the mixing characteristics of the source signals, the problem of respiration detection was described as separating the source signals (respiration and platform motion) from the observed signals. Then, a signal processing method based on ICA was proposed to extract respiration from the collected data. Real-world field experiments in two different scenarios confirm that the proposed method can effectively and accurately estimate respiration without any monitoring or prior information about the UAV platform motion, giving it stronger noise tolerance and environmental adaptability than the existing background residual method. Meanwhile, this method can be extended to other unstable radar-carrying platforms for vital signal detection, which is very important for formulating a rescue plan in an SAR mission.
However, compared with actual detection environments, the experimental scenarios in this study were not complex enough. In follow-up studies, we will consider more chaotic and complex conditions, such as detecting the vital signals of a buried survivor. In addition, in this paper we only took the respiratory signal as the vital signal of a human subject; our future work will focus on heartbeat signal extraction using this technology.

Author Contributions

Conceptualization, J.W. and G.L.; methodology, Y.J. and Z.L.; software, Y.C. and T.L.; investigation, F.Q. and J.X.; data curation, F.Y. and M.Z.; writing—original draft preparation, Y.J.; writing—review and editing, F.Q.; visualization, Y.C.; funding acquisition, G.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Key Research and Development Program of Shaanxi (2021ZDLGY09-07 and 2022SF-482). The APC was funded by 2021ZDLGY09-07.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Berawi, M.A.; Leviäkangas, P.; Siahaan, S.A.O.; Hafidza, A.; Sari, M.; Miraj, P.; Harwahyu, R.; Saroji, G. Increasing Disaster Victim Survival Rate: SaveMyLife Mobile Application Development. Int. J. Disaster Risk Reduct. 2021, 60, 102290. [Google Scholar] [CrossRef]
  2. Farahani, R.Z.; Lotfi, M.; Baghaian, A.; Ruiz, R.; Rezapour, S. Mass Casualty Management in Disaster Scene: A Systematic Review of OR&MS Research in Humanitarian Operations. Eur. J. Oper. Res. 2020, 287, 787–819. [Google Scholar]
  3. Qi, J.; Song, D.; Shang, H.; Wang, N.; Hua, C.; Wu, C.; Qi, X.; Han, J. Search and Rescue Rotary-wing UAV and its Application to the Lushan ms 7.0 Earthquake. J. Field Robot. 2016, 33, 290–321. [Google Scholar] [CrossRef]
  4. Tian, Y.; Liu, K.; Ok, K.; Tran, L.; Allen, D.; Roy, N.; How, J.P. Search and Rescue under the Forest Canopy Using Multiple UAVs. Int. J. Robot. Res. 2020, 39, 1201–1221. [Google Scholar] [CrossRef]
  5. Mohsan, S.A.H.; Khan, M.A.; Noor, F.; Ullah, I.; Alsharif, M.H. Towards the Unmanned Aerial Vehicles (UAVs): A Comprehensive Review. Drones 2022, 6, 147. [Google Scholar] [CrossRef]
  6. Kucharczyk, M.; Hugenholtz, C.H. Remote Sensing of Natural Hazard-related Disasters with Small Drones: Global Trends, Biases, and Research Opportunities. Remote Sens. Environ. 2021, 264, 112577. [Google Scholar] [CrossRef]
  7. Pensieri, M.G.; Garau, M.; Barone, P.M. Drones as an Integral Part of Remote Sensing Technologies to Help Missing People. Drones 2020, 4, 15. [Google Scholar] [CrossRef]
  8. Kundid Vasić, M.; Papić, V. Improving the Model for Person Detection in Aerial Image Sequences Using the Displacement Vector: A Search and Rescue Scenario. Drones 2022, 6, 19. [Google Scholar] [CrossRef]
  9. Xing, L.; Fan, X.; Dong, Y.; Xiong, Z.; Xing, L.; Yang, Y.; Bai, H.; Zhou, C. Multi-UAV Cooperative System for Search and Rescue based on YOLOv5. Int. J. Disaster Risk Reduct. 2022, 76, 102972. [Google Scholar] [CrossRef]
  10. Schedl, D.C.; Kurmi, I.; Bimber, O. An Autonomous Drone for Search and Rescue in Forests using Airborne Optical Sectioning. Sci. Robot. 2021, 6, 1188. [Google Scholar] [CrossRef]
  11. Jiang, C.; Ren, H.; Ye, X.; Zhu, J.; Zeng, H.; Nan, Y.; Sun, M.; Ren, X.; Huo, H. Object Detection from UAV Thermal Infrared Images and Videos using YOLO Models. Int. J. Appl. Earth Obs. Geoinf. 2022, 112, 102912. [Google Scholar] [CrossRef]
  12. Qi, F.; Zhu, M.; Li, Z.; Lei, T.; Xia, J.; Zhang, L.; Yan, Y.; Wang, J.; Lu, G. Automatic Air-to-Ground Recognition of Outdoor Injured Human Targets Based on UAV Bimodal Information: The Explore Study. Appl. Sci. 2022, 12, 3457. [Google Scholar] [CrossRef]
  13. Cao, Y.; Qi, F.; Jing, Y.; Zhu, M.; Lei, T.; Li, Z.; Xia, J.; Wang, J.; Lu, G. Mission Chain Driven Unmanned Aerial Vehicle Swarms Cooperation for the Search and Rescue of Outdoor Injured Human Targets. Drones 2022, 6, 138. [Google Scholar] [CrossRef]
  14. Islam, S.M.; Lubecke, L.C.; Grado, C.; Lubecke, V.M. An Adaptive Filter Technique for Platform Motion Compensation in Unmanned Aerial Vehicle based Remote Life Sensing Radar. In Proceedings of the 2020 50th European Microwave Conference (EuMC), Utrecht, The Netherlands, 12–14 January 2021; pp. 937–940. [Google Scholar]
  15. Rong, Y.; Herschfelt, A.; Holtom, J.; Bliss, D.W. Cardiac and Respiratory Sensing from a Hovering UAV Radar Platform. In Proceedings of the 2021 IEEE Statistical Signal Processing Workshop (SSP), Rio de Janeiro, Brazil, 11–14 July 2021; pp. 541–545. [Google Scholar]
  16. Rohman, B.P.; Andra, M.B.; Nishimoto, M. Through-the-wall Human Respiration Detection Using UWB Impulse Radar on Hovering Drone. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 6572–6584. [Google Scholar] [CrossRef]
  17. Wang, D.; Yoo, S.; Cho, S.H. Experimental comparison of IR-UWB radar and FMCW radar for vital signs. Sensors 2020, 20, 6695. [Google Scholar] [CrossRef]
  18. Liang, X.; Deng, J.; Zhang, H.; Gulliver, T.A. Ultra-wideband Impulse Radar Through-wall Detection of Vital Signs. Sci. Rep. 2018, 8, 1–21. [Google Scholar] [CrossRef] [PubMed]
  19. Liang, X.; Zhang, H.; Ye, S.; Fang, G.; Gulliver, T.A. Improved Denoising Method for Through-wall Vital Sign Detection Using UWB Impulse Radar. Digit. Signal Processing 2018, 74, 72–93. [Google Scholar] [CrossRef]
  20. Cardillo, E.; Li, C.; Caddemi, A. Vital Sign Detection and Radar Self-motion Cancellation Through Clutter Identification. IEEE Trans. Microw. Theory Tech. 2021, 69, 1932–1942. [Google Scholar] [CrossRef]
  21. Cardillo, E.; Li, C.; Caddemi, A. Empowering Blind People Mobility: A Millimeter-wave Radar Cane. In Proceedings of the 2020 IEEE International Workshop on Metrology for Industry 4.0 & IoT, Roma, Italy, 3–5 June 2020; pp. 213–217. [Google Scholar]
  22. Sharafi, A.; Baboli, M.; Eshghi, M.; Ahmadian, A. Respiration-rate Estimation of a Moving Target Using Impulse-based Ultra Wideband Radars. Australas. Phys. Eng. Sci. Med. 2012, 35, 31–39. [Google Scholar] [CrossRef]
  23. Xu, H.; Ebrahim, M.P.; Hasan, K.; Heydari, F.; Howley, P.; Yuce, M.R. Accurate Heart Rate and Respiration Rate Detection Based on a Higher-Order Harmonics Peak Selection Method Using Radar Non-Contact Sensors. Sensors 2021, 22, 83. [Google Scholar] [CrossRef]
  24. Ma, Y.; Wang, P.; Huang, W.; Qi, F.; Liang, F.; Lv, H.; Yu, X.; Wang, J.; Zhang, Y. A Robust Multi-feature based Method for Distinguishing between Humans and Pets to Ensure Signal Source in Vital Signs Monitoring Using UWB Radar. EURASIP J. Adv. Signal Processing 2021, 2021, 1–24. [Google Scholar] [CrossRef]
  25. Ning, X.; Selesnick, I.W.; Duval, L. Chromatogram Baseline Estimation and Denoising Using Sparsity (BEADS). Chemom. Intell. Lab. Syst. 2014, 139, 156–167. [Google Scholar] [CrossRef]
  26. Calhoun, V.D.; Liu, J.; Adalı, T. A Review of Group ICA for FMRI Data and ICA for Joint Inference of Imaging, Genetic, and ERP Data. Neuroimage 2009, 45, S163–S172. [Google Scholar] [CrossRef]
  27. Ren, W.; Qi, F.; Foroughian, F.; Kvelashvili, T.; Liu, Q.; Kilic, O.; Long, T.; Fathy, A.E. Vital Sign Detection in Any Orientation Using a Distributed Radar Network via Modified Independent Component Analysis. IEEE Trans. Microw. Theory Tech. 2021, 69, 4774–4790. [Google Scholar] [CrossRef]
  28. Hyvärinen, A.; Oja, E. Independent Component Analysis: Algorithms and Applications. Neural Netw. 2000, 13, 411–430. [Google Scholar] [CrossRef]
  29. Hiroe, A. Solution of Permutation Problem in Frequency Domain ICA, Using Multivariate Probability Density Functions. In Proceedings of the International Conference on Independent Component Analysis and Signal Separation, Charleston, SC, USA, 5–8 March 2006; pp. 601–608. [Google Scholar]
  30. Zhang, S.; Zhao, C. Hybrid Independent Component Analysis (H-ICA) with Simultaneous Analysis of High-order and Second-order Statistics for Industrial Process Monitoring. Chemom. Intell. Lab. Syst. 2019, 185, 47–58. [Google Scholar] [CrossRef]
  31. Oja, E.; Yuan, Z. The FastICA Algorithm Revisited: Convergence Analysis. IEEE Trans. Neural Netw. 2006, 6, 1370–1381. [Google Scholar] [CrossRef] [PubMed]
  32. Contin, A.; Pastore, S. Classification and Separation of Partial Discharge Signals by Means of Their Auto-correlation Function Evaluation. IEEE Trans. Dielectr. Electr. Insul. 2009, 16, 1609–1622. [Google Scholar] [CrossRef]
Figure 1. UAV-mounted UWB radar system for vital signal detection of ground injured human subject.
Figure 2. Workflow of the UAV-carried UWB radar system.
Figure 3. The block diagram of the radar signal processing.
Figure 4. The problem of range migration. (a) Radar echo data with range migration. (b) Data after range migration compensation.
Figure 5. Illustration of experimental settings for two scenarios. (a) Scenario 1 with smooth background, (b) scenario 2 with grassland background.
Figure 6. Range profiles of a static subject. (a) Radar data without range migration compensation. (b) Radar data with range migration compensation.
Figure 7. Observed signals extracted using the range sampler.
Figure 8. Results of subject 1 in scenario 1. (a) Raw radar echo signal of subject 1 obtained by maximum energy method. (b) Reference respiration from respiratory belt. (c) Respiration extracted using our proposed method. (d) Respiration extracted using background residual method.
Figure 9. Frequency spectrum of the signals in Figure 8. (a) FFT of raw radar signal. (b) FFT of respiration from respiratory belt. (c) FFT of respiration extracted by our proposed method. (d) FFT of respiration extracted using background residual method.
Figure 10. Results of subject 1 in scenario 2. (a) Raw radar echo signal of subject 1 obtained using maximum energy method. (b) Reference respiration from respiratory belt. (c) Respiration extracted using our proposed method. (d) Respiration extracted using background residual method.
Figure 11. Frequency spectrum of the signals in Figure 10. (a) FFT of raw radar signal. (b) FFT of respiration from respiratory belt. (c) FFT of respiration extracted using our proposed method. (d) FFT of respiration extracted using background residual method.
Table 1. X4m200 UWB radar parameters.
Parameters           Values
Centre frequency     7.29 GHz
Bandwidth            1.4 GHz
Detection range      0.4–5 m
Range resolution     0.0514 m
Frame rate           17 Hz
Table 2. Results of respiratory rate in scenario 1. The data length was 30 s for each experiment.
Scenario 1    RR (Hz)                               Accuracy (%)           SNR (dB)
              Reference   Our Method   BGR          Our Method   BGR       Our Method   BGR
Subject 1     0.3418      0.3652       0.3652       93.15        93.15     15.82        10.56
Subject 2     0.2032      0.2153       0.2210       94.05        91.24     16.18        11.49
Subject 3     0.2889      0.2833       0.2833       98.05        98.05     15.27        11.35
Table 3. Results of respiratory rate in scenario 2. The data length was 30 s for each experiment.
Scenario 2    RR (Hz)                               Accuracy (%)           SNR (dB)
              Reference   Our Method   BGR          Our Method   BGR       Our Method   BGR
Subject 1     0.4329      0.4482       0.1268       96.47        29.29     6.69         5.48
Subject 2     0.2930      0.2988       0.3682       98.02        74.33     7.38         5.95
Subject 3     0.2500      0.2656       0.4016       93.76        39.36     6.74         4.92
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
