Article

UAV Computing-Assisted Search and Rescue Mission Framework for Disaster and Harsh Environment Mitigation

1 Software Research Institute, Technological University of Shannon, N37HD68 Athlone, Ireland
2 Faculty of Engineering, IBB University, Ibb 70270, Yemen
3 Department of Operation of Road Transport and Car Service, North-Eastern Federal University, 677000 Yakutsk, Russia
4 Department of Transport and Technological Processes, Vladivostok State University of Economics and Service, 690014 Vladivostok, Russia
5 Computer Science and Engineering, IIIT Naya Raipur, Atal Nagar 493661, Chhattisgarh, India
6 Far Eastern Federal University (FEFU), 690922 Vladivostok, Russia
7 Department of Electrical Engineering, University of Tabuk, Tabuk 71491, Saudi Arabia
8 School of Computer and Technology, University of Science and Technology of China, Hefei 230026, China
9 Department of Electronics Engineering, Indian Institute of Technology (BHU), Varanasi 221005, Uttar Pradesh, India
10 Department of Electrical Engineering, Faculty of Engineering, University of Malaya, Kuala Lumpur 50603, Malaysia
11 Faculty of Biological & Physical Sciences, Tom Mboya University College, Homabay 40300, Kenya
* Author to whom correspondence should be addressed.
Submission received: 30 April 2022 / Revised: 9 June 2022 / Accepted: 14 June 2022 / Published: 22 June 2022
(This article belongs to the Special Issue Drone Computing Enabling IoE)

Abstract

Disasters are crisis circumstances that put human life in jeopardy. During disasters, public communication infrastructure is particularly damaged, obstructing Search And Rescue (SAR) efforts, and it takes significant time and effort to re-establish functioning communication infrastructure. SAR is a critical component of mitigating human and environmental risks in disasters and harsh environments. As a result, there is an urgent need to construct communication networks swiftly to help SAR efforts exchange emergency data. UAV technology has the potential to provide key solutions to mitigate such disaster situations. UAVs can be used to provide an adaptable and reliable emergency communication backbone and to resolve major issues in disasters for SAR operations. In this paper, we evaluate the network performance of UAV-assisted intelligent edge computing to expedite SAR missions and functionality, as this technology can be deployed within a short time and can help to rescue most people during a disaster. We have considered network parameters such as delay, throughput, and traffic sent and received, as well as path loss for the proposed network. It is also demonstrated that with the proposed parameter optimization, network performance improves significantly, eventually leading to far more efficient SAR missions in disasters and harsh environments.

1. Introduction

Over the last few decades, many countries have suffered considerable casualties and economic losses because of the high frequency of miscellaneous natural disasters. Natural disasters have recently caused enormous damage, resulting in crisis circumstances that have jeopardized people’s lives. One of the fundamental reasons for such huge losses is the lack of an efficient disaster relief network management system to support rapid emergency information diffusion. Disaster management, Search and Rescue (SAR) missions, and health monitoring are critical applications that require the localization of objects with high precision, sometimes promptly. During and after disasters, central management systems and infrastructure, including communication Base Stations (BSs), can be destroyed, causing insufficient availability of essential resources and the failure of communication links. It may take a few days to many weeks to re-establish a functioning communication backbone for telecommunications and internet services to restart. Therefore, it is essential to engage emerging and upcoming technologies to restore emergency communication networks. Furthermore, during natural disasters, SAR teams must maintain contact with each other, the control center, and the victims; victims must broadcast their whereabouts and receive rescue information in a closed-loop system. With controllable mobility, low cost, and strong Line-of-Sight (LoS) links, Unmanned Aerial Vehicles (UAVs) are envisioned as essential to emergency information diffusion by acting as flying BSs to serve terrestrial users that have lost connection to ground BSs. The employment of UAVs in SAR missions significantly reduces effort, financial costs, and time spent, while possibly saving human lives. To tackle this challenge, UAVs can operate as flying BSs to offer emergency wireless services in disaster-affected areas [1].
A collection of UAVs can be employed to identify events on city highways for emergency vehicle assistance, as depicted in [2]. UAVs provide valuable information to the rescue troops, speeding up the rescue operation. A robust routing mechanism ensures communication stability while sending the distress information. In [3], the authors introduced a novel UAV route planning framework for emergency message delivery and gathering, in which mobility and transmission power are optimized while visiting access sites to send and collect emergency alerts. Moreover, [4] proposed a UAV-assisted Wi-Fi network to aid people working at relief centers in gathering surveillance data. The planned network intends to speed up rescue efforts and give up-to-date disaster information.
The capacity of UAVs to carry a variety of equipment and remain airborne for lengthy periods explains their rapid expansion into civilian sectors other than surveillance and aerial photography. UAVs equipped with electro-optical sensors, real-time processing modules, and advanced communication systems represent a novel low-cost solution for enhancing government authorities’ and rescue agencies’ current capabilities in detecting and locating wounded and missing people during and after disasters. The authors of [5] presented a way of obtaining information about a devastated region following a natural disaster. The process entails taking photos in real-time, marking their position and altitude, and transferring the images with flight characteristics to a ground control station to create a three-dimensional danger map.
In disaster relief networks, only local information can be utilized in autonomous decision-making, which may degrade the performance attained by game-theoretic methods. Some existing works have resorted to graph theory to solve the dynamic resource allocation problem with tractable complexity. The authors of [6] showed how UAVs equipped with vision cameras might help with avalanche SAR operations. The process took advantage of Machine Learning (ML) techniques: the images of the avalanche debris taken by the UAV were first analyzed with a pre-trained Convolutional Neural Network (CNN) to extract discriminating characteristics. The authors of [7] described a human body detection method based on onboard sensors’ color and depth data. In addition, the authors presented a computational model for monitoring several people using size invariance and rotating the point of view around the target.
Furthermore, [8] proposed a completely autonomous rescue UAV with onboard real-time person detection capability. The researchers used Deep Learning (DL) algorithms to recognize open water swimmers. However, the main challenges of channel allocation in stochastic and dynamic environments are twofold: (1) only local knowledge can be utilized to make self-organizing decisions and (2) the convergence speed of the algorithm should adapt to the dynamic network topology. Given these two issues, we aim to study the integration of UAV computing to further improve SAR performance and convergence speed in dynamic networks in disaster areas by utilizing local information in decision-making.

1.1. Related Work

Recently, UAVs have been proven to be highly useful in a variety of applications, including smart agriculture [9], security, power line inspection, survey and mapping [10], surveillance [11], SAR [12,13,14], delivery [15], and disaster area coverage [16]. Moreover, UAVs play an interesting role in crop monitoring by enabling various activities that were previously available to on-site farmers only [9]. In the UAV-assisted SAR problem, an auction-based technique allocates the most suitable UAV to a survivor based on a bidding value derived from the distance between the UAV and the survivor. In a natural disaster, most survivors are typically situated around a central point, which is vital for our endeavor. As a result, SAR operations should be organized with a greater emphasis on the center, with decreasing priority as the distance from this point increases.
UAVs can be deployed quickly, and their coverage can be dynamically modified in a disaster emergency communications system. They offer network assistance for disaster site SAR quickly and effectively, and real-time information from the disaster scene may be sent back to the SAR to help better rescue individuals who have been impacted [17]. As a result, communication between the disaster relief command center and SAR is critical throughout the disaster relief process, and a disaster frequently destroys the commercial network in the disaster area. Furthermore, a UAV network can also be placed at the epicenter of a disaster site to monitor the disaster zone, allowing SAR to seek and track people or animals in distress.
AI plays a vital role in improving the performance of robots, including UAVs, in many applications [18,19,20]. The authors of [21] introduced the Air Learning open-source simulator environment for UAVs using Deep Reinforcement Learning (DRL) techniques. Federated Learning (FL) was used to improve UAV swarm computing scheduling and power allocation without exchanging sensitive raw data [22]. FL has been used to enable privacy preservation for internet-of-vehicles applications such as parking and traffic prediction [23]. The authors of [24] discussed a decentralized FL architecture over UAV networks. The authors of [25] presented a novel and high-performing FL technique for edge-aided UAV networks, which utilized edge servers in base stations as intermediate aggregators with generally shared data. For privacy and high-level security, the authors of [26] introduced blockchains for securing data sharing in B5G for UAV computing networks. Reference [27] highlighted blockchain-enabled FL in UAV edge computing networks. Furthermore, the authors of [28] discussed the combination of blockchain and FL for UAV edge intelligence in smart environments. Table 1 provides a summary of the most related work included in this paper.

1.2. Motivation

Due to various factors, SAR relies on human rescuers, which may not be sufficient in many circumstances. To begin with, the injured people are frequently not precisely located. The number of these people, their present medical condition, and the medical gear necessary for their treatment are unknown to the rescue crew. Covering a large impact area, on the other hand, takes more time for human rescuers, especially in areas with bad weather and uneven ground conditions. Poisonous gases, flames, radiation, high temperatures, biohazards, and other risks in the impacted region also pose a significant threat to rescuers.
UAVs are distinguished from conventional vehicles by their cost-effectiveness, ease of use and maintenance, affordability, market availability, and ease and speed of entry to disaster areas [30]. The operation of manned aerial vehicles demands highly skilled individuals, and their usage involves costly preventative and corrective maintenance to maintain the aerial vehicle and pilot’s safety. Furthermore, take-off and landing locations, typically located far from the SAR region, are subject to rigorous limitations. As a result, the employment of ground vehicles in SAR missions has been advocated [31]. However, the size, weight, and cost of batteries used to power ground vehicles contribute to the overall cost. In addition, ground vehicles have limited mobility and have trouble overcoming obstacles and obstructions. As a result, ground robots are not guaranteed to operate outside cities. They also necessitate strict real-time control by the operator, as a tiny delay in control command transmission causes the robot to wander, potentially causing damage to the robot or its surroundings. In addition, real-time video of the robot’s surroundings should be sent to the human operator, and this is a bottleneck in bandwidth and transmission power.
The employment of UAVs for medical missions has been proven to be effective. Many studies have introduced the use of UAVs to deliver medication to rural areas and transport medical products over long distances, such as blood derivatives and pharmaceuticals [30], microbiological specimens and biological samples [32], and defibrillators to out-of-hospital heart attack victims [33]. The employment of UAVs outfitted with electro-optical sensors and processing modules might assist in overcoming the present weaknesses of traditional man-dependent systems and those of UAVs and ground robots. Furthermore, UAVs can play a leading role in providing real-time, on-demand, and trustworthy information, which is crucial in rescue and search operations due to the speed of data collection and processing.
Human detection using AI approaches, on the other hand, has been the subject of multiple studies. In earlier work, the Support Vector Machine (SVM) classifier was utilized to obtain the highest performance by combining both feature descriptors. Reference [34] proposed using a CNN-based system to recognize and locate human positions inside surveillance camera material. In [35], the authors proposed using a combination of the intensity histogram and local binary pattern to distinguish between humans and animals. The proposed approach in [35] uses a deep CNN to construct a quick deep learning classification system.

1.3. Contributions

This paper presents numerous UAV computing paradigms to expedite solving the complex problems of SAR by providing dedicated missions, intelligent methods, and functionalities during a disaster event within a short span of time, as deployment times can be mission critical. Because UAV computing intelligence can capture high-resolution images and allow data scientists to assess an unprecedentedly critical situation, using UAVs with smart devices can help reduce precious search time. Furthermore, the available onboard computing process via AI can help UAVs take action in real-time and guide SAR effectively and efficiently. The contributions of this paper are:
1.
We introduce UAV computing to expedite SAR missions, and functionalities to mitigate the impacts of disasters;
2.
We present the proposed network framework, which includes UAV computing, SAR, and disaster centers;
3.
We evaluate the proposed framework networks’ performance based on delay, throughput, load, traffic sent and received, and path loss from UAVs to SAR at different distances.
The organization of the paper is shown in Figure 1. Section 2 provides a description of the UAV-enabled disaster management strategies. UAV computing is presented in Section 3. In Section 4, the proposed network architecture is illustrated. Section 5 describes the simulation experiment setup on OPNET 14.5. Section 6 presents the evaluation of the proposed network, and the results are discussed in Section 7. Finally, the conclusions are discussed in Section 8.

2. UAV-Enabled Disaster Management Strategies

UAVs can change the way disaster mitigation is planned and prepared for, and how disasters are responded to and rebuilt from. To successfully respond to disasters and design a practical disaster management and prediction technique, it is critical to understand the nature of disasters and suitable strategies, as shown in Figure 2. UAVs can be used for different purposes while planning disaster management strategies such as mitigation, preparation, response, and recovery. In recent decades, the idea of disaster strategies has been used to define, plan, and analyze disasters and their respective emergency operations [36]. Thus, the most typical scope of a disaster management strategy is the ongoing process of catastrophe management known as the disaster management cycle [36]. During a natural disaster, response time to disaster hazards is critical in preserving human lives, particularly the lives of those who live or work in the impacted regions.
UAVs can aid SAR in disasters by enhancing the procedure of data gathering from the disaster environment via integrated sensors connected to UAV computing. Individuals may be trapped alive in the catastrophe region, and UAV computing helps gather mission-critical data for more effective and well-timed SAR operations [37]. The collected data are processed and analyzed by the UAV computing itself using intelligent techniques for onward transmission. SAR can then be coordinated and guided efficiently and safely based on the data analysis. This work employs UAV computing to detect disasters and devise expedited recovery measures. Furthermore, the UAV computing process flow is designed to gather data and to simultaneously transmit the information.
During disasters, UAV computing plays a critical role in capturing data in real-time and reaching disaster areas, reducing economic losses and loss of life, both human and animal, by providing onboard processing, storage, and network connectivity for disaster mitigation, with minimum UAV energy consumption [38]. In addition, UAVs’ ability to fly closer to the disaster locations and surroundings with IoT devices and SAR makes them a critical disaster management choice. Yazid et al. [29] introduced UAVs to enable mobile computing with intelligence in several applications.
During a disaster, collaborative UAV computing can be used to fly closer to SAR teams to provide efficient connectivity, detect things in the disaster zone, gather accurate data from the disaster locations, compute over short distances, save energy, and send data to SAR and the cloud. On the other hand, UAV computing partnerships are built on service delivery for guiding SAR to catastrophe zones and locating tasks in the surroundings. Understanding of tasks and contexts is dependent on UAV computing and collaborative processing. In practice, actual performance depends on the efficiency of the processing tools available in the UAV for real-time computing and processing.
Artificial intelligence plays a vital role in improving computing in UAVs [39]. The UAV computing network collaborates to guide SAR and disaster centers using the collaboration concept for networked UAV computing intelligence collaboration, SAR, and disaster centers (nodes). The collaboration mode described involves UAV computing intelligence (e.g., UAV computing to UAV computing) or SAR to SAR, and vertical networking between UAV computing intelligence to SAR or to the disaster center. As a result, multi-UAV computing intelligence is networked and collaborated to allow the SAR team to reach disaster areas through a routine safety protocol [40]. For instance, UAV computing intelligence monitors SAR and communicates to detect disasters and identify the victims, alerting SAR using vertical networking between UAV computing intelligence and SAR.

3. UAV Computing

Each UAV is equipped, as a minimum design requirement, with a microcontroller, an AI-capable processor board, and wireless transceivers [40]. UAVs are fitted with AI-capable processor boards with controllers that process commands and allow them to be controlled remotely using a remote control if desired. In addition, UAVs are equipped with wireless transceivers that allow them to communicate with one another and with other objects on the ground, such as the center of the disaster and the command and control center. An ad-hoc UAV network is formed when two or more UAVs communicate to accomplish a single goal. The advancements in the ad-hoc network among UAVs meet the two primary criteria of UAV systems: (1) mobility and (2) an adaptable network topology.
The ad-hoc network allows packets of data to be transferred instantly (in real-time) across networked UAVs [40,41,42,43]. As a result, the shared packets of data are routed across the networked UAVs along a path determined by the routing protocol. According to [40], there are two basic classes of routing protocols: proactive routing protocols and reactive routing protocols. In proactive routing, the total available network bandwidth is reduced due to the large number of packets that are frequently transferred across the network to maintain connectivity. In reactive routing, by contrast, the shortest pathways must be discovered on demand as packets need to be transferred between UAVs. As a result, reactive routing takes longer than proactive routing, which knows pathways ahead of time and therefore has a shorter End-to-End (E2E) delay when transferring packets. Therefore, the trade-off between bandwidth and delay should be considered when selecting routing protocol hierarchies [44]. One sub-group of UAVs may be constructed just for long-range communication services to interface with external networks. In contrast, another sub-group of UAVs may be developed solely for sensing and monitoring [45], and may carry specialized sensors, actuators, and cameras. All nodes will exchange packets with UAVs and center units if necessary [44].

3.1. SAR Missions

The primary objective of SAR operations is to locate and rescue persons who may be lost or trapped with injuries due to a natural catastrophe. The first 72 h following a disaster are essential [36], since SAR efforts are at their peak and rescue personnel must safely locate and help survivors in disaster scenarios. As a result, SAR’s purpose is to assist in taking rapid steps to save lives.
Different technologies, including WSN, autonomous UAVs, satellite observations, data processing, and social networks, must be deployed at the same time to boost the effectiveness of SAR operations [36]. As a result, UAVs and fogs can extensively assist SAR operations since they can give quick and real-time data about the surrounding area. In addition, in catastrophe circumstances where continuous updates are required or if rescue crews cannot reach the target area readily and safely owing to debris or other barriers, UAVs and fog-connected WSNs are deployed.
UAVs are an excellent surveillance tool since they can fly over the intended location and communicate information to rescue personnel, such as photographs and video for a specified target. In this case, UAV-2-station, UAV-2-UAV, and UAV-2-fog networks can help maintain the connection of UAVs, stations, and fog nodes. However, the videos or photos captured by UAVs differ significantly from images gathered on the ground, necessitating the need for assisted intelligent edge computing [46]. Therefore, this factor should be considered while developing an image processing system for UAVs used in SAR missions. Figure 3 illustrates how UAV computing helps to monitor disaster areas, take action in real-time, and efficiently save lives.

3.2. AI for UAV and SAR

During and after disasters, search and rescue activities necessitate a great deal of work and money. Locating injured and missing people quickly helps rescuers and medical professionals optimize their efforts. This has the potential to save lives and money. Emergent AI demonstrates real-time detection based on CNN. UAVs can investigate collected series of photos and send back results in real-time by combining the high-performance detection and classification skills offered by emergent AI approaches with the exploration abilities of the UAVs. The advancement of AI-assisted UAVs allows for detecting injured and trapped people while flying, with the accurate and timely transmission of information to ground stations to guide rescuers and medical professionals to victims’ locations. The authors of [47] deliberated on how UAVs may enhance their processing capabilities by using emergent AI-based detectors. The proposed system can identify people in real-time, and the coordinates of such locations are communicated to the ground station. The authors of [48] implemented a pattern recognition approach to build a methodology for estimating UAV position. The correlation of SAR UAV and optical Quickbird satellite-segmented pictures was used to determine placement.
UAVs in natural disaster response are currently primitive at best, with a slew of barriers in the way. To build a framework that allows many UAVs to interact and adapt as mission conditions change, AI approaches are used. As a result, UAVs must be adaptable and capable of real-time environment sensing and online autonomous decision making, with AI providing superior decision-making. Furthermore, AI can increase SAR’s knowledge of natural hazards by analyzing vast amounts of data from various sources and allowing proactive rather than reactive disaster risk reduction activities. As a result, deploying AI to improve knowledge of all phases of disasters is a crucial goal, which may be accomplished by speeding up the development of SAR-safe algorithms. There are explainable AI techniques that provide futuristic UAV technology prospects that may be used to identify, analyze, and reduce disaster risks, and to provide better monitoring. To avoid, minimize, and manage all types of risks, there is a need to make informed and timely choices. The application of AI in the decision-making process has shown enormous potential.
Figure 4 depicts the artificial neural network model. An input layer of $n_i$ nodes, hidden layers of $n_m$ nodes, and an output layer of $Y$ nodes make up the ANN. We have $n_i = 7$ input nodes in this paper. Furthermore, there are several input parameters to be taken into consideration:
$\bar{x} = (x_1, x_2, \ldots, x_s)$
where $x_1, x_2, \ldots, x_s$ are the parameters that should be considered for the artificial neural network, and $Y$ is the artificial neural network output. An artificial neural network with two hidden layers of $a$ and $b$ neurons, respectively, can be expressed by:
$Y_{o,r} = Z_O\left( \sum_{k=1}^{a} w_{2k}\, Z_{2h}\left( \sum_{j=1}^{b} w_{1jk}\, Z_{1h}\left( \sum_{i=1}^{s} w^{in}_{ij}\, x_{ir} \right) \right) \right)$
where $x_{ir}$ represents the $i$th element of the $r$th input pattern, and $Z_{nh}$ is the activation function of the $n$th layer. The results of the testing step reveal whether or not the model is correct. If this is the case, it is ready to accept the cell information as input for the point of interest and the coordinates of the UAV computing for the required purpose.
The Mean Square Error ($MSE$) is commonly used to optimize the artificial neural network weights during training. It is given by:
$MSE = \frac{1}{D_{train}} \sum_{p=1}^{D_{train}} \left[ \left| d_m - Y_{o,r} \right| \right]^2$
The performance of the procedure is validated using appropriate error indicators and the test set. The test set is selected randomly from the whole data set and is unrelated to the training set. The Mean Absolute Error ($MAE$) and the Root MSE ($RMSE$) will be used as performance metrics. These indicators are defined by the following expressions:
$MAE = \frac{1}{D_{test}} \sum_{m=1}^{D_{test}} \left| d_m - Y_{o,r} \right|$
$RMSE = \sqrt{ \frac{1}{D_{test}} \sum_{m=1}^{D_{test}} \left[ d_m - Y_{o,r} \right]^2 }$
where $D_{test}$ is the number of test patterns, $d_m$ is the $m$th input pattern, and $Y_{o,m}$ is the artificial neural network output.
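To make the notation above concrete, the following minimal NumPy sketch implements the two-hidden-layer forward pass and the MSE, MAE, and RMSE metrics defined above. The sigmoid activation, the layer widths, and the random weights and targets are illustrative assumptions rather than the trained model used in this paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    # Activation Z_h assumed to be a sigmoid for illustration
    return 1.0 / (1.0 + np.exp(-z))

def ann_forward(x, w_in, w1, w2):
    """Two-hidden-layer forward pass following the expression above."""
    h1 = sigmoid(x @ w_in)   # first hidden layer (b neurons)
    h2 = sigmoid(h1 @ w1)    # second hidden layer (a neurons)
    return sigmoid(h2 @ w2)  # scalar output Y

# Illustrative dimensions: n_i = 7 inputs, b = 8 and a = 4 hidden neurons
n_i, b, a = 7, 8, 4
w_in = rng.normal(size=(n_i, b))
w1 = rng.normal(size=(b, a))
w2 = rng.normal(size=(a,))

X = rng.normal(size=(100, n_i))   # 100 example input patterns
d = rng.uniform(size=100)         # placeholder desired outputs
Y = np.array([ann_forward(x, w_in, w1, w2) for x in X])

mse = np.mean(np.abs(d - Y) ** 2)      # MSE definition above
mae = np.mean(np.abs(d - Y))           # MAE definition above
rmse = np.sqrt(np.mean((d - Y) ** 2))  # RMSE definition above
print(f"MSE={mse:.4f}  MAE={mae:.4f}  RMSE={rmse:.4f}")
```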

4. Proposed Network Architecture

The fundamental paradigm behind the UAV computing-based surveillance system for SAR is to employ UAV computing systems for fast scanning of the disaster regions with the help of smart devices (e.g., cameras, human-computer interfaces, and integrated computing modules) and a model installed on the drones. This system helps pinpoint precisely where fast and early assistance is needed to serve people in a localized area. Figure 5 depicts an example of automated surveillance and search operations. In Figure 5, after identifying the positions of individuals, their GPS locations may be given to the SAR team for a quick and efficient rescue. The recent success of explainable AI techniques for object identification and activity recognition has led us to use AI techniques in UAV computing surveillance. The use of an AI-leveraged system will provide fast identification and localization of outbreaks in forest areas or disaster areas, though the AI-enabled UAV surveillance system requires a large amount of data for training.
Furthermore, AI techniques employ the gathered data to train UAV models to perform various tasks, including object classification and localization of outbreaks in vulnerable areas. The integrated computing modules of image processing and computer vision enabled by AI techniques can automatically perform feature extraction and representation for accurate classification of different objects in futuristic applications of drone studies. Moreover, the deep multimodal system with CNN architectures needs large datasets for fast computations, which can be obtained from sensor units, and can perform better calculations and analysis of objects. Deep learning-based computing models may be more suited for image-based feature extraction than any neural network utilized for classification and localization.
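As a rough illustration of the kind of CNN-based module described here, the sketch below defines a small PyTorch classifier for aerial image patches. The architecture, input resolution, and two-class (person / no person) setup are assumptions for illustration, not the specific model evaluated in this work.

```python
import torch
import torch.nn as nn

class AerialDetectorCNN(nn.Module):
    """Minimal CNN for classifying aerial image patches (e.g., person / no person)."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(  # convolutional feature extractor
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)  # pooled features -> class scores

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.features(x).flatten(1)
        return self.classifier(feats)

# Example: one 128x128 RGB patch cropped from a UAV frame
model = AerialDetectorCNN()
patch = torch.randn(1, 3, 128, 128)
scores = model(patch)
print(scores.softmax(dim=1))  # class probabilities for the single patch
```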
UAVs, SAR, the disaster center unit, and the ad-hoc network are all part of the proposed network design, as shown in Figure 2. Our provided network architecture is appropriate for all stages of a crisis. UAV-assisted SAR network capabilities will play a critical role in disaster preparedness. Sensors onboard the UAV, for example, are utilized to gather physical data in a disaster region and transmit data to the center unit of disaster management. Images of the impacted region are captured using the camera. As a result, data collected by a UAV fused with SAR were evaluated in this study to aid future SAR operations.
Our suggested UAV-assisted SAR network delivers precise and suitable disaster management and SAR team guidance based on disaster event updates during a crisis. Then, UAV computing significantly enhances the information available to the SAR team and improves safety, as well as helping SAR personnel do their jobs successfully and efficiently. The ad-hoc network aids in the reconnection and delivery of previously disconnected communication services.
A UAV-connected SAR mission is more efficient in supplying deliveries during the recovery period. A UAV equipped with the data acquired from the smart devices significantly improves the performance of a SAR mission. Furthermore, the information gathered may be utilized to assess damage and determine the need for rebuilding. Based on data collected by UAV IoT sensors, the ad-hoc network provides mission critical input to a central disaster unit located in a safe place.
The framework relies mainly on the link between UAVs and the SAR team during a disaster event and recovery. An ad hoc network links heterogeneous nodes with UAVs and the catastrophe unit center for processing, as illustrated in Figure 5. UAVs and SAR team individuals in the impacted region and at the catastrophe center are examples of heterogeneous teams. The suggested cooperation tools help people plan for and recover from disasters. They meet numerous requirements, including real-time imaging, deployment, high-resolution picture output, energy efficiency, connection, and dependability.
UAV-2 is used to deliver power supplies and provide medication to areas that rescue teams cannot reach. The UAVs communicate with each other and SAR independently, while the disaster command and control center centralizes them. Figure 2 shows the framework of communication between IoT, SAR, UAV fog, and cloud. The smart wearable IoT device on the SAR member’s body is connected to the SAR smartphone, providing intra-SAR interaction, while communication between SAR members is achieved using inter-SAR interaction. The smartphone gathers sensing data from the IoT devices on the SAR member’s body in real-time and forwards the data to the UAV fog for processing, making decisions, updating data in the cloud, and guiding SAR locally.

5. Simulation Experiment Setup on OPNET

OPNET is used to model and simulate complicated networks. It comes with a user-friendly graphical user interface, a drag-and-drop module import feature, visual effects, and network devices to create a customized network. In OPNET, each experiment pipeline has five steps, i.e., creating the network scenario, selecting statistics, configuring the network, running the experiment, and analyzing and discussing the results, as shown in Figure 6.
We chose the OPNET modeler as the basis of the simulation platform for the UAV and SAR team communication network because of its performance efficiency, graphical user interface, model library, network visualization, documentation support, and external interfaces. On the OPNET simulation platform, we can develop UAV computing and SAR network simulation instances based on the study goals. Our proposed scenario is applied to the proposed network to mitigate the impacts of disasters and harsh environments. The overall flow of the simulation methodology is separated into three modeling domains, i.e., the processing domain, node domain, and network domain. Figure 7 illustrates the components and network structure of the proposed experiment.

5.1. Node Domain

In node domains, we define UAV computing and SAR network scenarios with the SAR node, UAVs node, and disaster center unit.
SAR node: The SAR node is designed as an OSI protocol stack server with one FDM wireless interface. The SAR node model is built as an 802.11 client that supports the entire OSI protocol stack. The application layer module can produce traffic flows, and the network layer module can perform routing protocols. SAR devices in the same region can connect directly or remotely via multi-hop forwarding using the 802.11 interfaces. These SAR devices can use UAVs covering the region as a relay.
Disaster center: This is the server that can communicate with UAV-2 over a long-distance wireless FDM connection thanks to the directional antenna’s high gain.
UAV nodes: The UAV node with interfaces is developed as a router. It can provide high gain and a long broadcasting distance between UAVs and disaster center units. The 802.11 interface allows UAVs to communicate with each other. The multiple 802.11 interfaces can function without interference with different BSS-ID configurations.

5.2. Network Domain

In the network domain, we must build the topology of the UAV network, which includes the number of UAVs, the kind of communication technology, attitude information (i.e., roll, pitch, and yaw), and geographic location information (i.e., altitude, latitude, and longitude). Then, we introduce a three-level UAV computing topology, i.e., the SAR network, the ad-hoc network, and the backbone network. The SAR network units gather data and share it with UAVs and disaster centers in the SAR network. In the ad-hoc network, UAVs are used to cover SAR and disaster areas. UAV-1 can provide services to the SAR and send information to UAV-2, as shown in Figure 3. Finally, UAVs can provide backbone relay capability in the backbone network. UAV-2 communicates with the low-altitude UAV-1, with one TDMA interface for FANET communication between the UAVs and one FDM interface for relay connection with a distant ground station.
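A minimal sketch of how this network-domain description might be encoded, assuming a simple Python data structure; the node identifiers, coordinates, altitudes, and interface lists are placeholder values, not the actual simulation configuration.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class UAVNode:
    """One UAV entry in the network-domain topology description."""
    node_id: str
    roll: float          # attitude information (degrees)
    pitch: float
    yaw: float
    latitude: float      # geographic location information
    longitude: float
    altitude_m: float
    interfaces: List[str] = field(default_factory=list)  # e.g., 802.11, TDMA, FDM

# Three-level topology: SAR network, ad-hoc UAV network, backbone relay
uav1 = UAVNode("UAV-1", 0.0, 2.0, 90.0, 17.45, 44.20, 100.0, ["802.11", "TDMA"])
uav2 = UAVNode("UAV-2", 0.0, 1.0, 90.0, 17.46, 44.21, 300.0, ["TDMA", "FDM"])
topology = {
    "sar_network": ["SAR-1", "SAR-2", "SAR-3", "SAR-4"],  # ground SAR nodes
    "adhoc_network": [uav1],                              # low-altitude coverage UAV
    "backbone_network": [uav2],                           # relay to the disaster center
}
print(topology["backbone_network"][0].interfaces)
```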

5.3. Processing Domain

Process models are primarily used to create communication protocols and algorithms that define the behavior of the node domain modules. Each node type has its own protocol configuration.
Physical: Important wireless model characteristics must be configured, such as data rate, bandwidth, modulation, base frequency, and antenna layout. Furthermore, a crucial aspect of configuration is the choice of propagation model, such as the free-space model, the Hata model, or the Longley–Rice model.
Networks: The hierarchical properties of the UAV network must be considered while selecting and configuring the network layer routing protocol. Routing protocols must maintain connection in a multi-level hierarchical topology. Distinct levels can employ different routing protocols to decrease routing overhead and better use radio channel resources.
Application: The application traffic of the UAV network is asymmetric in many contexts. Therefore, we must assess the traffic volume and fluctuation characteristics and configure traffic at the application layer of each node.
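As a compact illustration of the three processing-domain configuration groups described above, the dictionary below collects representative parameters; all values (data rate, frequency, propagation model, traffic profile) are placeholder assumptions, not the exact OPNET settings used in this work.

```python
# Representative processing-domain configuration, grouped as in Section 5.3.
# All values are illustrative placeholders, not the exact OPNET settings.
node_config = {
    "physical": {
        "data_rate_mbps": 11,               # assumed 802.11b-class link
        "bandwidth_mhz": 22,
        "modulation": "DSSS",
        "base_frequency_ghz": 2.4,
        "propagation_model": "free_space",  # alternatives: Hata, Longley-Rice
    },
    "network": {
        "routing_protocol": "AODV",         # reactive protocol used in the simulation
        "hierarchy_levels": ["SAR", "ad-hoc", "backbone"],
    },
    "application": {
        "traffic_profile": "asymmetric",    # uplink-heavy sensing/imagery traffic
        "packet_size_bytes": 1024,
        "packet_interarrival_s": 0.1,
    },
}
print(node_config["network"]["routing_protocol"])
```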

5.4. Complexity Analysis

Assume that the UAV computing takes $T_c$ to complete a task during SAR guidance in the disaster area, the UAV update time to transmit the captured data to SAR is $T_s$, and the aggregated model updates from UAV computing to the disaster center are transmitted in $T_d$. The overall communication time complexity of each task is therefore $\mathcal{O}(S_N T_s + U T_d)$. The corresponding complexity with a BS as the aggregator is $\mathcal{O}(S_N (T_s + T_d))$, since in our framework the UAV takes over the BS function between the SAR and the disaster center. Because the number of active SAR devices $S_N$ is orders of magnitude greater than the number of UAV computing nodes $U$, our network leads to a lower communication complexity than a BS acting as MEC.
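The sketch below compares the two expressions above numerically under the stated assumptions; the function names, SAR/UAV counts, and per-update times are illustrative values only.

```python
def uav_assisted_complexity(num_sar: int, num_uav: int, t_s: float, t_d: float) -> float:
    # O(S_N * T_s + U * T_d): SAR updates go to nearby UAVs; only aggregated
    # updates travel from the U UAVs to the disaster center
    return num_sar * t_s + num_uav * t_d

def bs_as_mec_complexity(num_sar: int, t_s: float, t_d: float) -> float:
    # O(S_N * (T_s + T_d)): every SAR update traverses both hops to the BS/MEC
    return num_sar * (t_s + t_d)

# Illustrative values: 40 active SAR devices, 2 UAVs, T_s = 5 ms, T_d = 20 ms
print(uav_assisted_complexity(40, 2, 0.005, 0.020))  # 0.24 s of aggregate time
print(bs_as_mec_complexity(40, 0.005, 0.020))        # 1.00 s of aggregate time
```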

6. Evaluation Metrics of the Network Performance

Several parameters are considered to evaluate how UAV computing intelligence assists SAR and to examine the performance of the proposed network, such as throughput, network load, delay, network traffic received and sent, and path loss. These parameters are considered for improving the system’s quality of service [49,50].

6.1. Data Traffic

The data traffic quantity carried by the network is referred to as the load [51]. It employs the most efficient network to alleviate congestion. However, because all available resources are overutilized and overworked, the network may experience severe congestion. The network load is computed by determining the ratio of data received to the maximum data fluctuating over the simulation period [51]. The average time for packets to transit across the collaborative network from sender to receiver is the E2E latency.

6.2. Network Load

Network load $N_l$ relates the amount of data received to the maximum, which varies depending on network demand [51]:
$N_l = \frac{Data_r}{(B_s - B_u)\,(Data_r - Data_s)}$
where $Data_r$ and $Data_s$ are the data received and sent, respectively, and $B_s$ and $B_u$ are the buffer size and the unavailable buffer, respectively. The above equation can then be expressed in integral form as:
$N_l = \frac{\int_{t_0}^{t} f_{Data_r}(t)\,dt}{\left( B_s - B_u(t_0) \right) \left( \int_{t_0}^{t} f_{Data_r}(t)\,dt - \int_{t_0}^{t} g_{Data_s}(t)\,dt \right)}$
where $f_{Data_r}(t)$ and $g_{Data_s}(t)$ represent the received and sent data rates at time $t$, respectively.
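Because the buffer bookkeeping behind the load expressions above is only partially specified here, the sketch below uses the simpler operational definition from Section 6.1 (the ratio of received data to the maximum carried over the simulation period); the traffic values are illustrative.

```python
def network_load(data_received_bits: float, max_data_bits: float) -> float:
    """Operational network load: ratio of data received to the maximum data
    the network could carry over the simulation period (Section 6.1)."""
    return data_received_bits / max_data_bits

# Illustrative values: 6 Mbit received in a window where the channel could carry 11 Mbit
print(f"Load = {network_load(6e6, 11e6):.2f}")  # ~0.55 of capacity
```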

6.3. Throughput

The quantity of successful transmissions over a communication channel in a given unit of time is known as throughput. Any network’s goal is higher throughput. The average throughput is assessed over the network under the proposed architecture. In order to verify the consistency of the suggested technique, a throughput variation graph obtained using OPNET is also given [52]. Throughput $T_h$ refers to the ratio of data received at the receiver node from the sender node during a given time period [53]:
$T_h = \frac{D_P \times P_s \times 8}{T_{SMP}}$
where $D_P$, $P_s$, and $T_{SMP}$ are the number of delivered packets, the packet size, and the total simulation period, respectively.
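A minimal sketch of the throughput computation defined above; the packet count, packet size, and simulation period are illustrative values.

```python
def throughput_bps(delivered_packets: int, packet_size_bytes: int, sim_period_s: float) -> float:
    """Throughput T_h = (D_P * P_s * 8) / T_SMP, in bits per second."""
    return delivered_packets * packet_size_bytes * 8 / sim_period_s

# Illustrative values: 50,000 delivered packets of 1024 bytes over a 1000 s simulation
print(f"{throughput_bps(50_000, 1024, 1000.0):,.0f} bit/s")  # 409,600 bit/s
```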

6.4. Delay

The time for a signal to travel across the network from the UAV to SAR is called delay. Delay is a critical metric for assessing a communication network’s performance. The concept intends to reduce the travel time between the UAV and SAR by allowing direct contact between the two. The processing, queuing, and transmission delays of a network link are all included in E2E delays. The E2E delay $D_{E2E}$ depends on the processing delay $D_{processing}$, transmission delay $D_{transmission}$, and propagation delay $D_{propagation}$, as well as the number of nodes $N$:
$D_{E2E} = N \times \left( D_{processing} + D_{transmission} + D_{propagation} \right)$
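A minimal sketch of the E2E delay computation above; the per-hop delay components and the hop count are illustrative assumptions.

```python
def e2e_delay_s(num_nodes: int, d_proc: float, d_trans: float, d_prop: float) -> float:
    """End-to-end delay D_E2E = N * (D_processing + D_transmission + D_propagation)."""
    return num_nodes * (d_proc + d_trans + d_prop)

# Illustrative per-hop values: 1 ms processing, 2 ms transmission, 0.5 ms propagation
print(f"{e2e_delay_s(4, 0.001, 0.002, 0.0005) * 1e3:.1f} ms over 4 hops")  # 14.0 ms
```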

7. Results and Discussion

In our simulation, we employed the OPNET 14.5 simulator as the simulation environment with the AODV routing protocol. The simulation time was 1000 s, the simulation speed was 128, the update interval was 500,000 events, and the number of network nodes was seven. Because of the autonomous nature of MANET technology, we recommend connecting the SAR and UAV computing intelligence to accomplish disaster management effectively and efficiently in a short time. One UAV travels and moves closer to the activities.
Figure 8, Figure 9, Figure 10, Figure 11 and Figure 12 illustrate the performance of the UAV computing and SAR network in terms of network load, traffic data between sender and receiver, throughput, and E2E delay, respectively. In the figures, the x-axis refers to each node in the network, while the y-axis indicates the specific network load, traffic sent and received, throughput, or E2E delay. The simulation scenario was created in OPNET with an area of 1 × 1 km [52], with two UAVs, four SAR nodes, and a disaster center unit (i.e., a server). Table 2 lists the remaining parameters and the network configuration with their respective values. From our proposed scenario results, we observed that the desired delay, throughput, traffic sent and received, path loss, and network load are achieved by the proposed method, which significantly improves network performance.
This UAV acquires photographs of the occurrence and shares them with the SAR team in the disaster area, allowing the team to take the required action based on the information provided. Drone-2 delivers supplies to rescuers and individuals in a specific crisis region. SAR-1, SAR-2, and SAR-3 are assisted by Drone-1 and Drone-2.
As demonstrated in Figure 8, Figure 9, Figure 10, Figure 11 and Figure 12, the network’s performance is evaluated using a variety of measures, including delay, load, routing traffic transmitted and received, and throughput. All figures depict performance indicators for UAVs and responders during the catastrophe. In addition, the routing traffic transmitted and received for each device and UAV is measured, as illustrated in Figure 8 and Figure 9. The efficiency of transmitting and receiving traffic between collaborative UAVs and responders is shown in both figures.
In Figure 10, the network throughput is observed to depend on the UAVs. With UAVs in the coverage area of 1 × 1 km, there is a significant increase in throughput, improving network performance, which agrees with previous work [54].
The throughput refers to the total amount of data traffic that is successfully received and sent to the UAVs in bits per second. Increased throughput is achieved by increasing the packet size across all nodes. As a result, as seen in Figure 12, changing packet size results in various throughputs. When the number of nodes rises, however, the throughput increases.
As demonstrated in Figure 12, the time delay is impacted by the rising number of SAR nodes and decreases with the rise in packet size across all node counts. Each node has a distinct route and delay depending on the distance between the UAV and that particular node. In terms of minimal delay, SAR-2 has the shortest delay, while SAR-3 has the longest. However, due to the line of sight, there is no delay for Drone-2 and Drone-1.
Path loss propagation is another factor that requires careful treatment in the performance of UAVs and SAR. If the distance between the SAR and UAV is larger than the UAV coverage, then the interference caused by the SAR to the UAV is negligible because of the higher path loss and lower transmitting power of the SAR. For example, it can be seen from Figure 13 that the path loss increases with the different locations of Drone-1 and Drone-2. Therefore, Drone-1 at the 100 m altitude experiences a path loss from 44 dB to 69 dB when the SAR distance varies from 100 m to 1000 m. On the other hand, Drone-2 at the 75 m altitude experiences a lower path loss from 36 dB to 62.5 dB, which varies with the distance from SAR, as shown in Figure 13. As a result, the UAV altitude has a dual effect on SAR distances.
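The propagation model behind Figure 13 is not specified in this section; as a hedged illustration, the sketch below uses a free-space model over the 3-D UAV-to-SAR slant range at an assumed 2.4 GHz carrier. It reproduces the qualitative trend (path loss grows with horizontal distance and altitude), though the absolute values differ from those reported above.

```python
import math

def free_space_path_loss_db(ground_distance_m: float, altitude_m: float,
                            freq_hz: float = 2.4e9) -> float:
    """Free-space path loss over the 3-D UAV-to-SAR link:
    FSPL(dB) = 20*log10(d) + 20*log10(f) - 147.55."""
    d = math.hypot(ground_distance_m, altitude_m)  # slant range UAV -> SAR
    return 20 * math.log10(d) + 20 * math.log10(freq_hz) - 147.55

# Path loss grows with horizontal SAR distance for a UAV at an assumed 100 m altitude
for dist in (100, 500, 1000):
    print(dist, round(free_space_path_loss_db(dist, altitude_m=100.0), 1))
```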

8. Conclusions

There are four primary disaster management strategies: mitigation, preparedness, response, and recovery. UAVs are most commonly utilized in the response strategy. UAVs have emerged as one of the most promising and effective new technologies for disaster-assistance SAR. UAV computing offers several advantages, including being simple to deploy, quick, safe, efficient, and effective, making it ideal for disaster recovery. Consequently, during a disaster, UAV computing intelligence assists SAR by providing SAR with better situational awareness, detecting survivors, operating computer modeling of the disaster, bringing required equipment and supplies, and minimizing loss of life, among many other applications. The findings showed that using UAV computing for assisted SAR missions in the disaster area can significantly improve situational awareness and help to locate the areas with the greatest risk based on captured images and computed information. We found that the network performance was efficient by observing the overall parametric evaluation of the effective throughput, delay, load, traffic sent and received, and path loss with different SAR distances. The path loss increases with the SAR distance. UAVs might utilize federated learning techniques to evaluate captured data locally instead of sending it to a disaster center. The transmission and processing of the pictures to the UAV fog might be included in future research. Furthermore, resource allocation and energy consumption are the main challenges in UAV computing networks, which need to be addressed in the future. Tethered UAV computing may solve the UAV computing battery-life problem, but it may not be suitable for disaster management. Moreover, trajectory planning is one of the major UAV computing challenges in disasters and harsh environments that needs to be solved to avoid obstacles.

Author Contributions

Conceptualization, S.H.A., A.V.S., N.S.R. and A.H.; methodology, S.H.A., A.V.S. and S.V.S.; software, S.H.A., A.S., M.A.A. and S.K.; validation, S.H.A. and A.V.S.; writing—original draft preparation, S.H.A., A.V.S., A.H., A.S. and S.K.; writing—review and editing, N.S.R., S.H.A., S.S., S.K., V.O.N. and M.A.A.; visualization, S.H.A., S.V.S., V.O.N., M.A.A., S.S., A.S. and N.S.R.; supervision, A.H., S.K. and N.S.R.; project administration, A.V.S., S.V.S., M.A.A. and S.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

We acknowledge the support from the Science Foundation Ireland (SFI) under Grant Number SFI/16/RC/3918 (Confirm) and the Marie Skłodowska-Curie grant agreement No. 847577 co-funded by the European Regional Development Fund. This work was supported in part by the NCC Laboratory, Department of Electronics Engineering, IIT (BHU), India, under Grant IS/ST/EC-13-14/02 and the I-DAPT HUB Foundation, IIT (BHU), India, under Grant R&D/SA/I-DAPT IIT(BHU)/ECE/21-22/02/290.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Hayat, S.; Yanmaz, E.; Muzaffar, R. Survey on unmanned aerial vehicle networks for civil applications: A communications viewpoint. IEEE Commun. Surv. Tutor. 2016, 18, 2624–2661. [Google Scholar] [CrossRef]
  2. Oubbati, O.S.; Lakas, A.; Lorenz, P.; Atiquzzaman, M.; Jamalipour, A. Leveraging communicating UAVs for emergency vehicle guidance in urban areas. IEEE Trans. Emerg. Top. Comput. 2019, 9, 1070–1082. [Google Scholar] [CrossRef]
  3. Huang, Z.; Chen, C.; Pan, M. Multiobjective UAV path planning for emergency information collection and transmission. IEEE Internet Things J. 2020, 7, 6993–7009. [Google Scholar] [CrossRef]
  4. Panda, K.G.; Das, S.; Sen, D.; Arif, W. Design and deployment of UAV-aided post-disaster emergency network. IEEE Access 2019, 7, 102985–102999. [Google Scholar] [CrossRef]
  5. Suzuki, T.; Meguro, J.; Amano, Y.; Hashizume, T.; Hirokawa, R.; Tatsumi, K.; Sato, K.; Takiguchi, J.-i. Information collecting system based on aerial images obtained by a small UAV for disaster prevention. In ICMIT 2007: Mechatronics, MEMS, and Smart Materials; SPIE: Bellingham, WA, USA, 2008; Volume 6794, pp. 538–543. [Google Scholar]
  6. Bejiga, M.B.; Zeggada, A.; Nouffidj, A.; Melgani, F. A convolutional neural network approach for assisting avalanche search and rescue operations with UAV imagery. Remote Sens. 2017, 9, 100. [Google Scholar] [CrossRef] [Green Version]
  7. Al-Kaff, A.; Gómez-Silva, M.J.; Moreno, F.M.; De La Escalera, A.; Armingol, J.M. An appearance-based tracking algorithm for aerial search and rescue purposes. Sensors 2019, 19, 652. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  8. Lygouras, E.; Santavas, N.; Taitzoglou, A.; Tarchanidis, K.; Mitropoulos, A.; Gasteratos, A. Unsupervised human detection with an embedded vision system on a fully autonomous UAV for search and rescue operations. Sensors 2019, 19, 3542. [Google Scholar] [CrossRef] [Green Version]
  9. Almalki, F.A.; Soufiene, B.O.; Alsamhi, S.H.; Sakli, H. A low-cost platform for environmental smart farming monitoring system based on IoT and UAVs. Sustainability 2021, 13, 5908. [Google Scholar] [CrossRef]
  10. Kaufmann, V.; Kellerer-Pirklbauer, A.; Seier, G. Conventional and UAV-Based Aerial Surveys for Long-Term Monitoring (1954–2020) of a Highly Active Rock Glacier in Austria. Front. Remote. Sens. 2021, 2, 732744. [Google Scholar] [CrossRef]
  11. Alsamhi, S.H.; Ma, O.; Ansari, M.S.; Almalki, F.A. Survey on collaborative smart drones and internet of things for improving smartness of smart cities. IEEE Access 2019, 7, 128125–128152. [Google Scholar] [CrossRef]
  12. Alsamhi, S.H.; Almalki, F.A.; Ma, O.; Ansari, M.S.; Angelides, M.C. Performance optimization of tethered balloon technology for public safety and emergency communications. Telecommun. Syst. 2020, 75, 235–244. [Google Scholar] [CrossRef]
  13. Alsamhi, S.H.; Almalki, F.A.; AL-Dois, H.; Shvetsov, A.V.; Ansari, M.S.; Hawbani, A.; Gupta, S.K.; Lee, B. Multi-Drone Edge Intelligence and SAR Smart Wearable Devices for Emergency Communication. Wirel. Commun. Mob. Comput. 2021, 2021, 6710074. [Google Scholar] [CrossRef]
  14. Alsamhi, S.H.; Ansari, M.S.; Rajput, N.S. Disaster coverage predication for the emerging tethered balloon technology: Capability for preparedness, detection, mitigation, and response. Disaster Med. Public Health Prep. 2018, 12, 222–231. [Google Scholar] [CrossRef] [PubMed]
  15. Alsamhi, S.H.; Lee, B.; Guizani, M.; Kumar, N.; Qiao, Y.; Liu, X. Blockchain for decentralized multi-drone to combat COVID-19 and future pandemics: Framework and proposed solutions. Trans. Emerg. Telecommun. Technol. 2021, 32, e4255. [Google Scholar] [CrossRef]
  16. Saif, A.; Dimyati, K.; Noordin, K.A.; Shah, N.S.M.; Alsamhi, S.H.; Abdullah, Q.; Farah, N. Distributed clustering for user devices under UAV coverage area during disaster recovery. In Proceedings of the 2021 IEEE International Conference in Power Engineering Application (ICPEA), Shah Alam, Malaysia, 8–9 March 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 143–148. [Google Scholar]
  17. Zhou, J.; Yang, J.; Lu, L. Research on multi-UAV networks in disaster emergency communication. In IOP Conference Series: Materials Science and Engineering; IOP Publishing: Bristol, UK, 2020; Volume 719. [Google Scholar]
  18. Alsamhi, S.H.; Ma, O.; Ansari, M. Predictive estimation of the optimal signal strength from unmanned aerial vehicle over internet of things using ANN. arXiv 2018, arXiv:1805.07614. [Google Scholar]
  19. Alsamhi, S.H.; Ma, O.; Ansari, M. Artificial intelligence-based techniques for emerging robotics communication: A survey and future perspectives. arXiv 2018, arXiv:1804.09671. [Google Scholar]
  20. Alsamhi, S.H.; Almalki, F.; Ma, O.; Ansari, M.S.; Lee, B. Predictive estimation of optimal signal strength from drones over IoT frameworks in smart cities. IEEE Trans. Mob. Comput. 2021. [Google Scholar] [CrossRef]
  21. Krishnan, S.; Boroujerdian, B.; Fu, W.; Faust, A.; Reddi, V.J. Air Learning: A deep reinforcement learning gym for autonomous aerial robot visual navigation. Mach. Learn. 2021, 110, 2501–2540. [Google Scholar] [CrossRef]
  22. Zeng, T.; Semiari, O.; Mozaffari, M.; Chen, M.; Saad, W.; Bennis, M. Federated learning in the sky: Joint power allocation and scheduling with UAV swarms. In Proceedings of the ICC 2020 IEEE International Conference on Communications (ICC), Dublin, Ireland, 7–11 June 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 1–6. [Google Scholar]
  23. Lim, W.Y.B.; Huang, J.; Xiong, Z.; Kang, J.; Niyato, D.; Hua, X.S.; Leung, C.; Miao, C. Towards federated learning in uav-enabled internet of vehicles: A multi-dimensional contract-matching approach. IEEE Trans. Intell. Transp. Syst. 2021, 22, 5140–5154. [Google Scholar] [CrossRef]
  24. Qu, Y.; Dai, H.; Zhuang, Y.; Chen, J.; Dong, C.; Wu, F.; Guo, S. Decentralized Federated Learning for UAV Networks: Architecture, Challenges, and Opportunities. IEEE Netw. 2021, 35, 156–162. [Google Scholar] [CrossRef]
  25. Tursunboev, J.; Kang, Y.S.; Huh, S.B.; Lim, D.W.; Kang, J.M.; Jung, H. Hierarchical Federated Learning for Edge-Aided Unmanned Aerial Vehicle Networks. Appl. Sci. 2022, 12, 670. [Google Scholar] [CrossRef]
  26. Kang, J.; Xiong, Z.; Niyato, D.; Xie, S.; Kim, D.I. Securing data sharing from the sky: Integrating blockchains into drones in 5G and beyond. IEEE Netw. 2021, 35, 78–85. [Google Scholar] [CrossRef]
  27. Zhu, C.; Zhu, X.; Ren, J.; Qin, T. Blockchain-Enabled Federated Learning for UAV Edge Computing Network: Issues and Solutions. IEEE Access 2022, 10, 56591–56610. [Google Scholar] [CrossRef]
  28. Alsamhi, S.H.; Almalki, F.A.; Afghah, F.; Hawbani, A.; Shvetsov, A.V.; Lee, B.; Song, H. Drones’ Edge Intelligence over Smart Environments in B5G: Blockchain and Federated Learning Synergy. IEEE Trans. Green Commun. Netw. 2021, 6, 295–312. [Google Scholar]
  29. Yazid, Y.; Ez-Zazi, I.; Guerrero-González, A.; El Oualkadi, A.; Arioua, M. UAV-Enabled Mobile Edge-Computing for IoT Based on AI: A Comprehensive Review. Drones 2021, 5, 148. [Google Scholar] [CrossRef]
  30. Thiels, C.A.; Aho, J.M.; Zietlow, S.P.; Jenkins, D.H. Use of unmanned aerial vehicles for medical product transport. Air Med J. 2015, 34, 104–108. [Google Scholar] [CrossRef]
  31. Liu, Y.; Nejat, G. Multirobot cooperative learning for semiautonomous control in urban search and rescue applications. J. Field Robot. 2016, 33, 512–536. [Google Scholar] [CrossRef]
  32. Amukele, T.K.; Hernandez, J.; Snozek, C.L.; Wyatt, R.G.; Douglas, M.; Amini, R.; Street, J. Drone transport of chemistry and hematology samples over long distances. Am. J. Clin. Pathol. 2017, 148, 427–435. [Google Scholar] [CrossRef] [Green Version]
  33. Claesson, A.; Bäckman, A.; Ringh, M.; Svensson, L.; Nordberg, P.; Djärv, T.; Hollenberg, J. Time to delivery of an automated external defibrillator using a drone for simulated out-of-hospital cardiac arrests vs emergency medical services. JAMA 2017, 317, 2332–2334. [Google Scholar] [CrossRef]
  34. Dinama, D.M.; A’yun, Q.; Syahroni, A.D.; Sulistijono, I.A.; Risnumawan, A. Human detection and tracking on surveillance video footage using convolutional neural networks. In Proceedings of the 2019 International Electronics Symposium (IES), Surabaya, Indonesia, 27–28 September 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 534–538. [Google Scholar]
  35. Yousif, H.; Yuan, J.; Kays, R.; He, Z. Fast human-animal detection from highly cluttered camera-trap images using joint background modeling and deep learning classification. In Proceedings of the 2017 IEEE international symposium on circuits and systems (ISCAS), Baltimore, MD, USA, 28–31 May 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 1–4. [Google Scholar]
  36. Erdelj, M.; Król, M.; Natalizio, E. Wireless sensor networks and multi-UAV systems for natural disaster management. Comput. Networks 2017, 124, 72–86. [Google Scholar] [CrossRef]
  37. Alsamhi, S.H.; Shvetsov, A.V.; Kumar, S.; Hassan, H.; Alhartomi, M.; Shvetsova, S.V.; Sahal, R.; Hawbani, A.; Curry, E. Computing in the Sky: A Survey on Intelligent Ubiquitous Computing for UAV-Assisted 6G Networks and Industry 4.0. Drones 2022, in press. [Google Scholar]
  38. Reina, D.; Camp, T.; Munjal, A.; Toral, S.; Tawfik, H. Evolutionary deployment and hill climbing-based movements of multi-UAV networks in disaster scenarios. In Applications of Big Data Analytics; Springer: Berlin/Heidelberg, Germany, 2018; pp. 63–95. [Google Scholar]
  39. Gopi, S.P.; Magarini, M.; Shvetsov, A.V. Machine Learning-Assisted Adaptive Modulation for Optimized Drone-User Communication in B5G. Drones 2021, 5, 128. [Google Scholar] [CrossRef]
  40. Grodi, R.; Rawat, D.B.; Bajracharya, C. Performance evaluation of unmanned aerial vehicle ad hoc networks. In SoutheastCon 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 1–4. [Google Scholar]
  41. Khaleefa, S.; Alsamhi, S.H.; Rajput, N. Tethered balloon technology for telecommunication, coverage and path loss. In Proceedings of the 2014 IEEE Students’ Conference on Electrical, Electronics and Computer Science, Bhopal, India, 1–2 March 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 1–4. [Google Scholar]
42. Alsamhi, S.; Rajput, N. HAP antenna radiation pattern for providing coverage and service characteristics. In Proceedings of the 2014 International Conference on Advances in Computing, Communications and Informatics (ICACCI), Delhi, India, 24–27 September 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 1434–1439. [Google Scholar]
  43. Alsamhi, S.; Rajput, N. Methodology for coexistence of high altitude platform ground stations and radio relay stations with reduced interference. Int. J. Sci. Eng. Res. 2012, 3, 1–7. [Google Scholar]
  44. Sánchez-García, J.; García-Campos, J.; Arzamendia, M.; Reina, D.G.; Toral, S.; Gregor, D. A survey on unmanned aerial and aquatic vehicle multi-hop networks: Wireless communications, evaluation tools and applications. Comput. Commun. 2018, 119, 43–65. [Google Scholar] [CrossRef]
  45. Al-Turjman, F.; Zahmatkesh, H.; Al-Oqily, I.; Daboul, R. Optimized unmanned aerial vehicles deployment for static and mobile targets’ monitoring. Comput. Commun. 2020, 149, 27–35. [Google Scholar] [CrossRef]
  46. Rudol, P.; Doherty, P. Human body detection and geolocalization for UAV search and rescue missions using color and thermal imagery. In Proceedings of the 2008 IEEE Aerospace Conference, Big Sky, MT, USA, 1–8 March 2008; IEEE: Piscataway, NJ, USA, 2008; pp. 1–8. [Google Scholar]
  47. Rizk, M.; Slim, F.; Charara, J. Toward AI-Assisted UAV for Human Detection in Search and Rescue Missions. In Proceedings of the 2021 International Conference on Decision Aid Sciences and Application (DASA), Virtual, 7–8 December 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 781–786. [Google Scholar]
  48. da Silva, W.; Vijaykumar, N.L.; Sandri, S.A.; de Campos Velho, H.F.; Sjanic, Z.; Shiguemori, E.H.; Saotome, O. Image Edge Extraction by Artificial Intelligence Schemes for UAV Autonomous Navigation. Proceeding Ser. Braz. Soc. Comput. Appl. Math. 2020, 7, 1–7. [Google Scholar]
  49. Gupta, A.; Sundhan, S.; Gupta, S.K.; Rashid, M. Collaboration of UAV and HetNet for better QoS: A comparative study. Int. J. Veh. Inf. Commun. Syst. 2020, 5, 309–333. [Google Scholar]
  50. Alsamhi, S.; Almalki, F.A.; Al-Dois, H.; Ben Othman, S.; Hassan, J.; Hawbani, A.; Sahal, R.; Lee, B.; Saleh, H. Machine learning for smart environments in B5G networks: Connectivity and QoS. Comput. Intell. Neurosci. 2021, 2021, 6805151. [Google Scholar] [CrossRef]
  51. Jain, S.; Fall, K.; Patra, R. Routing in a delay tolerant network. In Proceedings of the 2004 Conference on Applications, Technologies, Architectures, and Protocols for Computer Communications, Portland, OR, USA, 30 August–3 September 2004; pp. 145–158. [Google Scholar]
  52. Khan, M.A.; Qureshi, I.M.; Khanzada, F. A hybrid communication scheme for efficient and low-cost deployment of future flying ad-hoc network (FANET). Drones 2019, 3, 16. [Google Scholar] [CrossRef] [Green Version]
53. Nguyen, U.T.; Xiong, X. Rate-adaptive multicast in mobile ad-hoc networks. In Proceedings of the WiMob’2005, IEEE International Conference on Wireless and Mobile Computing, Networking and Communications, Montreal, QC, Canada, 22–24 August 2005; IEEE: Piscataway, NJ, USA, 2005; Volume 3, pp. 352–360. [Google Scholar]
  54. Tuli, E.A.; Golam, M.; Kim, D.S.; Lee, J.M. Performance Enhancement of Optimized Link State Routing Protocol by Parameter Configuration for UANET. Drones 2022, 6, 22. [Google Scholar] [CrossRef]
Figure 1. Paper structure.
Figure 2. Disaster management strategies.
Figure 3. UAV computing-assisted disaster area.
Figure 4. Artificial neural network model.
Figure 5. Drones computing intelligence for assisting SAR in disaster management.
Figure 6. Workflow of experiment.
Figure 7. Network components and architecture of the proposed experiment.
Figure 8. Network routing traffic sent.
Figure 9. Network routing traffic received from different nodes.
Figure 10. Network throughput of different nodes.
Figure 11. Network delay of different nodes.
Figure 12. Network load for different nodes.
Figure 13. Path loss at different SAR distances.
Table 1. Summary of current works.
Ref. | Highlight | [A] [B] [C] [D] [E]
[21] (2021) | Air learning for autonomous UAVs | ×××
[22] (2020) | FL for scheduling and power allocation cooperation of UAV swarms | ×××
[24] (2021) | Challenges and architecture of decentralized FL in UAV networks | ×××
[26] (2021) | Securing data sharing for UAV computing networks | ××××
[27] (2020) | Performance of FL for edge-assisted UAV networks with data sharing | ×××
[29] (2021) | UAV-enabled edge computing to support IoT devices | ××
[28] (2021) | Combination of FL and Blockchain for drone edge intelligence over smart environments | ××
Our work | Evaluation of the network performance of UAV computing to expedite SAR missions
[A] = AI in UAV, [B] = UAV computing network, [C] = SAR, [D] = Network performance, [E] = UAV computing for disasters and harsh environments.
Table 2. Simulation environment.
Items | Value
Simulator | OPNET 14.5
Dimension area | 1 × 1 km
Protocols | AODV
Packet size | 1024 bytes
Speed | 128
Node types | Mobile
No. SAR nodes | 4
No. UAV computing nodes | 2
No. disaster center nodes | 1
Update interval | 500,000 events
Data rate | 11 Mbps, 2 Mbps
Simulation time | 1000 s
Value per statistic | 100
Packet interval | Exponential (1)
No. of coordinator nodes | 7
Path loss | Hata model
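To give a concrete sense of how the Table 2 settings relate to the path-loss behaviour reported in Figure 13, the short Python sketch below collects the main simulation parameters into a configuration dictionary and evaluates the classical urban Hata formula over a few candidate SAR-to-UAV distances. This is an illustrative sketch only, not the OPNET model used in the experiments: the carrier frequency, antenna heights, and distance grid are assumed example values, since Table 2 does not specify them.

```python
import math

# Main simulation parameters as reported in Table 2 (OPNET 14.5 scenario).
SIM_CONFIG = {
    "dimension_area_km": (1, 1),
    "routing_protocol": "AODV",
    "packet_size_bytes": 1024,
    "num_sar_nodes": 4,
    "num_uav_computing_nodes": 2,
    "num_disaster_center_nodes": 1,
    "data_rates_mbps": (11, 2),
    "simulation_time_s": 1000,
    "path_loss_model": "Hata",
}


def hata_urban_path_loss_db(d_km: float, f_mhz: float = 900.0,
                            h_base_m: float = 50.0, h_mobile_m: float = 1.5) -> float:
    """Urban Hata path loss in dB for a small/medium city.

    d_km: link distance in km; f_mhz: carrier frequency in MHz;
    h_base_m / h_mobile_m: effective antenna heights in metres.
    The frequency and heights are assumed example values; Table 2 only
    states that the Hata model is used. Note that the classical Hata
    model is specified for d >= 1 km, so sub-kilometre distances here
    are an extrapolation for illustration.
    """
    # Mobile-antenna correction factor for a small/medium city.
    a_hm = ((1.1 * math.log10(f_mhz) - 0.7) * h_mobile_m
            - (1.56 * math.log10(f_mhz) - 0.8))
    return (69.55 + 26.16 * math.log10(f_mhz)
            - 13.82 * math.log10(h_base_m)
            - a_hm
            + (44.9 - 6.55 * math.log10(h_base_m)) * math.log10(d_km))


if __name__ == "__main__":
    # Path loss over SAR-to-UAV distances within the 1 x 1 km area (cf. Figure 13).
    for d in (0.1, 0.25, 0.5, 0.75, 1.0):
        print(f"d = {d:4.2f} km -> path loss ~ {hata_urban_path_loss_db(d):6.1f} dB")
```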
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
