Advances in SLAM and Data Fusion for UAVs/Drones

A special issue of Drones (ISSN 2504-446X).

Deadline for manuscript submissions: closed (31 December 2021) | Viewed by 37985

Special Issue Editors


Guest Editor
Department of Radiology, Cumming School of Medicine, University of Calgary, Calgary, AB T2N 1N4, Canada
Interests: photogrammetric computer vision; biomedical imaging; LiDAR; IMU; mobile robotics; simultaneous localization and mapping (SLAM); machine learning; sensor calibration; sensor fusion; numerical optimization

Guest Editor
Department of Geomatics Engineering, University of Calgary, 2500 University Dr. NW, Calgary, AB T2N 1N4, Canada
Interests: vision-guided unmanned aerial systems; integration and calibration of ranging and imaging technologies; deep learning

Guest Editor
Leica Geosystems Inc., Calgary, AB T2E 8Z9, Canada
Interests: SLAM; computer vision; inertial navigation systems; signal processing; point cloud processing; LiDAR; calibration

Special Issue Information

Dear Colleagues,

Unmanned aerial vehicles (UAVs) equipped with a variety of sensors can safely acquire real-time, high-resolution sensory data of the environment. These data can be applied in various fields, such as construction, agriculture, entertainment, and transportation. For many of these applications, the autonomy of the drone is of critical importance: UAVs need reliable localization, navigation, and exploration capabilities in order to make sense of complex environments by themselves, with little or no intervention from operators. At the same time, users of UAV technologies are inundated with overwhelming amounts of data (e.g., large volumes of imagery). Intelligent algorithms are therefore needed to manage this overflow by fusing and transforming disparate data into useful and concise information.

This Special Issue captures the state of the art and emerging solutions for the localization and navigation of UAVs, as well as the intelligent processing of data from the miniature sensors onboard.

You may choose our Joint Special Issue in Electronics.

Dr. Jacky C.K. Chow
Dr. Mozhdeh Shahbazi
Dr. Ajeesh Kurian
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Drones is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • unmanned aerial vehicles
  • mapping and navigation
  • computer vision, photogrammetry, and remote sensing
  • control systems
  • signal processing
  • GNSS, IMU, UWB, BLE, sonar, radar, LiDAR, and cameras
  • 2D/3D image and point cloud processing
  • image orientation
  • sensor/data fusion
  • machine learning and deep learning


Published Papers (7 papers)


Research

23 pages, 125145 KiB  
Article
A ROS Multi-Tier UAV Localization Module Based on GNSS, Inertial and Visual-Depth Data
by Angelos Antonopoulos, Michail G. Lagoudakis and Panagiotis Partsinevelos
Drones 2022, 6(6), 135; https://0-doi-org.brum.beds.ac.uk/10.3390/drones6060135 - 24 May 2022
Cited by 11 | Viewed by 4420
Abstract
Uncrewed aerial vehicles (UAVs) are continuously gaining popularity in a wide spectrum of applications, while their positioning and navigation most often rely on Global Navigation Satellite Systems (GNSS). However, numerous conditions and practices require UAV operation in GNSS-denied environments, including confined spaces, urban canyons, vegetated areas, and indoor places. For the purposes of this study, an integrated UAV navigation system was designed and implemented that utilizes GNSS, visual, depth, and inertial data to provide real-time localization. The implementation is built as a package for the Robot Operating System (ROS) environment to allow ease of integration into various systems. The system autonomously adjusts to the flight environment, providing spatial awareness to the aircraft and enabling navigation even in GNSS-denied environments. This integrated positional system provides the means to support fully autonomous navigation in mixed environments or under malfunctioning conditions. Experiments show the capability of the system to provide adequate results in open, confined, and mixed spaces.
(This article belongs to the Special Issue Advances in SLAM and Data Fusion for UAVs/Drones)
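The multi-tier idea above, falling back from GNSS to visual-depth data when satellite fixes degrade, can be sketched roughly as follows. This is a minimal illustration, not the paper's ROS implementation; all names, thresholds, and the inertial fallback tier are hypothetical assumptions.

```python
# Hypothetical sketch of a multi-tier localization selector: prefer GNSS
# when the fix is healthy, otherwise fall back to visual-depth odometry,
# and finally to inertial dead reckoning.

from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class GnssFix:
    position: Tuple[float, float, float]
    num_satellites: int
    hdop: float  # horizontal dilution of precision

def select_tier(gnss: Optional[GnssFix],
                visual_pose: Optional[Tuple[float, float, float]],
                imu_pose: Tuple[float, float, float],
                min_sats: int = 6, max_hdop: float = 2.0):
    """Return (source_name, position) from the best available tier."""
    if gnss and gnss.num_satellites >= min_sats and gnss.hdop <= max_hdop:
        return "gnss", gnss.position
    if visual_pose is not None:
        return "visual_depth", visual_pose
    return "inertial", imu_pose

# Example: GNSS denied (no fix), visual-depth odometry available.
source, pos = select_tier(None, (1.0, 2.0, 0.5), (1.1, 2.2, 0.4))
```

A real ROS package would subscribe to sensor topics and publish the selected pose; the selector here only shows the fallback logic.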

35 pages, 1287 KiB  
Article
Simultaneous Localization and Mapping (SLAM) and Data Fusion in Unmanned Aerial Vehicles: Recent Advances and Challenges
by Abhishek Gupta and Xavier Fernando
Drones 2022, 6(4), 85; https://0-doi-org.brum.beds.ac.uk/10.3390/drones6040085 - 28 Mar 2022
Cited by 53 | Viewed by 13961
Abstract
This article presents a survey of simultaneous localization and mapping (SLAM) and data fusion techniques for object detection and environmental scene perception in unmanned aerial vehicles (UAVs). We critically evaluate some current SLAM implementations in robotics and autonomous vehicles and their applicability and scalability to UAVs. SLAM is envisioned as a potential technique for object detection and scene perception to enable UAV navigation through continuous state estimation. In this article, we bridge the gap between SLAM and data fusion in UAVs while also comprehensively surveying related object detection techniques such as visual odometry and aerial photogrammetry. We begin with an introduction to applications where UAV localization is necessary, followed by an analysis of multimodal sensor data fusion to fuse the information gathered from different sensors mounted on UAVs. We then discuss SLAM techniques such as Kalman filters and extended Kalman filters to address scene perception, mapping, and localization in UAVs. The findings are summarized to relate current and emerging SLAM and data fusion approaches for UAV navigation, and some avenues for further research are discussed.
(This article belongs to the Special Issue Advances in SLAM and Data Fusion for UAVs/Drones)
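The Kalman-filter machinery the survey discusses rests on a two-step predict/update cycle. A minimal one-dimensional sketch of that cycle is shown below; the noise values are made up, and a full EKF for SLAM would carry a multidimensional state with linearized motion and measurement models instead of these scalars.

```python
# Minimal 1-D Kalman filter illustrating the predict/update cycle that
# EKF-based SLAM generalizes to full vehicle and map states.
# All noise values here are illustrative, not from the survey.

def kf_predict(x, p, q):
    """Predict step: propagate state x and variance p with process noise q."""
    return x, p + q  # static motion model for simplicity

def kf_update(x, p, z, r):
    """Update step: fuse measurement z (variance r) into the state."""
    k = p / (p + r)          # Kalman gain
    x = x + k * (z - x)      # corrected state estimate
    p = (1.0 - k) * p        # corrected variance (always shrinks)
    return x, p

# Fuse three noisy range measurements of a value near 1.0.
x, p = 0.0, 1.0
for z in [1.2, 0.9, 1.1]:
    x, p = kf_predict(x, p, q=0.01)
    x, p = kf_update(x, p, z, r=0.25)
```

After three updates the estimate converges toward the measurements and the variance drops well below its prior, which is the behavior the survey's state-estimation discussion relies on.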

16 pages, 4792 KiB  
Article
Assessment of Android Network Positioning as an Alternative Source of Navigation for Drone Operations
by Dong-Kyeong Lee, Filip Nedelkov and Dennis M. Akos
Drones 2022, 6(2), 35; https://doi.org/10.3390/drones6020035 - 23 Jan 2022
Cited by 3 | Viewed by 3406
Abstract
Applications of drones have increased significantly in the past decade for both indoor and outdoor operations. To assist autonomous drone navigation, numerous sensors are installed onboard the vehicles, including Global Navigation Satellite System (GNSS) chipsets, inertial sensors, barometers, lidar, radar, and vision sensors. The two sensors used most often by drone autopilot controllers for absolute positioning are the GNSS chipset and the barometer. Although these sensors provide accurate and reliable position information for most outdoor operations, their accuracy, availability, and integrity deteriorate for indoor applications and in the presence of radio frequency interference (RFI), such as GNSS spoofing and jamming. Alternatively, network-based locations can be derived from Wi-Fi and cellular transmissions. Although there have been many theoretical studies on network positioning, limited resources are available on the expected quantitative performance of these positioning methodologies. In this paper, the authors investigate both the horizontal and vertical accuracy of the Android network location engines under rural, suburban, and urban environments. The paper determines the horizontal location accuracy to be approximately 1637 m, 38 m, and 32 m in terms of 68% circular error probable (CEP) for rural, suburban, and urban environments, respectively, and the vertical accuracy to be 1.2 m and 4.6 m in terms of 68% CEP for suburban and urban environments, respectively. In addition, the availability and latency of the location engines are explored. Furthermore, the paper assesses the accuracy of the Android network location accuracy indicator for various drone operation environments. The assessed accuracies of the network locations provide a deeper insight into their potential for drone navigation.
(This article belongs to the Special Issue Advances in SLAM and Data Fusion for UAVs/Drones)
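The 68% circular error probable metric used above is the radius that contains 68% of the horizontal error magnitudes. A minimal sketch of computing it from sampled east/north errors follows; the data points are made up for illustration and are not from the paper.

```python
# Sketch of computing a 68% circular error probable (CEP) from a sample of
# horizontal position errors: the radius containing 68% of the error
# magnitudes. The sample data below are invented for illustration.

import math

def cep(errors_m, fraction=0.68):
    """Radius (m) within which `fraction` of the error magnitudes fall."""
    ranked = sorted(errors_m)
    idx = max(0, math.ceil(fraction * len(ranked)) - 1)
    return ranked[idx]

# Made-up east/north error components (m) between true and estimated position.
east_north = [(3.0, 4.0), (6.0, 8.0), (0.0, 5.0), (20.0, 21.0), (1.0, 1.0)]
magnitudes = [math.hypot(e, n) for e, n in east_north]
r68 = cep(magnitudes)  # radius bounding ~68% of the samples
```

With a large sample this sorted-quantile estimate is what a "68% CEP of 32 m" style figure summarizes.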

17 pages, 3326 KiB  
Article
DAGmap: Multi-Drone SLAM via a DAG-Based Distributed Ledger
by Seongjoon Park and Hwangnam Kim
Drones 2022, 6(2), 34; https://0-doi-org.brum.beds.ac.uk/10.3390/drones6020034 - 20 Jan 2022
Cited by 6 | Viewed by 3544
Abstract
Simultaneous localization and mapping (SLAM) in unmanned vehicles, such as drones, has great usability potential in versatile applications. When operating SLAM in multi-drone scenarios, collecting and sharing the map data and deriving converged maps are major issues, regarded as the bottleneck of the system. This paper presents a novel approach that utilizes the concepts of distributed ledger technology (DLT) to enable online map convergence of multiple drones without a centralized station. As DLT allows each agent to secure a collective database of valid transactions, DLT-powered SLAM can let each drone secure global 3D map data and utilize these data for navigation. However, block-based DLT, the so-called blockchain, may not fit multi-drone SLAM well due to its restricted data structure, discrete consensus, and high power consumption. Thus, we designed a multi-drone SLAM system, named DAGmap, that constructs a DAG-based map database and sifts out noisy 3D points based on the DLT philosophy. Considering the differences between currency transactions and data construction, we designed a new strategy for data organization, validation, and consensus under the philosophy of DAG-based DLT. We carried out a numerical analysis of the proposed system with an off-the-shelf camera and drones.
(This article belongs to the Special Issue Advances in SLAM and Data Fusion for UAVs/Drones)
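The core DAG-ledger idea, where each new entry references (approves) earlier entries instead of being chained into sequential blocks, can be sketched in a few lines. This toy is only an illustration of the data structure; DAGmap's actual validation, consensus, and point-sifting logic are not reproduced here.

```python
# Toy sketch of a DAG-based ledger for shared map data: each appended entry
# approves (references) earlier entries, so agents can add 3D-point batches
# concurrently without a central station or sequential blocks.
# Purely illustrative of the DAG idea, not the DAGmap implementation.

import hashlib
import json

class DagLedger:
    def __init__(self):
        self.entries = {}  # entry id -> entry dict

    def append(self, points, approves):
        """Add a batch of 3D points that approves earlier entry ids."""
        entry = {"points": points, "approves": sorted(approves)}
        blob = json.dumps(entry, sort_keys=True).encode()
        eid = hashlib.sha256(blob).hexdigest()[:16]  # content-derived id
        self.entries[eid] = entry
        return eid

# Three map-point batches forming a small DAG: c approves both a and b.
ledger = DagLedger()
a = ledger.append([(0.0, 0.0, 1.0)], approves=[])
b = ledger.append([(0.5, 0.0, 1.1)], approves=[a])
c = ledger.append([(0.5, 0.5, 1.2)], approves=[a, b])
```

Because approvals can fan out to multiple parents, many drones can append in parallel, which is the property the paper exploits to avoid the blockchain's sequential bottleneck.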

19 pages, 12317 KiB  
Article
A New Visual Inertial Simultaneous Localization and Mapping (SLAM) Algorithm Based on Point and Line Features
by Tong Zhang, Chunjiang Liu, Jiaqi Li, Minghui Pang and Mingang Wang
Drones 2022, 6(1), 23; https://0-doi-org.brum.beds.ac.uk/10.3390/drones6010023 - 13 Jan 2022
Cited by 12 | Viewed by 3680
Abstract
Traditional point-line-feature visual inertial simultaneous localization and mapping (SLAM) systems suffer from weak accuracy and cannot run in real time under indoor conditions of weak texture and changing illumination. This paper proposes a point-line visual inertial SLAM method for such environments. First, building on bilateral filtering, we apply Speeded-Up Robust Features (SURF) point feature extraction and the Fast Library for Approximate Nearest Neighbors (FLANN) algorithm to improve the robustness of point feature extraction. Second, we establish a minimum density threshold and a length-suppression parameter selection strategy for line features, and incorporate geometric-constraint line feature matching to improve the efficiency of line feature processing. The parameters and biases of the visual-inertial system are initialized using maximum a posteriori estimation. Finally, simulation experiments compare the proposed method with the traditional tightly coupled monocular visual-inertial odometry using point and line features (PL-VIO). The results demonstrate that the proposed method operates effectively in real time, with positioning accuracy 22% higher on average, and 40% higher in scenarios with illumination changes and blurred images.
(This article belongs to the Special Issue Advances in SLAM and Data Fusion for UAVs/Drones)
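The line-feature selection strategy described above amounts to discarding short, unstable segments and capping how many survive per image. A rough sketch of that kind of filter is given below; the thresholds and the preference for longer segments are assumptions for illustration, not the paper's actual parameter selection strategy.

```python
# Sketch of length suppression and density capping for line features:
# drop segments shorter than a length threshold, then keep at most a
# fixed number per image. Thresholds here are hypothetical.

import math

def filter_lines(segments, min_len=20.0, max_count=50):
    """segments: list of ((x1, y1), (x2, y2)) endpoints in pixels."""
    def length(seg):
        (x1, y1), (x2, y2) = seg
        return math.hypot(x2 - x1, y2 - y1)
    kept = [s for s in segments if length(s) >= min_len]
    kept.sort(key=length, reverse=True)  # prefer longer, more stable lines
    return kept[:max_count]

# Two 50 px segments survive; the short diagonal one is suppressed.
lines = [((0, 0), (30, 40)), ((0, 0), (5, 5)), ((10, 10), (10, 60))]
strong = filter_lines(lines)
```

Pruning weak segments before matching reduces the per-frame line-processing cost, which is the efficiency gain the abstract refers to.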

22 pages, 4756 KiB  
Article
Self-Localization of Tethered Drones without a Cable Force Sensor in GPS-Denied Environments
by Amer Al-Radaideh and Liang Sun
Drones 2021, 5(4), 135; https://0-doi-org.brum.beds.ac.uk/10.3390/drones5040135 - 17 Nov 2021
Cited by 16 | Viewed by 4214
Abstract
This paper considers the self-localization of a tethered drone without using a cable-tension force sensor in GPS-denied environments. The original problem is converted to a state-estimation problem, where the cable-tension force and the three-dimensional position of the drone with respect to a ground platform are estimated using an extended Kalman filter (EKF). The proposed approach uses the data reported by the onboard electric motors (i.e., the pulse-width modulation (PWM) signals) and by the accelerometers, gyroscopes, and altimeter embedded in a commercial-off-the-shelf (COTS) inertial measurement unit (IMU). A system-identification experiment was conducted to determine the model that computes the drone thrust force from the PWM signals. The proposed approach was compared with an existing work that assumes a known cable-tension force. Simulation results show that the proposed approach produces estimates with errors of less than 0.3 m when the actual cable-tension force is greater than 1 N.
(This article belongs to the Special Issue Advances in SLAM and Data Fusion for UAVs/Drones)
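The system-identification step maps PWM commands to thrust from bench measurements. One common form of such a model, offered here only as a hedged illustration (the paper's actual model structure and coefficients are not reproduced), is a polynomial in the PWM duty cycle fitted by least squares:

```python
# Hypothetical sketch of a PWM-to-thrust system-identification fit:
# thrust ~ a*pwm^2 + b*pwm, solved via the 2x2 normal equations.
# The bench data and coefficients below are made up for illustration.

def fit_thrust_model(pwm, thrust):
    """Least-squares fit of thrust = a*pwm**2 + b*pwm; returns (a, b)."""
    s4 = sum(p ** 4 for p in pwm)
    s3 = sum(p ** 3 for p in pwm)
    s2 = sum(p ** 2 for p in pwm)
    r1 = sum(p ** 2 * t for p, t in zip(pwm, thrust))
    r2 = sum(p * t for p, t in zip(pwm, thrust))
    det = s4 * s2 - s3 * s3        # determinant of the normal matrix
    a = (r1 * s2 - r2 * s3) / det
    b = (s4 * r2 - s3 * r1) / det
    return a, b

# Bench-style samples generated from thrust = 10*pwm^2 + 2*pwm (invented).
pwm = [0.2, 0.4, 0.6, 0.8]
thrust = [0.8, 2.4, 4.8, 8.0]
a, b = fit_thrust_model(pwm, thrust)
```

Once identified, such a model lets the EKF treat commanded PWM as an indirect thrust measurement, which is what removes the need for a dedicated cable-force sensor.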

19 pages, 4704 KiB  
Article
Prototype Development of Cross-Shaped Microphone Array System for Drone Localization Based on Delay-and-Sum Beamforming in GNSS-Denied Areas
by Hirokazu Madokoro, Satoshi Yamamoto, Kanji Watanabe, Masayuki Nishiguchi, Stephanie Nix, Hanwool Woo and Kazuhito Sato
Drones 2021, 5(4), 123; https://0-doi-org.brum.beds.ac.uk/10.3390/drones5040123 - 23 Oct 2021
Cited by 5 | Viewed by 2501
Abstract
Drones equipped with a global navigation satellite system (GNSS) receiver for absolute localization provide high-precision autonomous flight and hovering. However, GNSS signal reception sensitivity is considerably lower in areas such as those between high-rise buildings, under bridges, and in tunnels. This paper presents a drone localization method based on acoustic information from a microphone array in GNSS-denied areas. Our originally developed microphone array system comprises 32 microphones installed in a cross-shaped configuration. Using drones of two different sizes and weights, we obtained an original outdoor acoustic benchmark dataset at 24 points. The experimentally obtained results revealed that the localization error values were lower for 0° and ±45° than for ±90°. Moreover, we demonstrated the relative accuracy for acceptable ranges of tolerance of the obtained localization error values.
(This article belongs to the Special Issue Advances in SLAM and Data Fusion for UAVs/Drones)
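Delay-and-sum beamforming, the technique named in the title, time-shifts each microphone channel by its geometric delay toward a candidate direction and sums the aligned signals. The sketch below uses a small linear array with integer-sample delays; the geometry, sample rate, and signals are assumptions for illustration, not the paper's 32-microphone cross-shaped array.

```python
# Minimal delay-and-sum beamformer sketch: shift each channel by the
# geometric delay for a candidate steering angle, then average. A source
# in that direction adds coherently; off-axis sound partially cancels.
# Array geometry and signals are illustrative only.

import math

SPEED_OF_SOUND = 343.0  # m/s
FS = 8000               # sample rate in Hz (assumed)

def delay_and_sum(channels, mic_x, angle_deg):
    """Steer a linear array (mic positions mic_x, in m) toward angle_deg."""
    theta = math.radians(angle_deg)
    out = [0.0] * len(channels[0])
    for sig, x in zip(channels, mic_x):
        # Integer-sample delay for a far-field plane wave from angle_deg.
        delay = int(round(x * math.sin(theta) / SPEED_OF_SOUND * FS))
        for i in range(len(out)):
            j = i - delay
            if 0 <= j < len(sig):
                out[i] += sig[j]
    return [v / len(channels) for v in out]

# Broadside source (0 deg): identical channels, zero delays, signal preserved.
sig = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0]
out = delay_and_sum([sig, sig, sig], mic_x=[-0.1, 0.0, 0.1], angle_deg=0.0)
```

Scanning `angle_deg` over a grid and picking the direction with the highest output power yields the drone bearing estimate that this kind of system localizes from.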
