Laser Sensing in Robotics

A special issue of Applied Sciences (ISSN 2076-3417). This special issue belongs to the section "Optics and Lasers".

Deadline for manuscript submissions: closed (31 December 2021) | Viewed by 14826

Special Issue Editor

Research and Development, Maritime State University, Vladivostok, Russia
Interests: LIDAR; laser-matter interaction; LIF; LIBS; underwater robotics; laser sensing; artificial intelligence

Special Issue Information

Laser sensing has a long and well-studied history of development and continues to improve, with widespread applications in diverse fields of science and industry. Recent years have brought new achievements in lasers, spectroscopy, optical materials, the interaction of laser radiation with matter, and data retrieval techniques. Methods of artificial intelligence and data processing are also advancing rapidly and growing in number. These achievements and methods may be unified in robotics to improve the technology and realize its full potential.

The interaction of lasers with matter yields detailed information about the environment. Artificial intelligence can be applied in laser sensing to acquire and evaluate this information, make decisions, and solve tasks autonomously. The application of AI in laser sensing could improve the functionality of robotic systems and endow them with olfaction, positioning, vision, navigation, etc.

The scope of this Special Issue, “Laser Sensing in Robotics”, includes, but is not limited to, the following main topics:

  • New laser and optical sensors in robotics: LIDAR, laser scanners, laser imaging, laser spectroscopy sensors, optical sensors, etc.
  • Research on laser–matter interaction for the development of new laser sensing methods in robotics.
  • Development of new materials and new research methods (measurement principles: absorption, fluorescence, reflection, scattering, photoacoustics, etc.) for laser sensors in robotics.
  • Image processing methods for optical and laser sensors (imaging techniques for spatially and temporally resolved information, spectral analysis) in robotics.
  • Artificial intelligence methods for receiving and evaluating information from laser sensing; intelligent sensors.
  • The application of laser sensing and smart laser sensors in environmental monitoring, industrial process monitoring, agriculture, security using aircraft, vessels, and underwater vehicles, workplace safety, food science, biology, and health services.

Prof. Dr. Oleg A. Bukin
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, you can proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Applied Sciences is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Laser sensing
  • Artificial intelligence
  • Laser–matter interaction
  • Robotics
  • Laser and optical sensors
  • Data processing and data retrieval
  • Smart sensors

Published Papers (5 papers)


Research


16 pages, 2095 KiB  
Article
Development of a Mobile Robot That Plays Tag with Touch-and-Away Behavior Using a Laser Range Finder
by Yoshitaka Kasai, Yutaka Hiroi, Kenzaburo Miyawaki and Akinori Ito
Appl. Sci. 2021, 11(16), 7522; https://doi.org/10.3390/app11167522 - 17 Aug 2021
Cited by 2 | Viewed by 1743
Abstract
The development of robots that play with humans is a challenging topic in robotics. We are developing a robot that plays tag with human players. Such a robot needs to observe the players and obstacles around it, chase a target player, and touch the player without collision. To achieve this task, we propose two methods. The first is a player-tracking method by which the robot moves towards a virtual circle surrounding the target player; we used a laser range finder (LRF) as the sensor for player tracking. The second is a motion control method applied after approaching the player: the robot moves away by heading towards the side opposite the player. We conducted a simulation experiment and an experiment using a real robot. Both experiments showed that, with the proposed tracking method, the robot properly chased the player and moved away from the player without collision. The contribution of this paper is the development of a robot control method to approach a human and then move away safely.
(This article belongs to the Special Issue Laser Sensing in Robotics)
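As a rough illustration of the touch-and-away behavior described in the abstract (this is not the authors' actual controller; the function name, circle radius, and waypoint logic are assumptions made for this sketch), the two phases can be reduced to choosing a waypoint relative to a virtual circle around the player:

```python
import math

def tag_step(robot, player, radius=0.8, touched=False):
    """Pick the next waypoint for a tag-playing robot.

    Before a touch, head for the nearest point on a virtual circle of
    `radius` metres around the player; after a touch, retreat along the
    direction opposite the player (touch-and-away).
    """
    dx, dy = player[0] - robot[0], player[1] - robot[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return robot  # degenerate case: robot already at the player's position
    ux, uy = dx / dist, dy / dist  # unit vector from robot towards player
    if not touched:
        # approach: aim at the circle point on the near side of the player
        return (player[0] - radius * ux, player[1] - radius * uy)
    # after a touch: step away from the player along the opposite direction
    return (robot[0] - radius * ux, robot[1] - radius * uy)
```

In a real system this waypoint would feed a velocity controller running on each LRF scan, with the player position re-estimated from the tracked leg cluster.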

14 pages, 8796 KiB  
Article
Curve-Localizability-SVM Active Localization Research for Mobile Robots in Outdoor Environments
by Liang Gong, Xiangyu Yu and Jingchuan Wang
Appl. Sci. 2021, 11(10), 4362; https://doi.org/10.3390/app11104362 - 11 May 2021
Cited by 4 | Viewed by 1407
Abstract
The working environment of mobile robots has expanded in recent years from indoor structured scenes to outdoor scenes such as wild areas. The expansion of application scenes, changes of sensors, and the diversity of working tasks bring greater challenges and higher demands to active localization for mobile robots, and the efficiency and stability of traditional localization strategies are significantly reduced in wild environments. Considering both the features of the environment and the robot's motion surface, this paper proposes a curve-localizability-SVM active localization algorithm. First, we present a curve-localizability index based on a 3D observation model; on this basis, we propose a curve-localizability-SVM path planning strategy and an improved active localization method. The path, obtained by setting the constraint space and objective function of the planning algorithm with curve localizability as the main constraint, improves the convergence speed and stability of the active localization algorithm in complex environments; aided by the SVM, the path is also smoother and safer for large robots. The algorithm was validated through comparative experiments and analysis in a real environment and on a robot platform, which verified the improved efficiency and stability of the new strategy.
(This article belongs to the Special Issue Laser Sensing in Robotics)

20 pages, 4758 KiB  
Article
Development of the Artificial Intelligence and Optical Sensing Methods for Oil Pollution Monitoring of the Sea by Drones
by Oleg Bukin, Dmitry Proschenko, Denis Korovetskiy, Alexey Chekhlenok, Viktoria Yurchik and Ilya Bukin
Appl. Sci. 2021, 11(8), 3642; https://doi.org/10.3390/app11083642 - 18 Apr 2021
Cited by 11 | Viewed by 3026
Abstract
The oil pollution of seas is increasing, especially in local areas such as ports, vessel roadsteads, and bunkering zones. Current methods of seawater monitoring are costly and applicable only in the case of major ecological disasters. This article describes the development of an operational and affordable system for monitoring the sea surface for oil slick detection, using drones equipped with optical sensing and artificial intelligence. The monitoring system is implemented as separate hard and soft frameworks (HSFWs) that combine monitoring methods, hardware, and software; three frameworks are combined to fulfill the entire monitoring mission. HSFW1 performs autonomous monitoring of thin oil slicks on the sea surface, using computer vision with AI elements for the detection, segmentation, and classification of thin slicks. HSFW2 is based on laser-induced fluorescence (LIF) to identify the types of oil products that form a slick or are in a dissolved state, as well as to measure their concentration in solution. HSFW3 is designed for autonomous navigation and drone motion control. The article describes the AI elements and hardware complexes of the three frameworks, designed to solve the problems of monitoring oil-product slicks on the sea surface and oil products dissolved in seawater, and reports the results of testing the HSFWs on pollution caused by marine fuel slicks.
(This article belongs to the Special Issue Laser Sensing in Robotics)
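The concentration-measurement part of an LIF framework like HSFW2 ultimately rests on a calibration step. As a minimal sketch of that idea only (the actual framework processes full fluorescence spectra with AI; the function names, the assumption of the dilute linear regime, and the single-channel intensity are illustrative assumptions, not the paper's method), one can fit a calibration line from known samples and invert it for field readings:

```python
def fit_lif_calibration(concentrations, intensities):
    """Least-squares line I = a*c + b through LIF calibration samples.

    In the dilute (linear) regime, fluorescence intensity grows roughly
    linearly with the concentration of the dissolved oil product.
    """
    n = len(concentrations)
    mc = sum(concentrations) / n
    mi = sum(intensities) / n
    a = sum((c - mc) * (i - mi) for c, i in zip(concentrations, intensities)) \
        / sum((c - mc) ** 2 for c in concentrations)
    b = mi - a * mc
    return a, b

def estimate_concentration(intensity, a, b):
    """Invert the calibration line to read a concentration off an LIF signal."""
    return (intensity - b) / a
```

At higher concentrations the linear assumption breaks down (inner-filter effects), which is one reason the paper's framework relies on spectra and trained models rather than a single intensity channel.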

29 pages, 12171 KiB  
Article
Low-Cost Calibration of Matching Error between Lidar and Motor for a Rotating 2D Lidar
by Chang Yuan, Shusheng Bi, Jun Cheng, Dongsheng Yang and Wei Wang
Appl. Sci. 2021, 11(3), 913; https://doi.org/10.3390/app11030913 - 20 Jan 2021
Cited by 9 | Viewed by 2070
Abstract
For a rotating 2D lidar, inaccurate matching between the 2D lidar and the motor is an important error source for the 3D point cloud, where the error appears in both shape and attitude. Existing methods need to measure the angular position of the motor shaft in real time to synchronize the 2D lidar data with the motor shaft angle; however, the sensor used for this measurement is usually expensive, which increases the cost. We therefore propose a low-cost method to calibrate the matching error between the 2D lidar and the motor without using an angular sensor. First, the sequence between the motor and the 2D lidar is optimized to eliminate the shape error of the 3D point cloud. Next, we eliminate the attitude error, with its uncertainty, by installing a triangular plate on the prototype. Finally, the Levenberg–Marquardt method is used to calibrate the installation error of the triangular plate. Experiments verified that the accuracy of our method meets the requirements of 3D mapping for indoor autonomous mobile robots. Using a 2D lidar Hokuyo UST-10LX with an accuracy of ±40 mm in our prototype, we can limit the mapping error to within ±50 mm at distances up to 2.2996 m for a 1 s scan (mode 1), and to within ±50 mm at a measuring range of 10 m for a 16 s scan (mode 7). Our method reduces cost while ensuring accuracy, making a rotating 2D lidar cheaper.
(This article belongs to the Special Issue Laser Sensing in Robotics)
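To make the synchronization problem concrete, here is a minimal sketch (with an idealized geometry assumed purely for illustration; the function name and the choice of the motor axis are not taken from the paper) of how one 2D scan return and the motor angle combine into a 3D point. Pairing a return with the wrong motor angle is exactly the kind of lidar–motor matching error that distorts the point cloud:

```python
import math

def scan_point_to_3d(r, alpha, theta):
    """Map one 2D lidar return to a 3D point for a rotating-lidar rig.

    r     -- measured range (m)
    alpha -- beam angle within the 2D scan plane (rad)
    theta -- motor rotation angle at the moment of measurement (rad)

    Assumes an idealized rig: the motor axis coincides with the lidar's
    x-axis and passes through the lidar origin (the installation offsets
    that the paper calibrates are ignored here).
    """
    # point in the lidar's own scan plane (z = 0)
    x, y = r * math.cos(alpha), r * math.sin(alpha)
    # rotate the scan plane about the x-axis by the motor angle
    return (x,
            y * math.cos(theta),
            y * math.sin(theta))
```

If theta lags the true shaft angle (for example, because lidar timestamps and motor steps are misaligned), every rotated scan plane lands slightly wrong, producing the shape and attitude errors the paper sets out to calibrate away.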

Review


39 pages, 8902 KiB  
Review
A Survey of Low-Cost 3D Laser Scanning Technology
by Shusheng Bi, Chang Yuan, Chang Liu, Jun Cheng, Wei Wang and Yueri Cai
Appl. Sci. 2021, 11(9), 3938; https://doi.org/10.3390/app11093938 - 27 Apr 2021
Cited by 25 | Viewed by 5655
Abstract
By moving a commercial 2D LiDAR, 3D maps of the environment can be built from the 2D LiDAR data and its movements. Compared to a commercial 3D LiDAR, a moving 2D LiDAR is more economical. A series of problems needs to be solved for a moving 2D LiDAR to perform better, among them improving accuracy and real-time performance. Solving these problems requires estimating the movements of the 2D LiDAR and identifying and removing moving objects in the environment; more specifically, this involves calibrating the installation error between the 2D LiDAR and the moving unit, estimating the motion of the moving unit, and identifying moving objects at low scanning frequencies. Since most actual applications are dynamic, with the moving 2D LiDAR travelling among multiple moving objects, we believe that accurately constructing 3D maps in dynamic environments will be an important future research topic for moving 2D LiDARs. Moreover, how to handle moving objects in a dynamic environment with a moving 2D LiDAR has not been solved by previous research.
(This article belongs to the Special Issue Laser Sensing in Robotics)
