Physical Human-Robot Interaction and Robot Manipulation

A special issue of Electronics (ISSN 2079-9292). This special issue belongs to the section "Systems & Control Engineering".

Deadline for manuscript submissions: closed (31 August 2023) | Viewed by 5429

Special Issue Editors


Guest Editor
Dipartimento di Ingegneria, Università degli Studi della Campania “Luigi Vanvitelli”, Via Roma 29, 81031 Aversa, CE, Italy
Interests: robotics; robotic manipulation; tactile/force sensors; autonomous driving; control systems; embedded systems

Guest Editor
Dipartimento di Ingegneria, Università degli Studi della Campania “Luigi Vanvitelli”, Via Roma, 29, 81031 Aversa, CE, Italy
Interests: sensors for robotics applications; control; human–robot interaction

Special Issue Information

Dear Colleagues,

While robotics researchers move toward humanoid robots capable of working alongside humans in human-centric, unstructured environments, industry faces increasingly complex tasks that demand significant technological advances and, in particular, human-like capabilities in the next generation of robotic systems. Whether designed to work with humans, e.g., in homes, schools and hospitals, performing high-level tasks, or to replace them in repetitive manufacturing tasks and hazardous, high-risk environments, robots must be able to perceive the external world in order to perform such challenging applications and to interact with their environment safely.

This Special Issue collects the experience of academic and industrial scientists to discuss and present state-of-the-art technologies for addressing the most relevant problems in human–robot interaction applications and in autonomous object grasping and manipulation.

Potential submission topics include but are not limited to the following:

  • Tactile/Force Sensor Technology;
  • Vision Sensory Systems;
  • Object Recognition;
  • Object Grasping and Manipulation;
  • Human–Robot Cooperative Systems;
  • Human–Robot Interaction Control Algorithms;
  • Trajectory Planning Algorithms for Grasping and Unintended Collision Avoidance.

Dr. Andrea Cirillo
Dr. Salvatore Pirozzi
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website and then using the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Electronics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • grasping
  • manipulation
  • object manipulation
  • human–robot cooperation
  • human–robot interaction
  • smart sensors
  • planning and obstacle avoidance

Published Papers (2 papers)


Research

15 pages, 1671 KiB  
Article
CORB2I-SLAM: An Adaptive Collaborative Visual-Inertial SLAM for Multiple Robots
by Arindam Saha, Bibhas Chandra Dhara, Saiyed Umer, Ahmad Ali AlZubi, Jazem Mutared Alanazi and Kulakov Yurii
Electronics 2022, 11(18), 2814; https://0-doi-org.brum.beds.ac.uk/10.3390/electronics11182814 - 06 Sep 2022
Cited by 8 | Viewed by 2009
Abstract
The generation of robust global maps of an unknown cluttered environment through a collaborative robotic framework is challenging. We present a collaborative SLAM framework, CORB2I-SLAM, in which each participating robot carries a camera (monocular/stereo/RGB-D) and an inertial sensor to run odometry. A centralized server stores all the maps and executes processor-intensive tasks, e.g., loop closing, map merging, and global optimization. The proposed framework uses well-established Visual-Inertial Odometry (VIO) and can be adapted to use Visual Odometry (VO) when the measurements from inertial sensors are noisy. The proposed system solves certain disadvantages of odometry-based systems, such as erroneous pose estimation due to incorrect feature selection or losing track due to abrupt camera motion, and provides more accurate results. We perform feasibility tests on real robot autonomy and extensively validate the accuracy of CORB2I-SLAM on benchmark data sequences. We also evaluate its scalability and applicability in terms of the number of participating robots and network requirements, respectively.
(This article belongs to the Special Issue Physical Human-Robot Interaction and Robot Manipulation)
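The abstract above outlines a client–server split in which each robot runs visual(-inertial) odometry locally while a central server performs loop closing, map merging, and global optimization. The Python sketch below is a hypothetical illustration of that architecture only; the names (RobotClient, MapServer, Keyframe) and the IMU-noise threshold are assumptions for demonstration, not the paper's actual implementation.

```python
# Hypothetical sketch of the client/server split described in the abstract.
# All names and thresholds are illustrative, not taken from CORB2I-SLAM itself.
from dataclasses import dataclass, field

@dataclass
class Keyframe:
    robot_id: int
    pose: tuple      # (x, y, z) estimated by the local odometry front end
    features: list   # visual features kept for later loop closing

@dataclass
class MapServer:
    """Central server: stores all maps and runs the processor-intensive back end."""
    maps: dict = field(default_factory=dict)

    def receive(self, kf: Keyframe):
        self.maps.setdefault(kf.robot_id, []).append(kf)
        self.try_loop_closing_and_merging()

    def try_loop_closing_and_merging(self):
        # Placeholder for loop closing, map merging and global optimization,
        # which the abstract assigns to the centralized server.
        pass

class RobotClient:
    """Each robot runs odometry locally and streams keyframes to the server."""
    def __init__(self, robot_id: int, server: MapServer, imu_noise: float):
        self.robot_id, self.server = robot_id, server
        # Fall back from VIO to pure VO when inertial measurements are noisy,
        # mirroring the adaptation mentioned in the abstract (threshold is arbitrary).
        self.use_imu = imu_noise < 0.05

    def step(self, image, imu_sample):
        pose = self.run_vio(image, imu_sample) if self.use_imu else self.run_vo(image)
        self.server.receive(Keyframe(self.robot_id, pose, features=[]))

    def run_vio(self, image, imu_sample):
        return (0.0, 0.0, 0.0)  # stand-in for a real visual-inertial front end

    def run_vo(self, image):
        return (0.0, 0.0, 0.0)  # stand-in for a vision-only front end

server = MapServer()
robots = [RobotClient(i, server, imu_noise=0.01 * (i + 1)) for i in range(3)]
for r in robots:
    r.step(image=None, imu_sample=None)
```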

19 pages, 1224 KiB  
Article
Analysis of a User Interface Based on Multimodal Interaction to Control a Robotic Arm for EOD Applications
by Denilson V. Goyzueta, Joseph Guevara M., Andrés Montoya A., Erasmo Sulla E., Yuri Lester S., Pari L. and Elvis Supo C.
Electronics 2022, 11(11), 1690; https://0-doi-org.brum.beds.ac.uk/10.3390/electronics11111690 - 25 May 2022
Cited by 7 | Viewed by 2740
Abstract
A global human–robot interface that meets the needs of Technical Explosive Ordnance Disposal Specialists (TEDAX) for the manipulation of a robotic arm is of utmost importance to make the task of handling explosives safer and more intuitive, while also providing high usability and efficiency. This paper evaluates the performance of a multimodal system for a robotic arm based on a Natural User Interface (NUI) and a Graphical User Interface (GUI). These interfaces are compared to determine the best configuration for controlling the robotic arm in Explosive Ordnance Disposal (EOD) applications and to improve the user experience of TEDAX agents. Tests were conducted with the support of police agents from the Explosive Ordnance Disposal Unit-Arequipa (UDEX-AQP), who evaluated the developed interfaces to find the most intuitive system and the one imposing the least stress on the operator; the proposed multimodal interface performed better than traditional interfaces. The laboratory evaluation was based on measuring the workload and usability of each interface.
(This article belongs to the Special Issue Physical Human-Robot Interaction and Robot Manipulation)
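The abstract above notes that the interfaces were compared by measuring operator workload and usability, but it does not name the instruments used. As a purely illustrative example, the snippet below scores a questionnaire with the System Usability Scale (SUS), one common usability metric; the interface labels and responses are invented for demonstration and may differ from what the authors actually used.

```python
# Illustrative SUS scoring: ten Likert items (1-5), odd items positively worded,
# even items negatively worded; the total is scaled to a 0-100 range.
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten responses on a 1-5 scale")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Hypothetical comparison of two interfaces across operators (invented data).
gui_only   = [sus_score([4, 2, 4, 2, 4, 3, 4, 2, 4, 2])]   # ~72.5
multimodal = [sus_score([5, 1, 5, 2, 4, 2, 5, 1, 5, 2])]   # ~90.0
print(sum(gui_only) / len(gui_only), sum(multimodal) / len(multimodal))
```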
