
Integration of Advanced Sensors in Assistive Robotic Technology

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Electronic Sensors".

Deadline for manuscript submissions: closed (20 April 2022) | Viewed by 25976

Special Issue Editors


Dr. Alexandre Campeau-Lecours
Guest Editor
1. Department of Mechanical Engineering, Laval University, Quebec City, QC G1V 0A6, Canada
2. Centre for Interdisciplinary Research in Rehabilitation and Social Integration (Cirris), Quebec City, QC G1V 0A6, Canada
Interests: rehabilitation engineering; assistive robotics; assistive technologies

Prof. Dr. François Routhier
Guest Editor
1. Department of Rehabilitation, Université Laval, Quebec City, QC G1V 0A6, Canada
2. Center for Interdisciplinary Research in Rehabilitation and Social Integration, CIUSSS de la Capitale-Nationale, Quebec City, QC G1M 2S8, Canada
Interests: wheelchairs; assistive technology; social participation; environmental factors; participatory research

Special Issue Information

Dear Colleagues,

Assistive robotic technology has been shown to help people living with disabilities by facilitating the execution of daily living tasks and increasing user independence. However, assistive robotic technologies can be complex, time-consuming, and difficult to operate.

Assistive robotic technology is defined here as devices that benefit people living with disabilities and older adults in the accomplishment of daily living tasks. This includes, but is not limited to, wheelchair-mounted assistive robotic arms, mobile servants, exoskeletons, prosthetics, orthotics, movement assistance, mobility aids and intelligent wheelchairs. 

Advanced sensors have the potential to improve the performance of assistive robotic technologies. In the field of control interfaces, innovations at the sensor level may help users communicate their intentions more efficiently and use the device to its full potential. In the field of system intelligence, innovations at the sensor level may help assistive robotic technologies better understand their environment and facilitate the user's task, for instance by autonomously avoiding an obstacle.

For this Special Issue, innovation in advanced sensors is defined as the development of new sensors, the augmentation of existing sensors with artificial intelligence, and the novel application of existing sensors to assistive robotic technologies.

Dr. Alexandre Campeau-Lecours
Prof. Dr. François Routhier
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Assistive technologies
  • Rehabilitation engineering
  • Development of new sensors
  • Existing sensors augmented by artificial intelligence
  • Existing sensors newly applied to assistive robotic technologies
  • Wheelchair-mounted assistive robotic arms
  • Mobile servants
  • Exoskeletons
  • Prosthetics, orthotics
  • Movement assistance
  • Mobility aids and intelligent wheelchairs

Published Papers (8 papers)


Research


21 pages, 19613 KiB  
Article
Efficient Self-Attention Model for Speech Recognition-Based Assistive Robots Control
by Samuel Poirier, Ulysse Côté-Allard, François Routhier and Alexandre Campeau-Lecours
Sensors 2023, 23(13), 6056; https://0-doi-org.brum.beds.ac.uk/10.3390/s23136056 - 30 Jun 2023
Viewed by 1096
Abstract
Assistive robots are tools that people living with upper body disabilities can leverage to autonomously perform Activities of Daily Living (ADL). Unfortunately, conventional control methods still rely on low-dimensional, easy-to-implement interfaces such as joysticks that tend to be unintuitive and cumbersome to use. In contrast, vocal commands may represent a viable and intuitive alternative. This work represents an important step toward providing a viable vocal interface for people living with upper limb disabilities by proposing a novel lightweight vocal command recognition system. The proposed model leverages the MobileNet2 architecture, augmenting it with a novel approach to the self-attention mechanism, achieving a new state-of-the-art performance for Keyword Spotting (KWS) on the Google Speech Commands Dataset (GSCD). Moreover, this work presents a new dataset, referred to as the French Speech Commands Dataset (FSCD), comprising 4963 vocal command utterances. Using the GSCD as the source, we used Transfer Learning (TL) to adapt the model to this cross-language task. TL has been shown to significantly improve the model performance on the FSCD. The viability of the proposed approach is further demonstrated through real-life control of a robotic arm by four healthy participants using both the proposed vocal interface and a joystick.
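The paper's exact attention variant is described in the full article; as a rough, hypothetical illustration of the general idea (a self-attention layer re-weighting acoustic frames before a keyword classifier), here is a minimal numpy sketch. All shapes, names, and dimensions here are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(frames, d_k=16, seed=0):
    """Single-head scaled dot-product self-attention over a (T, d) frame sequence."""
    rng = np.random.default_rng(seed)
    T, d = frames.shape
    # Random projection matrices stand in for learned weights.
    Wq, Wk, Wv = (rng.standard_normal((d, d_k)) / np.sqrt(d) for _ in range(3))
    Q, K, V = frames @ Wq, frames @ Wk, frames @ Wv
    attn = softmax(Q @ K.T / np.sqrt(d_k))  # (T, T) frame-to-frame weights
    return attn @ V                          # re-weighted frames, (T, d_k)

# ~1 s of hypothetical 40-dim log-mel frames
frames = np.random.default_rng(1).standard_normal((98, 40))
out = self_attention(frames)
print(out.shape)  # (98, 16)
```

In a real KWS model such an attention output would be pooled over time and fed to a small classification head; the point of attention here is that each output frame is a learned weighted mix of all input frames, letting the model focus on the informative part of the utterance.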
(This article belongs to the Special Issue Integration of Advanced Sensors in Assistive Robotic Technology)

21 pages, 7142 KiB  
Article
Usability Evaluation of the SmartWheeler through Qualitative and Quantitative Studies
by Adina M. Panchea, Nathalie Todam Nguepnang, Dahlia Kairy and François Ferland
Sensors 2022, 22(15), 5627; https://0-doi-org.brum.beds.ac.uk/10.3390/s22155627 - 27 Jul 2022
Cited by 3 | Viewed by 1330
Abstract
Background: Intelligent powered wheelchairs remain a popular research topic that can improve users’ quality of life. Although our multidisciplinary research team has put a lot of effort into adding features based on end-users’ needs and impairments since 2006, there are still open issues regarding the usability and functionalities of an intelligent powered wheelchair (IPW). Methods: For this reason, this research presents an experience with our IPW, followed by a two-part study: a quantitative part based on the System Usability Scale (SUS) questionnaire and a qualitative part using open questions about IPW functionalities with novice users, i.e., people who had never used an IPW before. These participants were nevertheless familiar with the technology used in our IPW, being undergraduate to postdoctoral students and staff (faculty, lecturers, research engineers) at the Faculty of Engineering of Université de Sherbrooke. Results: The qualitative analyses identified different behaviours among the novice users. The quantitative analysis via the SUS questionnaire with novice users reports an “okay” rating (equivalent to a C grade, or a SUS score of 68) for our IPW’s usability. Moreover, opinions on the advantages and disadvantages of the IPW were gathered, as well as comments that can be used to improve the system. Conclusions: The results of these studies show that the system, i.e., the IPW, was judged sufficiently usable and robust by novice users, both with and without experience with the software used in developing the IPW.
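The SUS scoring procedure referenced above is standard: each of the ten 1-5 Likert items contributes an adjusted score, and the sum is scaled to 0-100. A minimal sketch of that computation (the response values below are hypothetical, not the study's data):

```python
def sus_score(responses):
    """System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items are positively worded (contribute r - 1),
    even-numbered items are negatively worded (contribute 5 - r);
    the sum of contributions is multiplied by 2.5 to give 0-100.
    """
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# A hypothetical response sheet; all 3s yields the midpoint score of 50.
print(sus_score([3] * 10))  # 50.0
```

A mean score of 68 across respondents is the commonly cited benchmark for "okay" usability, which is the interpretation the abstract uses.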

22 pages, 10772 KiB  
Article
WISP, Wearable Inertial Sensor for Online Wheelchair Propulsion Detection
by Jhedmar Callupe Luna, Juan Martinez Rocha, Eric Monacelli, Gladys Foggea, Yasuhisa Hirata and Stéphane Delaplace
Sensors 2022, 22(11), 4221; https://0-doi-org.brum.beds.ac.uk/10.3390/s22114221 - 01 Jun 2022
Cited by 4 | Viewed by 2011
Abstract
Manual wheelchair dance is an artistic, recreational, and sport activity for people with disabilities that is becoming more and more popular. It has been reported that a significant part of the dance is dedicated to propulsion. Furthermore, wheelchair dance professionals such as Gladys Foggea highlight the need for monitoring the quantity and timing of propulsions for assessment and learning. This study addresses these needs by proposing a wearable system, called WISP, based on inertial sensors capable of detecting and characterizing propulsion gestures. In our initial configuration, three inertial sensors were placed on the hands and the back. Two machine learning classifiers were used for online bilateral recognition of basic propulsion gestures (forward, backward, and dance). Then, a conditional block was implemented to rebuild eight specific propulsion gestures. The online paradigm is intended for real-time assessment applications using a sliding-window method. We therefore evaluated the accuracy of the classifiers in two configurations: “three-sensor” and “two-sensor”. Results showed that with the “two-sensor” configuration, it was possible to recognize the propulsion gestures with an accuracy of 90.28%. Finally, the system makes it possible to quantify propulsions and measure their timing in a manual wheelchair dance choreography, showing its possible applications in the teaching of dance.
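The sliding-window method mentioned above is the usual way to turn a continuous inertial stream into fixed-size inputs for an online classifier. A minimal sketch, with hypothetical sampling rate and window sizes (not the parameters used by WISP):

```python
import numpy as np

def sliding_windows(signal, win, step):
    """Segment a (T, channels) IMU stream into overlapping windows."""
    starts = range(0, len(signal) - win + 1, step)
    return np.stack([signal[s:s + win] for s in starts])

# 5 s of hypothetical 3-axis accelerometer data at 100 Hz
imu = np.random.default_rng(0).standard_normal((500, 3))
windows = sliding_windows(imu, win=100, step=50)  # 1 s windows, 50% overlap
print(windows.shape)  # (9, 100, 3)
```

In an online setting each new window is classified as soon as it is complete, which is what allows propulsion gestures to be counted and timed in real time.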

13 pages, 2459 KiB  
Article
A Systematic Study on Electromyography-Based Hand Gesture Recognition for Assistive Robots Using Deep Learning and Machine Learning Models
by Pranesh Gopal, Amandine Gesta and Abolfazl Mohebbi
Sensors 2022, 22(10), 3650; https://0-doi-org.brum.beds.ac.uk/10.3390/s22103650 - 11 May 2022
Cited by 16 | Viewed by 7671
Abstract
Upper limb amputation severely affects a person's quality of life and activities of daily living. In the last decade, many robotic hand prostheses have been developed that are controlled using various sensing technologies, such as artificial vision, tactile sensing, and surface electromyography (sEMG). If controlled properly, these prostheses can significantly improve the daily life of hand amputees by providing them with more autonomy in physical activities. However, despite the advancements in sensing technologies, as well as the excellent mechanical capabilities of the prosthetic devices, their control is often limited and usually requires a long time for training and user adaptation. Myoelectric prostheses use signals from residual stump muscles to restore the function of the lost limb seamlessly. However, using sEMG signals for control in robotics is complicated by the presence of noise and the need for heavy computational power. In this article, we developed motion intention classifiers for transradial (TR) amputees based on EMG data by implementing various machine learning and deep learning models. We benchmarked the performance of these classifiers based on overall generalization across various classes, and we present a systematic study of the impact of time-domain features and pre-processing parameters on the performance of the classification models. Our results showed that ensemble learning and deep learning algorithms outperformed classical machine learning algorithms. Investigating the effect of varying the sliding-window length on feature-based and non-feature-based classification models revealed an interesting correlation with the level of amputation. The study also analyzed classifier performance across amputation conditions, since the history and conditions of amputation differ for each amputee. These results are vital for understanding the development of machine learning-based classifiers for assistive robotic applications.
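The time-domain features studied in this kind of work are typically the classic sEMG set (mean absolute value, waveform length, zero crossings, slope sign changes). As a hedged illustration, not the authors' feature pipeline, a per-channel extractor might look like this:

```python
import numpy as np

def td_features(window, thresh=0.01):
    """Classic time-domain sEMG features for one channel window."""
    mav = np.mean(np.abs(window))                      # mean absolute value
    wl = np.sum(np.abs(np.diff(window)))               # waveform length
    zc = np.sum((window[:-1] * window[1:] < 0) &       # zero crossings above
                (np.abs(np.diff(window)) > thresh))    # a noise threshold
    d = np.diff(window)
    ssc = np.sum((d[:-1] * d[1:] < 0) &                # slope sign changes
                 (np.maximum(np.abs(d[:-1]), np.abs(d[1:])) > thresh))
    return np.array([mav, wl, zc, ssc], dtype=float)

# A sine wave stands in for a 200-sample EMG window.
emg = np.sin(np.linspace(0, 8 * np.pi, 200))
print(td_features(emg))
```

These features are computed per sliding window and concatenated across channels before being fed to a classifier, which is why the window length interacts with classification performance.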

17 pages, 7614 KiB  
Article
Classification of the Sidewalk Condition Using Self-Supervised Transfer Learning for Wheelchair Safety Driving
by Ha-Yeong Yoon, Jung-Hwa Kim and Jin-Woo Jeong
Sensors 2022, 22(1), 380; https://0-doi-org.brum.beds.ac.uk/10.3390/s22010380 - 05 Jan 2022
Cited by 8 | Viewed by 2488
Abstract
The demand for wheelchairs has increased recently as the population of elderly people and patients with disorders grows. However, society still pays little attention to infrastructure that can threaten wheelchair users, such as sidewalks with cracks and potholes. Although various studies have been proposed to recognize such hazards, they mainly depend on RGB images or IMU sensors, which are sensitive to outdoor conditions such as low illumination, bad weather, and unavoidable vibrations, resulting in unsatisfactory and unstable performance. In this paper, we introduce a novel system based on various convolutional neural networks (CNNs) to automatically classify the condition of sidewalks using images captured in depth and infrared modalities. Moreover, we compare the performance of training CNNs from scratch with a transfer learning approach, where the weights learned from the natural image domain (e.g., ImageNet) are fine-tuned to the depth and infrared image domains. In particular, we propose applying a ResNet-152 model pre-trained with self-supervised learning during transfer learning to leverage better image representations. Performance evaluation on the classification of sidewalk condition was conducted with 100% and 10% of the training data. The experimental results validate the effectiveness and feasibility of the proposed approach and suggest future research directions.
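The transfer learning setup described above, reusing pre-trained representations and training only a task-specific head, can be sketched in a few lines. This toy numpy version (a random projection stands in for the pre-trained self-supervised ResNet-152, and the data and labels are synthetic) only illustrates the "frozen backbone + trainable classifier" pattern:

```python
import numpy as np

rng = np.random.default_rng(0)

# A frozen "backbone" standing in for a pre-trained feature extractor.
W_frozen = rng.standard_normal((64, 8)) / 8.0
def backbone(x):                       # (n, 64) inputs -> (n, 8) features
    return np.tanh(x @ W_frozen)

# Synthetic data; labels are separable in the frozen feature space.
X = rng.standard_normal((200, 64))
F = backbone(X)
v = rng.standard_normal(8)
y = (F @ v > 0).astype(float)          # toy "good / damaged sidewalk" labels

# Trainable head: logistic regression on the frozen features only.
w, b = np.zeros(8), 0.0
for _ in range(500):                   # plain gradient descent
    p = 1 / (1 + np.exp(-(F @ w + b)))
    g = p - y
    w -= 0.1 * F.T @ g / len(y)
    b -= 0.1 * g.mean()
acc = ((1 / (1 + np.exp(-(F @ w + b))) > 0.5) == y).mean()
print(f"train accuracy: {acc:.2f}")
```

In the paper's actual setting the backbone is also fine-tuned rather than fully frozen; the sketch shows why pre-trained features help most when, as with the 10% training-data condition, labeled data is scarce.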

18 pages, 30866 KiB  
Article
Automated Curb Recognition and Negotiation for Robotic Wheelchairs
by Sivashankar Sivakanthan, Jeremy Castagno, Jorge L. Candiotti, Jie Zhou, Satish Andrea Sundaram, Ella M. Atkins and Rory A. Cooper
Sensors 2021, 21(23), 7810; https://0-doi-org.brum.beds.ac.uk/10.3390/s21237810 - 24 Nov 2021
Cited by 3 | Viewed by 3014
Abstract
Common electric powered wheelchairs cannot safely negotiate architectural barriers (i.e., curbs), which can injure the user and damage the wheelchair. Robotic wheelchairs have been developed to address this issue; however, proper alignment by the user is needed prior to negotiating curbs. Users with physical and/or sensory impairments may find it challenging to negotiate such barriers. Hence, a Curb Recognition and Negotiation (CRN) system was developed to increase users' speed and safety when negotiating a curb. This article describes the CRN system, which combines an existing curb negotiation application of a mobility enhancement robot (MEBot) with a plane extraction algorithm called Polylidar3D to recognize curb characteristics and automatically approach and negotiate curbs. The accuracy and reliability of the CRN system were evaluated in detecting an engineered curb of known height from 15 starting positions in controlled conditions. The CRN system successfully recognized the curb at 14 out of 15 starting positions and correctly determined the height and distance for the MEBot to travel towards the curb. Although the MEBot's curb alignment error was 1.5 ± 4.4°, curb ascent was executed safely. The findings support the implementation of a robotic wheelchair to increase speed, reduce human error when negotiating curbs, and improve accessibility.
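Plane extraction underlies the curb-height estimate: once the street and sidewalk surfaces are identified as planes in the depth point cloud, the curb height is the vertical offset between them. A minimal sketch with least-squares plane fits on synthetic points (Polylidar3D itself does much more, including segmenting the planes in the first place):

```python
import numpy as np

def fit_plane_height(points):
    """Least-squares plane z = ax + by + c; returns plane height at the centroid."""
    A = np.c_[points[:, :2], np.ones(len(points))]
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    cx, cy = points[:, :2].mean(axis=0)
    return coeffs @ [cx, cy, 1.0]

rng = np.random.default_rng(0)
def noisy_plane(z, n=200):
    """Synthetic horizontal surface at height z with a little depth noise."""
    xy = rng.uniform(0, 1, (n, 2))
    return np.c_[xy, z + 0.002 * rng.standard_normal(n)]

street, sidewalk = noisy_plane(0.0), noisy_plane(0.15)  # a 15 cm curb
curb_height = fit_plane_height(sidewalk) - fit_plane_height(street)
print(f"estimated curb height: {curb_height:.3f} m")  # ≈ 0.150
```

The estimated height and the horizontal distance to the curb edge are exactly the quantities the CRN system feeds to the wheelchair's approach and ascent routines.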

13 pages, 5047 KiB  
Article
Comparison of Manual Wheelchair and Pushrim-Activated Power-Assisted Wheelchair Propulsion Characteristics during Common Over-Ground Maneuvers
by Mahsa Khalili, Garrett Kryt, W. Ben Mortenson, Hendrik F. Machiel Van der Loos and Jaimie Borisoff
Sensors 2021, 21(21), 7008; https://0-doi-org.brum.beds.ac.uk/10.3390/s21217008 - 22 Oct 2021
Cited by 6 | Viewed by 3191
Abstract
Pushrim-activated power-assisted wheels (PAPAWs) are assistive technologies that use force sensor data to provide on-demand propulsion assistance to manual wheelchair users. However, the available data on the kinetics and kinematics of PAPAW use are mainly limited to experiments performed on a treadmill or using a dynamometer. In this work, we performed experiments to gather the kinetics of wheelchair propulsion and the kinematics of wheelchair motion during a variety of over-ground maneuvers with a manual wheelchair, with and without PAPAWs. Our findings revealed that using PAPAWs can significantly reduce propulsion effort and push frequency. Both linear and angular velocities of the wheelchair were significantly increased when using PAPAWs. Lower force and push frequency could potentially reduce the risk of chronic upper limb injury. Higher linear velocity could be desirable for various daily life activities; however, the increase in angular velocity could lead to unintended deviations from a desired path. Future research could investigate PAPAW controllers that amplify the desired intentions of users while mitigating any unwanted behaviours.
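A push-frequency metric of the kind reported above can be derived from the pushrim force signal by counting threshold crossings. This sketch uses a synthetic force trace and an arbitrary threshold; the paper's actual detection method and parameters may differ:

```python
import numpy as np

def push_frequency(force, fs, thresh):
    """Count pushes as rising threshold crossings of the pushrim force signal."""
    above = force > thresh
    pushes = np.sum(~above[:-1] & above[1:])
    return pushes / (len(force) / fs)  # pushes per second

fs = 100                                  # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)              # 10 s of hypothetical data
# Half-wave-rectified sine stands in for a ~1 push/s propulsion pattern.
force = np.clip(40 * np.sin(2 * np.pi * 1.0 * t), 0, None)
print(push_frequency(force, fs, thresh=10.0))  # ≈ 1.0
```

Comparing this statistic with and without power assist is what quantifies the reduction in push frequency that the study reports.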

Review


17 pages, 996 KiB  
Review
Current Trends and Challenges in Pediatric Access to Sensorless and Sensor-Based Upper Limb Exoskeletons
by Guillaume Gaudet, Maxime Raison and Sofiane Achiche
Sensors 2021, 21(10), 3561; https://0-doi-org.brum.beds.ac.uk/10.3390/s21103561 - 20 May 2021
Cited by 7 | Viewed by 3596
Abstract
Sensorless and sensor-based upper limb exoskeletons that enhance or support daily motor function are limited for children. This review presents the different needs in pediatrics and the latest trends in developing an upper limb exoskeleton, and discusses future prospects for improving accessibility. First, the principal diagnoses in pediatrics and their respective challenges are presented. A total of 14 upper limb exoskeletons aimed at pediatric use were identified in the literature. The exoskeletons were then classified as sensorless or sensor-based, and categorized with respect to application domain, motorization solution, targeted population(s), and supported movement(s). The relative absence of upper limb exoskeletons in pediatrics is mainly due to the additional complexity required to adapt to children's growth and answer their specific needs and usage. This review highlights that research should focus on sensor-based exoskeletons, which would benefit the majority of children by allowing easier adjustment to their needs. Sensor-based exoskeletons are often the best solution for children to improve their participation in activities of daily living and limit cognitive, social, and motor impairments during their development.
