Article

Efficient Upper Limb Position Estimation Based on Angular Displacement Sensors for Wearable Devices

by
Aldo-Francisco Contreras-González
1,
Manuel Ferre
1,
Miguel Ángel Sánchez-Urán
1,2,*,
Francisco Javier Sáez-Sáez
1 and
Fernando Blaya Haro
2
1
Centro de Automática y Robótica (CAR) UPM-CSIC, ETS Ingenieros Industriales, Universidad Politécnica de Madrid, Calle de José Gutiérrez Abascal, 2, 28006 Madrid, Spain
2
ETS Ingeniería y Diseño Industrial, Universidad Politécnica de Madrid, Ronda de Valencia, 3, 28012 Madrid, Spain
*
Author to whom correspondence should be addressed.
Submission received: 9 October 2020 / Revised: 7 November 2020 / Accepted: 9 November 2020 / Published: 12 November 2020
(This article belongs to the Special Issue Sensor Systems for Gesture Recognition)

Abstract

Motion tracking techniques have been extensively studied in recent years; however, capturing movements of the upper limbs remains a challenging task. This document presents the estimation of arm orientation and of elbow and wrist position using wearable flexible sensors (WFSs). A study was developed to obtain the highest range of motion (ROM) of the shoulder with as few sensors as possible, and a method for estimating arm length together with a calibration procedure is proposed. Performance was verified by comparing the shoulder joint angles measured by commercial two-axis soft angular displacement sensors (sADS) from Bend Labs against the ground truth system (GTS) OptiTrack. The global root-mean-square error (RMSE) is 2.93 degrees for the shoulder angle and 37.5 mm for the wrist position estimate in cyclical movements; the latter was improved to 13.6 mm by implementing a gesture classifier.

1. Introduction

In the fields of biomechanics, physiotherapy and kinesiology, motion capture plays a fundamental role in measuring physical activity, whether to assess the performance of a device or the evolution of a therapy. Optical motion capture is one of the most accurate methods for measuring human kinematics [1]. However, this technology is expensive, restricts capture to specific areas and, in most cases, requires the user to wear bulky markers and special suits. Outside of a controlled environment, a common solution is the use of inertial measurement units (IMUs), which in some cases are small, wireless and capable of sampling at high rates [2]. These sensors may be suitable for short periods of time in which the accumulated error is not significant. Nevertheless, data processing places a high demand on hardware resources, since filtering, integration and trigonometric operations are involved in estimating joint angular kinematics [3,4]. Users should also consider movement complexity, sensor placement, the studied joint, the biomechanical models used and the calibration procedure.
The era of flexible systems is on the rise, and a new generation of motion sensors is emerging. Wearable flexible sensors (WFSs) are made of flexible materials that are inexpensive to manufacture, have better mechanical and thermal properties than non-flexible sensors and are lightweight and comfortable for motion capture [5]. Still, there are important features to consider, such as stretching, compression, bending and twisting [6]. Wearable systems offer a promising path towards meaningful diagnosis of shoulder conditions, as well as concise follow-up of rehabilitation progress [7]. The upper extremities of the human body have complex kinematics and contain a large number of degrees of freedom (DOF) [8]; the glenohumeral (shoulder) joint, for instance, is a complex joint with more than three DOF [9]. WFSs could properly estimate the kinematics of the shoulder thanks to their physical characteristics, eliminating rigid parts and, in some cases, even communication cables [10,11,12,13]. It has been proved that it is possible to estimate the angles of the arm with respect to the trunk by placing sensors on the body [14,15,16,17]. Additionally, the use of compression jackets combined with soft sensor arrays makes it possible to accurately estimate the position of the limb [18,19] and to place the sensors onto the skin without adhesives or other preparations that intrude on the user's body. However, a disadvantage of compression garments is that sweat can damage the electronic devices it comes into contact with.
Work-related studies using IMU systems for shoulder orientation estimation reported an RMSE of 15° for flexion/extension movements, an RMSE of less than 20° for abduction/adduction movements and from 1° to 60° for rotation movements [20]. The combination of IMUs with electromyography sensors (EMGs) shows an RMSE of 10.24° in complex tasks performed with the shoulder [21]. Another study, using an IMU system as reference and a smart textile with printed strain sensors, showed a mean error of 9.6° in planar motion measurements of the shoulder joint [22]. For lower limbs using WFSs, the maximum RMS error was nearly 15° for the knee sensors [23]. The authors of [24] developed a dynamic measurement system for leg movement, which can detect the squat position with an accuracy of 3° and walking with an accuracy of 5°. WFSs made of elastomers were used for sensing hand gestures, obtaining an error of 5° [25], which is acceptable for the majority of motion capture requirements.
In robotics, the end-effector position is calculated with kinematic equations from the angles of each joint and the length of each link; this means that, to estimate the position of the wrist, the angles of the arm and forearm and their lengths must be known. In this work, the goal is to sense four DOF in the arm [26,27]: three DOF in the shoulder and one in the elbow. In addition, a kinematic algorithm is proposed to find the position of the elbow and wrist. The estimation relies mainly on the signals from the WFSs; the IMU system is used for one single DOF. This document describes a WFS-based method for obtaining upper limb orientation and position, applied here to the human right arm. A calibration algorithm to estimate the length of the arm and forearm of each subject, and an algorithm to estimate the position of the elbow and the wrist relative to the shoulder, are proposed. In Section 3, the placement of the sensors and the required calibration for each user are described, and the data fusion methods and the filters applied to the resulting signals are discussed. The device presented here eliminates the rigid elements that might interfere with daily tasks and movements, allowing the user freedom and mobility. Furthermore, compared with previous work by the authors, described in Section 2, the number of sensors was reduced and the computational cost was improved.

2. Previous Work by the Authors

It was proved that it is possible to obtain 95% of the variance of the main components for shoulder gestures with an array of seven single-axis resistive sensors [19]. Other results showed that it is possible to estimate shoulder gestures with an accuracy of 95.4% using an array of four WFSs and EMG signals [28]. In this work, a rigorous extension was performed using the two-axis sADS, whose operation is explained and detailed in [29,30]. Features of the sADS include a linear and bidirectional response and measurement of angular displacement in two orthogonal planes with a sensitivity of 0.016° and a repeatability of 0.18°. These capacitive sensors are made of layered medical-grade silicone elastomers doped with conductive and nonconductive fillers.
Initially, four sADS were placed in the intermediate positions of the seven single-axis resistive sensor configuration recommended by [19]. When the proposed 20-layer hidden neural network method was replicated with the data acquired from the four sADS, split [70%, 15%, 15%] into training, cross-validation and test stages, respectively, overfitting was identified. To lessen the chance of overfitting, a principal component analysis (PCA) was performed; the sensor placed over the middle deltoid muscle was found to contain the most activity in pure abduction and adduction movements, while the sensors at the rear, placed over the posterior deltoid, were representative of horizontal adduction and of flexion and extension movements. A 92% representation of the ROM was obtained using only two sADS; the study and location of each sensor are detailed in the following subsection.

sADS Array Location

Three different arrangements were evaluated at different locations on the shoulder, as shown in Figure 1. The arrangement resulting from the analyses was the placement of two sADS rotated 90° between two planes, which represents 92% of the variance of the data (see Table 1).

3. System Overview

This study was performed on a single healthy limb without reduced mobility, the right arm; however, it can be replicated for both arms. The area considered in the estimation extends from the centre of the head of the humerus to the centre of the junction of the radius and the ulna at the wrist. The supination and pronation gestures were not considered.

3.1. Working Range

The arm at rest, with the elbow fully extended, was considered the natural pose, as shown in Figure 2a.
The zero-position of the system occurred when a pure shoulder abduction gesture reaching 90° (without performing any flexion or extension of the shoulder) and a full extension of the elbow were performed, starting from the natural pose.

3.2. Sensors Placement

Placing sensors on a compression jacket presents various difficulties, the most important being wrinkles and folds in the fabric and skin tightening in specific areas. To eliminate these problems, two small rigid parts were designed to pose and guide the two-axis sADS of Bend Labs [30]. One of them holds the sensor in a fixed location and the other allows it to slide inside, not only eliminating the problem of skin stretching but also creating a straight line along the limb, which allows accurate measurement of the angle (see Appendix A.1). The pieces are sewn onto the shirt and do not impede the free mobility of the user in any way.
As a result of the PCA for the location of the sADS on the shoulder, the best arrangement was selected with respect to the number of sensors and the representation of the movements performed. It is worth mentioning that each sensor was initially placed as recommended by the previous study; the adjustments to the final positions were identified after several data captures and subsequent analysis. The location description is detailed in Appendix A.2.
The IMUs were placed on opposite sides of the arm (Figure 3). IMU 1 was located just where the shoulder ends and IMU 2 just before the beginning of the elbow. Both IMUs should be oriented in the same direction as the arm.

3.3. Shoulder Rotation

As the study presented in this document was implemented in an exosuit developed in our group [31], IMU-based solutions were discarded for the first two DOF (abduction/adduction and flexion/extension). The third DOF of the shoulder (Figure 2e, medial/lateral rotation) is revealed at the forearm and can be measured at the elbow. Given that the shoulder sADS array does not detect the rotation gesture, the use of IMUs is proposed exclusively for this gesture.
Two IMUs were placed on the arm (same link) as shown in Figure 3. The rotation was calculated as the difference between the rotation angles on the x-axis of the two IMUs (their yaw readings). This angle, δ₁, is generated by the shoulder rotation, thus completing the variables necessary to estimate the elbow and wrist position:

$$\delta_1 = \Delta\left(\mathrm{IMU}_1^{yaw}, \mathrm{IMU}_2^{yaw}\right) \tag{1}$$
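A minimal sketch of this yaw-difference computation, assuming the angles are reported in degrees and adding wrap-around handling (an implementation detail not specified in the text):

```python
def shoulder_rotation(yaw_imu1: float, yaw_imu2: float) -> float:
    """Estimate the shoulder rotation delta_1 (Equation (1)) as the
    difference between the yaw readings of the two arm-mounted IMUs."""
    diff = yaw_imu1 - yaw_imu2
    # Wrap to [-180, 180) so a gesture crossing the +/-180 deg boundary
    # does not produce a spurious ~360 deg jump.
    return (diff + 180.0) % 360.0 - 180.0
```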

3.4. Ground Truth

To verify the estimation proposed in this article, the OptiTrack system was used to provide ground truth data. A total of 10 reflective markers and four cameras from the OptiTrack system [32] were used, providing submillimetre precision. The marker placement generated straight lines between each section to be analysed. Two markers were placed on the back to capture the inclination of the user, two on the shoulder (just above the fixed parts of the shoulder sADS), two on the arm, two on the forearm, one on the elbow and two on the wrist [33], as shown in Figure 3.

3.5. Data Acquisition

The three sADS were connected to a custom acquisition board based on the LAUNCHXL-F28379D development board and communicated with the micro-controller via the I2C protocol at a frequency of 200 Hz. The IMUs and the custom acquisition board were connected to an NVIDIA Jetson Nano via LpBUS and SPI at 400 Hz and 500 Hz, respectively (Figure 4). The LPMS-URS2 9-axis IMU manufactured by LP-RESEARCH, with an on-board Kalman processing stage and a gyroscope, accelerometer and magnetometer, was selected. This sensor delivers filtered data at frequencies of up to 400 Hz with a resolution of 0.01°. Software was developed on the Jetson Nano board to trigger the sensors and store all sensor data along with a timestamp.
To perform data collection for session testing, the compression jacket with the sADS and IMUs was placed on the subject. Then, OptiTrack markers were located as shown in Figure 3 in order to obtain the real position of the subject's arm. Both systems started the data capture operation almost at the same time, first in the OptiTrack software and then in the Jetson Nano. The Jetson Nano stored the following in a file (named with the start time in milliseconds): the six angles received from the sADS; the Euler angles, quaternions and timestamp of each of the two IMUs; the gesture performed at that timestamp; and a timestamp generated by the Jetson Nano itself. The OptiTrack software stored the x, y and z coordinates in space of each reflective marker in a file whose name is the start time in milliseconds. To synchronise the data files of each session (Appendix A.3), the start time and the sampling frequency of each file were considered.

3.6. sADS Behaviour

The angular displacement of the sADS is obtained from a differential capacitance measurement. The sensor repeatability, according to the manufacturer, is 0.18°. However, when sampling the sensor at 200 Hz, noise with a maximum amplitude of nearly 3° was observed. A set of samples obtained while bending the sensor was stored, and a frequency domain analysis was performed on them. Following this frequency spectrum analysis, a low-pass first-order infinite impulse response (IIR) digital filter with a cut-off frequency of 10 Hz was applied to the raw data provided by the sensor to reduce noise. Finally, a ±0.15° dead zone for the sampled data was added to the filter (see Figure 5).
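As an illustration, here is a minimal Python sketch of this conditioning stage; the exact discretisation of the first-order IIR filter is not given in the text, so a standard exponential form derived from the 10 Hz cut-off and 200 Hz sampling rate is assumed:

```python
import math

class SadsFilter:
    """Low-pass first-order IIR filter with a +/-0.15 deg dead zone."""

    def __init__(self, f_cut=10.0, f_s=200.0, dead_zone=0.15):
        # First-order IIR coefficient for the chosen cut-off frequency.
        self.alpha = 1.0 - math.exp(-2.0 * math.pi * f_cut / f_s)
        self.dead_zone = dead_zone  # degrees
        self.y = 0.0                # last filtered output

    def update(self, x: float) -> float:
        y_new = self.y + self.alpha * (x - self.y)  # y[n] = y[n-1] + a(x[n] - y[n-1])
        # Dead zone: ignore changes smaller than +/-0.15 deg.
        if abs(y_new - self.y) > self.dead_zone:
            self.y = y_new
        return self.y
```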

4. Estimation Methods

4.1. Angle Fusion Algorithm

The fusion of the sensor signals uses the four angles given by the two sADS. Sensor positions and the definition of the planes are described in Section 3.1. To find the angle in the XY plane, the xy upper-sADS and xz back-sADS angles are used; for the XZ plane, the xz upper-sADS and xy back-sADS angles are needed (see Table 2). The data fusion algorithm is detailed in Appendix B.1.

4.2. Method to Estimate Arm and Forearm Length

An important source of error in calculating the position in space of the wrist and elbow with respect to the shoulder was detected: the use of a measuring tape to determine the length of each link (the lengths of the arm and forearm). To reduce this error, an L-shaped structure was proposed. This calibration method consists of a plane marked by four points at known distances, from which the arm and forearm lengths are estimated by trigonometric methods.
The structure was created with the purpose of marking fixed distances from the shoulder to points in space, and it was arranged with four points: two fixed and two movable. Over each of the four points there is a vertical pin with a magnet on top, which pairs with a bracelet worn by the user that locates a magnet at the centre of the wrist. By performing a series of defined movements (Appendix B.2), the sADS were calibrated and the arm length was calculated with trigonometric equations.

4.3. Elbow and Wrist Position Estimation

To estimate the position of the elbow and the wrist, it is necessary to know the angles α₁ (plane XY), ϕ₁ (plane XZ), the rotation of the shoulder δ₁ (Equation (1)) and the elbow flexion angle α₂, as well as the distances of each segment: shoulder to elbow (arm length, Equation (A16)) and elbow to wrist (forearm length, Equation (A17)). The kinematic model considered in this work consists of four revolute joints. The first three joints account for the glenohumeral joint. It is commonly accepted that this articulation contains two translational DOF; however, since the surfaces of the head of the humerus are more than 99% spherical [34], it is modelled as a spherical joint. The fourth joint represents the elbow, where pronation and supination are neglected. Finally, to estimate the position of the elbow and wrist, the Denavit–Hartenberg notation is used (Appendix B.4).

5. Experiment

The experiment comprised two scenarios. The first consisted of a table, a chair and the structure described in Section 4.2 in order to perform the length estimation (system calibration). The second was needed to acquire the positions with the ground truth system (GTS) and to perform the analysis of the acquired data (ground truth pose). A graphical user interface developed on the Jetson Nano board allowed calibrations to be performed, saving the values for each user and the gesture performed, and guided the speed and kind of movement of the participants while performing the gestures. Once the estimates of arm and forearm length were saved, these data were used for position estimation. All the acquired data were stored and later analysed with the MATLAB R2020a software tool.
The study involved nine subjects, two females and seven males, for both scenarios. The subjects were aged 27 ± 4.4 years (mean ± standard deviation (SD)), with a body weight of 81.7 ± 15.6 kg and a height of 1.83 ± 0.16 m. No participant had evidence or a known history of skeletal or neurological disease, and all exhibited normal joint ROM and muscle strength. All experimental procedures were carried out in accordance with the Declaration of Helsinki on research involving human subjects and approved by the ethical committee of the Polytechnic University of Madrid.

5.1. System Calibration

The arm calibration algorithm (Section 4.2) was tested with nine subjects, measuring the arm length (from shoulder flexion point to elbow flexion point) and the forearm length (from the point of flexion of the elbow to the centre of the point of flexion of the wrist) with a tape measure and placing the reflectors of the OptiTrack ground truth system (GTS) at each point (see Table 3).
The angles were captured by the interface, which displayed them on screen along with the time of capture and the estimates of arm and forearm length, obtained using Equations (A14) and (A15) described in Appendix B.2 with a distance $\overline{AB}$ = 39.5 cm. The calibration frame is shown in Figure 6. The estimated distances were then compared with the GTS.
The performance of the developed frame is shown in Table 4: an RMSE of [0.540, 2.280] mm (best, worst case) for the arm and an RMSE of [0.200, 3.130] mm for the forearm.

5.2. Ground Truth Pose

The tests were carried out in four different sessions, and the data acquired were analysed later. The first session consisted of a sequence of five abductions/adductions, five flexions/extensions and five horizontal abductions. This session was performed with visual feedback through a graphical interface projected on a monitor, where the time of each gesture (e.g., flexion) was four seconds with one second of rest; that is, each complete movement (e.g., flexion + extension) had a duration of ten seconds. In the second session, free movements were performed for 30 s through the lower shoulder area, that is, obtaining only negative values according to the working range shown in Figure 2. The third session consisted of free movements in the upper area. In the fourth session, free movements were performed throughout the entire working range. Each subject performed each whole session five times.
The shoulder, elbow and wrist markers were used to compare the position given by the GTS with the estimation made by the proposed algorithm. The back, arm, forearm and shoulder markers were used to find the angles generated by the poses. The shoulder marker was placed on a 12 mm raised pin on the z-axis; this elevation was therefore subtracted before the comparison, this marker being the reference point (P) of the system (X₀, Y₀, Z₀). The pose of the other markers was defined by subtracting the reference point from the position of each marker given by the GTS, that is, $x_n = x_n^{GT} - x_0$.

5.3. Orientation Method

The performance of the orientation estimation was evaluated offline using the RMSE in two respects: the angles acquired by the sADS, and the estimated elbow and wrist positions (computed with the segment lengths obtained with the L-shaped structure) compared to the GTS (Figure 7). The estimation of the elbow and wrist position depended on the angles measured by the sADS in the experiments. The positions of the GTS markers were used to find the error of the angles; the reference angle was obtained by calculating the angle generated between the markers on the arm (placed over the shoulder acromion bone) and those on the back. The data from the experiments were classified into cyclical movements, free movements, movements below the shoulder, movements over the shoulder and each gesture performed. The angles acquired by the different sensors and the angles generated by the marker locations were then compared.
Table 5 shows that free movements and movements carried out over the shoulder had a greater error than cyclical movements and those carried out below the shoulder. In addition, it was observed that the sensor placed on the back measured the flexion/extension and abduction/adduction gestures more easily, while the sensor placed on top was a better solution for horizontal adduction and over-the-shoulder movements.
Using the algorithm proposed in Section 4.3 and the angles captured in the experiments, it was possible to compare the estimated elbow and wrist positions with the GTS. The largest error obtained in the entire experiment was 124.3 mm, corresponding to the wrist position in a free movement exercise over the shoulder, and the smallest was 36 mm, corresponding to the elbow in a cyclical movement below the shoulder. Table 6 shows the RMSE of all captured movements, classified by elbow and wrist in free and cyclical movements, above and below the shoulder.
The mean error of the position estimate was 43.4 mm for the elbow and 57.2 mm for the wrist over the three axes. Taking the arm length of an average adult as 300 mm (and roughly 600 mm from shoulder to wrist), the elbow error is equivalent to 14.47% of the shoulder-to-elbow length and the wrist error to 9.53% of the shoulder-to-wrist length, rising in the worst case to up to 27%. This means that, with the arm fully extended, the error in space is about as large as half the palm of an adult's hand.
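As a quick check of these percentages, assuming the 300 mm arm stated above and an approximate shoulder-to-wrist distance of 600 mm (an assumption, not a figure from the study):

```latex
\frac{43.4~\text{mm}}{300~\text{mm}} \approx 14.47\,\%,
\qquad
\frac{57.2~\text{mm}}{600~\text{mm}} \approx 9.53\,\%
```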
To improve performance in cyclical movements, a classifier was developed; it identifies the gesture performed and also whether the movement corresponds to the area below or above the shoulder. The classifier was selected by comparing 24 different methods (see Table A2 in Appendix C.1). Once the best method was obtained and trained, a function was created in MATLAB R2020a that uses the upper sADS and back sADS angles to classify the motion.
After the classification algorithm determined to which gesture the movement belonged, the method described in Appendix C.2 was applied, using a different weight for each case through Equation (A11). The weights (Table 7) were found empirically using the data of the participants in the first test.
By evaluating the fusion algorithm together with the gesture classifier, an overall improvement in angle measurement was observed. This algorithm uses the sensor signal that performs best (upper or back sensor) for each type of gesture, resulting in the smallest measurement error of the shoulder angles. The RMSE for cyclical movements was 2.93°. In the case of free movements, the algorithm was allowed to interpret each signal from the sADS and classify the movements, obtaining a significant improvement in the angles, with an RMSE of 3.71°. Because the mean angle error decreased, the position estimate improved (Table 8). It should be mentioned that the largest position estimation error using the weighted fusion algorithm was 60.7 mm, 43% less than the largest error without weighting; the smallest was 12 mm, only 19 mm better than the smallest without weighting.
The overall RMSE was 26.7 mm. When the movements were performed below the shoulder or cyclically (as is the case in rehabilitation processes), the error decreased. The horizontal adduction gesture generated a greater estimation error, since it is carried out at the boundary between the movements classified as above and below the shoulder; furthermore, this gesture shares movement zones with all the other gestures.

6. Discussion

During the calibration process, it was difficult to place the frame correctly without an appropriate measurement system for the angles generated by the shoulder. It was identified that the initial position calibration could be improved using structures fixed to the ground or to the user's hip. With an abduction of more than 90°, a scapular movement was generated, causing a displacement of the superior sADS and, consequently, errors in the measurement. In experiments of over 30 min with users wearing the shirt, no evident change in the behaviour of the sADS signals was observed over prolonged periods; the IMUs were re-calibrated in each data collection. In this work, IMUs were used only for the shoulder rotation gesture; for defined tasks, the sensing of shoulder rotation could be eliminated, removing the IMUs and simplifying donning/doffing and the estimation calculations. This work will be applied to an exosuit and integrated into an online system; that system may require a classification method different from the one used in this study because of response-time constraints, and an investigation of robustness is needed.
The maximum calculated error of the calibration of the OptiTrack cameras was 0.343 mm, which should be considered in the comparison with the method for estimating the length of the arm and forearm, as results may improve or worsen within that range. If the noise from the sADS is not reduced, a projected noise is produced at the wrist. Similarly, not using the arm and forearm distances estimated with the L-shaped structure increases the error. These behaviours increase the angle error by up to 3.5° and are projected onto the final position (wrist), increasing its error by up to ≈50 mm.
Our results are similar to those of [35], where, for cyclic movements, there is an RMSE of [1.42°, 3.89°] (best, worst) and, for free or random movements, [1.88°, 4.41°]. The authors of [35] also searched for the optimal sensor arrangement, finding four sensors essential for orientation estimation; in contrast, we found that two sensors can represent 92% of the ROM without adding scapular movement. Similar studies used piezoresistive strain sensors adhered directly to the skin to estimate shoulder ROM; the comparison between reference data from OptiTrack and the strain sensors showed an RMSE of less than 10° in shoulder flexion/extension and abduction/adduction estimation [36]. In the case of [37], which used a smart compression garment, the best result for elbow angle measurement had an error of 9.98°. An interesting study using 20 self-made soft strain sensors [38] showed an overall full-body motion tracking position error of [29.5, 38.7] mm (best, worst case), with results of [30.9, 47.1] mm for the shoulder joint using just two sensors, [13.5, 34.7] mm for the elbow and [27.3, 43.5] mm for the wrist, using OptiTrack as ground truth.

7. Conclusions

Wearable flexible sensors are becoming a common technology for motion capture. The shoulder is a complex joint that plays an important role in activities of daily living; therefore, designing a portable and accurate arm motion capture system is challenging. This document reports a device capable of achieving a mean RMSE of 3.54° in the angle measurement of the shoulder. Using these angles and the lengths of the arm and forearm, and comparing with a GTS, an RMSE of 29.1 mm was found for the estimation of the wrist position. This performance may be useful in physiotherapy, kinesiology or gaming applications. However, given the lack of millimetre-level accuracy, the use of this device for robotic or biomechanical systems could be limited. Still, there is great potential in applications that require monitoring of the human body in open spaces.

Author Contributions

Conceptualization, A.-F.C.-G.; Data curation, F.J.S.-S.; Formal analysis, A.-F.C.-G.; Investigation, A.-F.C.-G.; Project administration, M.F.; Supervision, M.F., M.Á.S.-U. and F.B.H.; Writing—original draft, A.-F.C.-G.; Writing—review & editing, A.-F.C.-G. All authors have read and agreed to the published version of the manuscript.

Funding

This work has been partially supported by the project “LUXBIT: Lightweight Upper limbs eXosuit for BImanual Task Enhancement” under RTI2018-094346-B-I00 grant, funded by the Spanish “Ministerio de Ciencia, Innovación y Universidades”.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. System Overview Detailed Description

Appendix A.1. 3D Printed Parts for sADS

Figure A1. Design of the parts for the sADS sensors. (a) Ensures the correct posing on the fixed part. (b) Allows correct sliding on the movable part, not only eliminating the problem of skin stretching but also creating a straight line formed by the limb.

Appendix A.2. sADS Placement

We created two identical shirts using the following method to place two sensors (see Figure 3) for the shoulder and one for the elbow:
  • With the shoulder abducted at 90°, a sADS is placed over the printed parts and perched on top of the shoulder. Then, the arm is centred in the XY plane and the user returns to rest. The fixed part is placed 25.0 mm from the corner of the generated angle, and the long sliding part is placed on the vertical part of the arm. The distance between the pieces must be between 50 and 60 mm.
  • The back sADS is placed at 40° (measured from the 90° abduction), right in the centre of the arm, drawing a straight line from the sADS on the upper shoulder.
  • The elbow sADS is placed on the outside of the arm, with the centre of the sensor coinciding with the elbow hinge. The distance between the pieces should be between 50 and 60 mm.

Appendix A.3. Synchronisation Algorithm

To synchronise the files, the frequencies of each device, the start time and the duration of each take are used. The file that starts first is trimmed until both files have the same start time; the file that ends later is trimmed until both have the same end time. Finally, a linear interpolation is performed to ensure the same number of samples in each file:
  • The start time of each file is acquired. The file that started first is identified as file A, the other as file B, with start times $t_{initA}$ and $t_{initB}$, and the number of samples of each file is obtained: $len_A = \mathrm{length}(A)$, $len_B = \mathrm{length}(B)$.
  • The final time of each file is obtained, $t_{endA} = t_{initA} + lastTimestamp_A$ and $t_{endB} = t_{initB} + lastTimestamp_B$, and the real frequency is given by: $fq_A = \dfrac{len_A}{t_{endA} - t_{initA}}$, $fq_B = \dfrac{len_B}{t_{endB} - t_{initB}}$.
  • The initial $n$ samples are cut out from file A: $n = \mathrm{ceil}\left((t_{initB} - t_{initA}) \cdot fq_A\right)$.
  • Check which file ends first. The difference between the end times is $diff = \mathrm{abs}(t_{endA} - t_{endB})$. The final $m$ samples are cut out from the file that ends later: if $t_{endA} > t_{endB}$, then $m = \mathrm{ceil}(diff \cdot fq_A)$; else, $m = \mathrm{ceil}(diff \cdot fq_B)$.
  • Calculate $factor = \mathrm{length}(fileA)/\mathrm{length}(fileB)$ and generate a new index vector with the new length: $newVec = \mathrm{linspace}\left(1, \mathrm{length}(fileA), \dfrac{\mathrm{length}(fileA)}{factor}\right)$. The interpolation is then performed for each column of the original file: $fileA_{interpol} = \mathrm{interp1}(fileA, newVec, \mathrm{'linear'})$.
MATLAB software is used in this algorithm to perform the linspace and data interpolation operations.
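The paper performs these steps in MATLAB; the following NumPy sketch is an equivalent reading of the algorithm, assuming file A started first (as in the text) and that each file is a samples × channels array with times in milliseconds:

```python
import numpy as np

def synchronise(file_a, file_b, t_init_a, t_init_b, t_end_a, t_end_b):
    """Trim and resample two recordings so they share start/end times
    and sample count (sketch of Appendix A.3)."""
    fq_a = len(file_a) / (t_end_a - t_init_a)  # real sampling frequency of A
    fq_b = len(file_b) / (t_end_b - t_init_b)  # real sampling frequency of B

    # Cut the initial n samples from the file that started first (A).
    n = int(np.ceil((t_init_b - t_init_a) * fq_a))
    file_a = file_a[n:]

    # Cut the final m samples from the file that ends later.
    diff = abs(t_end_a - t_end_b)
    if t_end_a > t_end_b:
        file_a = file_a[:len(file_a) - int(np.ceil(diff * fq_a))]
    else:
        file_b = file_b[:len(file_b) - int(np.ceil(diff * fq_b))]

    # Linearly interpolate each column of A onto B's sample count.
    factor = len(file_a) / len(file_b)
    new_idx = np.linspace(1, len(file_a), int(round(len(file_a) / factor)))
    old_idx = np.arange(1, len(file_a) + 1)
    file_a_interp = np.column_stack(
        [np.interp(new_idx, old_idx, col) for col in file_a.T])
    return file_a_interp, file_b
```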

Appendix B. Estimation Method Detailed Description

Appendix B.1. Fusion Sensor Algorithm

Applying the Central Limit Theorem [39], θ₁ and θ₂ are taken to denote the two sADS measurements on the same plane, with noise variances σ₁² and σ₂², respectively. The fused estimate is:

$$\theta_3 = \sigma_3^2\left(\sigma_1^{-2}\,\theta_1 + \sigma_2^{-2}\,\theta_2\right)$$

where the variance of the combined estimate is:

$$\sigma_3^2 = \left(\sigma_1^{-2} + \sigma_2^{-2}\right)^{-1}$$

with the variance defined as:

$$\sigma^2 = \frac{1}{N-1}\sum_{i=1}^{N}\left|A_i - \mu\right|^2$$

and μ the mean of A:

$$\mu = \frac{1}{N}\sum_{i=1}^{N} A_i$$

The estimated variance values for the sensors in the two planes are $\sigma_{XY} = (203.013 + 218.315)/2$ and $\sigma_{XZ} = (205.523 + 239.986)/2$; therefore, $\sigma_3^2 = (20860.176 + 26079.470)/2$. For the fusion of the sensors with the compensation given by the gesture classification, two factors are added as follows:

$$\theta_3 = \sigma_3^2\left(\sigma_1^{-2}\,aFactor\,\theta_1 + \sigma_2^{-2}\,bFactor\,\theta_2\right) \tag{A11}$$
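A minimal sketch of this inverse-variance weighting; aFactor and bFactor default to 1, which reduces the weighted form (Equation (A11)) to the plain fusion:

```python
def fuse_angles(theta1, theta2, var1, var2, a_factor=1.0, b_factor=1.0):
    """Inverse-variance weighted fusion of two sADS angle readings
    measured on the same plane (plain and weighted forms above)."""
    var3 = 1.0 / (1.0 / var1 + 1.0 / var2)  # variance of the combined estimate
    return var3 * (a_factor * theta1 / var1 + b_factor * theta2 / var2)
```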

Appendix B.2. L-Shaped Structure Details

In the design of the structure, it is recommended that the distance of segment $\overline{AB}$ be approximately 60 cm, since this is the average length of an adult's arm, and that the magnitude of segment $\overline{DC}$ be 1.33 times segment $\overline{AD}$. The length of the vertical pins is irrelevant, since their projection is considered in the calculations. It is worth mentioning that this method can be applied to both arms by replicating the structure in mirror mode for the opposite side. Points B and B′ must move together on the x-axis, and point B′ also moves on the y-axis. Points A and C are fixed to the structure, as shown in Figure A2.
The structure (Figure 6) is located on a table, with the point A on the right side of the user (to align with the z-axis) and point B in front of them. The steps to make the estimation are the following:
  • An abduction of the shoulder is performed at 90°, leaving the elbow at 0° and achieving an angle of 0° ± 3° in the XY and XZ planes. While the user maintains this pose, the entire structure is moved so that the magnet on the pin at the fixed point A touches the magnet at the centre of the wrist.
  • The user extends the arm to the front, performing a shoulder flexion at 90° and keeping the elbow at 0°. The movable point B is adjusted on the x-axis of the structure, making it coincide with the forearm; point B′ is then adjusted along the y-axis until it coincides with the new position of the magnet on the wrist, preserving the previous point (A). This way, a right angle is generated between points A and B, as shown in Figure A2.
  • The wrist is then moved from point B to point B′, generating a flexion in the elbow. The angles generated with this gesture are acquired (see Figure A2b) with the help of the sADS.
  • Finally, the centre of the wrist is placed on point C, preserving the pose of the structure, and the angles generated with this gesture are acquired (see Figure A2c) with the help of the sADS.
Conditions:
  • The angles generated in steps 1 and 2 towards points A and B must be 0° ± 3° and 90° ± 3° in the XY plane, respectively. At the elbow, they must be 0° ± 3° in XY in both steps.
  • Step 3 is performed with the shoulder angle greater than 40° and the elbow angle greater than 10°, both in the XY plane.
  • Step 4 is performed with the shoulder angle greater than 90° and the elbow angle greater than 10°, both in the XY plane.
Figure A2. A top view representation of the structure and the angles generated by the arm: (a) the solid black line represents the structure and its properties, and the dotted line represents the pose that the user must adopt in the main gestures; (b) arm pose at point B to estimate the length of the arm (green) and forearm (blue) with the known distance $\overline{AB}$; (c) gesture performed for the redundant calculation towards point C.
To obtain the length estimates of the segments of an upper limb, the angle of the shoulder in the XY plane is denoted α₁ and in the XZ plane ϕ₁. The angle of the elbow is defined as α₂.
L₁ and L₂ are the projections on the XY plane of the segment of the humerus and of the segment generated by the radius and ulna bones. The value of segment $\overline{AB}$ is a distance known from the structure. Therefore, for the gesture generated in step 3, it can be said that:

$$\theta_1 = \frac{\pi}{2} - \alpha_1$$

and, by Euclid's fifth postulate,

$$\theta_2 = \pi - (\theta_1 + \alpha_2)$$

By the law of sines, and considering the angle of the plane XZ:

$$arm_{length} = \frac{\overline{AB}\,\sin(\theta_2)}{\sin(\alpha_2)\,\cos(\phi_1)} \tag{A14}$$

$$forearm_{length} = \frac{\overline{AB}\,\sin(\theta_1)}{\sin(\alpha_2)\,\cos(\phi_1)} \tag{A15}$$

In step 4, point C is used only as a redundant point in the calculation, using the first two estimated distances L₁ and L₂. By the law of sines, and considering the angle of the plane XZ, the redundant estimation is found as:

$$arm_{length} = \frac{\overline{SC}\,\sin(\theta_5)}{\sin(\alpha_2)\,\cos(\phi_1)}$$

$$forearm_{length} = \frac{\overline{SC}\,\sin(\theta_4)}{\sin(\alpha_2)\,\cos(\phi_1)}$$
The explanation to get to this point is presented in Appendix B.3. This arm and forearm length estimation process needs to be done only once per user.
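A small sketch of the step-3 computation under the grouping reconstructed in Equations (A14) and (A15), with angles in radians and the known distance $\overline{AB}$ in millimetres:

```python
import math

def estimate_lengths(ab, alpha1, alpha2, phi1):
    """Arm and forearm lengths from the step-3 pose.
    ab: known distance AB; alpha1: shoulder angle (XY plane);
    alpha2: elbow flexion; phi1: shoulder angle (XZ plane)."""
    theta1 = math.pi / 2 - alpha1          # angle at A
    theta2 = math.pi - (theta1 + alpha2)   # Euclid's fifth postulate
    # Law of sines on the XY projection, corrected by the XZ angle phi1.
    arm = ab * math.sin(theta2) / (math.sin(alpha2) * math.cos(phi1))
    forearm = ab * math.sin(theta1) / (math.sin(alpha2) * math.cos(phi1))
    return arm, forearm
```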

Appendix B.3. Redundant Calculation of Arm and Forearm Length

Redundant calculations are possible once the first estimate is obtained. Considering Figure A2 and using the first two estimated distances L₁ and L₂, it is possible to estimate the distance d₁, where d₁ is the projection of the total length of the frame minus the length of the extended arm:

$$d_1 = \overline{CD} - (L_1 + L_2)$$

The distance $\overline{SC}$ generates a right angle, being the hypotenuse for $\overline{AB}$ and $d_1$:

$$\overline{SC} = \sqrt{\overline{AB}^2 + d_1^2}$$

Then, the angle θ₃ can be easily calculated as:

$$\theta_3 = \tan^{-1}\left(\frac{\overline{AB}}{d_1}\right)$$

Since the angle generated by the arm, α₁, is known, and having calculated θ₃, it can be said that:

$$\theta_4 = \pi - (\theta_3 + \alpha_1)$$

and, from Euclid's fifth postulate,

$$\theta_5 = \pi - (\theta_4 + \alpha_2)$$

By the law of sines, and considering the angle of the plane XZ, the redundant estimation is found as:

$$arm_{length} = \frac{\overline{SC}\,\sin(\theta_5)}{\sin(\alpha_2)\,\cos(\phi_1)}$$

$$forearm_{length} = \frac{\overline{SC}\,\sin(\theta_4)}{\sin(\alpha_2)\,\cos(\phi_1)}$$

Appendix B.4. Denavit–Hartenberg Parameters

The Denavit–Hartenberg parameters and the $A_i$ definitions of the kinematic chain are given in Table A1 and Equation (A25), where the four quantities $a_i$, $d_i$, $\alpha_i$ and $\theta_i$ are associated with link i and joint i, corresponding to axis i in Table A1. The arm length is abbreviated $a_l$ and the forearm length $f_l$.

$$A_i = \begin{bmatrix} \cos\theta_i & -\sin\theta_i\cos\alpha_i & \sin\theta_i\sin\alpha_i & a_i\cos\theta_i \\ \sin\theta_i & \cos\theta_i\cos\alpha_i & -\cos\theta_i\sin\alpha_i & a_i\sin\theta_i \\ 0 & \sin\alpha_i & \cos\alpha_i & d_i \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{A25}$$

Table A1. Denavit–Hartenberg parameters of the kinematic model.

Axis i | $a_i$ | $d_i$ | $\alpha_i$ | $\theta_i$
1 | 0 | 0 | $\pi/2$ | $\alpha_1$
2 | 0 | 0 | $\pi/2$ | $\phi_1$
3 | 0 | $a_l$ | $-\pi/2$ | $\delta_1$
4 | 0 | 0 | $\pi/2$ | $\alpha_2$
5 | 0 | $f_l$ | 0 | 0
The T-matrices are thus given by:

$$T_3^0 = A_1 A_2 A_3$$

and

$$T_5^0 = A_1 A_2 A_3 A_4 A_5$$

The first three rows of the last column of $T_3^0$ are the x, y and z components of the origin at the elbow; that is, abbreviating $c(\cdot) = \cos(\cdot)$ and $s(\cdot) = \sin(\cdot)$:

$$elbow_x = a_l\,c(\alpha_1)\,s(\phi_1)$$

$$elbow_y = a_l\,s(\alpha_1)\,s(\phi_1)$$

$$elbow_z = -a_l\,c(\phi_1)$$

and the location of the wrist is defined by $T_5^0$:

$$wrist_x = a_l\,c(\alpha_1)s(\phi_1) + f_l\left[s(\alpha_2)\big(s(\alpha_1)s(\delta_1) + c(\alpha_1)c(\phi_1)c(\delta_1)\big) + c(\alpha_1)c(\alpha_2)s(\phi_1)\right]$$

$$wrist_y = -f_l\left[s(\alpha_2)\big(c(\alpha_1)s(\delta_1) - c(\phi_1)c(\delta_1)s(\alpha_1)\big) - c(\alpha_2)s(\alpha_1)s(\phi_1)\right] + a_l\,s(\alpha_1)s(\phi_1)$$

$$wrist_z = -f_l\left[c(\phi_1)c(\alpha_2) - c(\delta_1)s(\phi_1)s(\alpha_2)\right] - a_l\,c(\phi_1)$$
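The same chain can be evaluated numerically instead of symbolically. The sketch below builds the link transforms of Table A1 and reads the elbow and wrist positions from $T_3^0$ and $T_5^0$; the signs of the twist angles follow the reconstruction above and should be checked against a reference measurement:

```python
import numpy as np

def dh_matrix(a, d, alpha, theta):
    """Standard Denavit-Hartenberg link transform (Equation (A25))."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def elbow_wrist_positions(alpha1, phi1, delta1, alpha2, a_l, f_l):
    """Elbow and wrist positions (mm) from the kinematic chain of Table A1."""
    A = [dh_matrix(0, 0,    np.pi / 2, alpha1),   # shoulder, XY plane
         dh_matrix(0, 0,    np.pi / 2, phi1),     # shoulder, XZ plane
         dh_matrix(0, a_l, -np.pi / 2, delta1),   # shoulder rotation + arm link
         dh_matrix(0, 0,    np.pi / 2, alpha2),   # elbow flexion
         dh_matrix(0, f_l,  0.0,       0.0)]      # forearm link
    T30 = A[0] @ A[1] @ A[2]        # shoulder-to-elbow transform
    T50 = T30 @ A[3] @ A[4]         # shoulder-to-wrist transform
    return T30[:3, 3], T50[:3, 3]   # (elbow xyz, wrist xyz)
```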

Appendix C. Experiment Detailed Description

Appendix C.1. Classifier

As training data, the signals obtained from the sADS were used, together with the type of movement captured in the interface during the sequential sessions, which was the response of the classifier. Movement types were stored as numeric variables. The Classification Learner tool of the Statistics and Machine Learning Toolbox 11.7 was used to train the different models with supervised machine learning. Five-fold cross-validation was selected, and PCA was enabled with 95% of the explained variance. All 24 methods were trained with 70% of the dataset, with 15% for validation and 15% for testing.
Table A2. Gesture classification training results. The model with the shortest training time and smallest misclassification cost was chosen among the models with the highest accuracy (%).

Model Type | Accuracy (%) | Model Type | Accuracy (%) | Model Type | Accuracy (%)
Fine Tree | 92.3 | Quadratic SVM | 83.9 | Cosine KNN | 95.7
Medium Tree | 77.5 | Cubic SVM | 74.1 | Cubic KNN | 99.7
Coarse Tree | 59.6 | Fine Gaussian SVM | 95.1 | Weighted KNN | 99.9
Linear Discriminant | 65.6 | Medium Gaussian SVM | 86.0 | Boosted Trees | 84.3
Quadratic Discriminant | 76.9 | Coarse Gaussian SVM | 74.8 | Bagged Trees | 99.7
Gaussian Naive Bayes | 61.4 | Fine KNN | 99.9 | Subspace Discriminant | 65.9
Kernel Naive Bayes | 67.8 | Medium KNN | 99.7 | Subspace KNN | 98.3
Linear SVM | 68.5 | Coarse KNN | 94.6 | RUSBoosted Trees | 77.5
Different models could be used. For this study, the fine k-nearest neighbour (k-NN) classifier was selected, which typically has good predictive accuracy in low dimensions, an average prediction speed (≈1 s) and average memory usage (≈4 MB). Another model, such as the Fine Tree, could have been selected if a shorter response time were required (≈0.01 s). The k-NN response is shown in Figure A3, which shows a 99.9% classification score. The values 1 to 5 represent the gestures; the true positive rate (TPR) is 100% for most gestures, and the false negative rate (FNR) is 0.1% for the over-the-shoulder movement.
Figure A3. Confusion matrix from the fine nearest neighbour classifier.

Appendix C.2. Weighting Algorithm

Algorithm 1: Movement Classification
[Algorithm 1 is presented as a figure in the original publication.]
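Since Algorithm 1 itself is only available as a figure, the following is a hypothetical sketch of the weighting step it describes, reusing the fuse_angles function sketched in Appendix B.1: the classified gesture selects a column of Table 7, and each plane is fused with the weighted form of Equation (A11):

```python
# Columns of Table 7, read as (aFactor XY, bFactor XY, aFactor XZ, bFactor XZ).
WEIGHTS = {
    'below_shoulder':       (0.4, 2.2, 2.0, 2.2),
    'over_shoulder':        (2.5, 1.7, 2.5, 2.3),
    'flexion_extension':    (2.3, 1.9, 0.4, 2.7),
    'abduction_adduction':  (1.6, 2.5, 2.5, 1.8),
    'horizontal_adduction': (2.3, 1.0, 1.2, 2.7),
}

def weighted_fusion(upper_xy, upper_xz, back_xy, back_xz, gesture, variances):
    """Fuse the four sADS angles for the gesture found by the classifier."""
    a_xy, b_xy, a_xz, b_xz = WEIGHTS[gesture]
    alpha1 = fuse_angles(upper_xy, back_xz,                      # plane XY
                         variances['upper_xy'], variances['back_xz'],
                         a_xy, b_xy)
    phi1 = fuse_angles(upper_xz, back_xy,                        # plane XZ
                       variances['upper_xz'], variances['back_xy'],
                       a_xz, b_xz)
    return alpha1, phi1
```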

References

  1. Van der Kruk, E.; Reijne, M.M. Accuracy of human motion capture systems for sport applications; state-of-the-art review. Eur. J. Sport Sci. 2018, 18, 806–819. [Google Scholar] [CrossRef] [PubMed]
  2. Rigoni, M.; Gill, S.; Babazadeh, S.; Elsewaisy, O.; Gillies, H.; Nguyen, N.; Pathirana, P.N.; Page, R. Assessment of shoulder range of motion using a wireless inertial motion capture device—A validation study. Sensors 2019, 19, 1781. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  3. Fong, D.T.P.; Chan, Y.Y. The use of wearable inertial motion sensors in human lower limb biomechanics studies: A systematic review. Sensors 2010, 10, 11556–11565. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  4. Patel, S.; Park, H.; Bonato, P.; Chan, L.; Rodgers, M. A review of wearable sensors and systems with application in rehabilitation. J. Neuroeng. Rehabil. 2012, 9, 1–17. [Google Scholar] [CrossRef] [Green Version]
  5. Nag, A.; Mukhopadhyay, S.C.; Kosel, J. Wearable flexible sensors: A review. IEEE Sens. J. 2017, 17, 3949–3960. [Google Scholar] [CrossRef] [Green Version]
  6. Liu, H.; Li, Q.; Zhang, S.; Yin, R.; Liu, X.; He, Y.; Dai, K.; Shan, C.; Guo, J.; Liu, C.; et al. Electrically conductive polymer composites for smart flexible strain sensors: A critical review. J. Mater. Chem. C 2018, 6, 12121–12141. [Google Scholar] [CrossRef]
  7. Carnevale, A.; Longo, U.G.; Schena, E.; Massaroni, C.; Presti, D.L.; Berton, A.; Candela, V.; Denaro, V. Wearable systems for shoulder kinematics assessment: A systematic review. BMC Musculoskelet. Disord. 2019, 20, 546. [Google Scholar] [CrossRef]
  8. Sugimoto, S.I.; Takei, A.; Ogino, M. Finite element analysis with tens of billions of degrees of freedom in a high-frequency electromagnetic field. Mech. Eng. Lett. 2017, 3, 16-00667. [Google Scholar] [CrossRef] [Green Version]
  9. Winter, J.; Allen, T.J.; Proske, U. Muscle spindle signals combine with the sense of effort to indicate limb position. J. Physiol. 2005, 568, 1035–1046. [Google Scholar] [CrossRef]
  10. Xu, K.; Lu, Y.; Takei, K. Multifunctional Skin-Inspired Flexible Sensor Systems for Wearable Electronics. Adv. Mater. Technol. 2019, 4, 1800628. [Google Scholar] [CrossRef] [Green Version]
  11. Lou, Z.; Wang, L.; Jiang, K.; Shen, G. Programmable three-dimensional advanced materials based on nanostructures as building blocks for flexible sensors. Nano Today 2019, 26, 176–198. [Google Scholar] [CrossRef]
  12. Chen, W.; Yan, X. Progress in achieving high-performance piezoresistive and capacitive flexible pressure sensors: A review. J. Mater. Sci. Technol. 2020, 43, 175–188. [Google Scholar] [CrossRef]
  13. Yang, Y.; Gao, W. Wearable and flexible electronics for continuous molecular monitoring. Chem. Soc. Rev. 2019, 48, 1465–1491. [Google Scholar] [CrossRef] [PubMed]
  14. Nguyen, K.D.; Chen, I.M.; Luo, Z.; Yeo, S.H.; Duh, H.B.L. A wearable sensing system for tracking and monitoring of functional arm movement. IEEE/ASME Trans. Mechatron. 2010, 16, 213–220. [Google Scholar] [CrossRef]
  15. Varghese, R.J.; Lo, B.P.L.; Yang, G.Z. Design and prototyping of a bio-inspired kinematic sensing suit for the shoulder joint: Precursor to a multi-dof shoulder exosuit. IEEE Robot. Autom. Lett. 2020, 5, 540–547. [Google Scholar] [CrossRef] [Green Version]
  16. Proske, U.; Gandevia, S.C. The kinaesthetic senses. J. Physiol. 2009, 587, 4139–4146. [Google Scholar] [CrossRef] [PubMed]
  17. Varghese, R.J.; Guo, X.; Freer, D.; Liu, J.; Yang, G.Z. A simulation-based feasibility study of a proprioception-inspired sensing framework for a multi-dof shoulder exosuit. In Proceedings of the 2019 IEEE 16th International Conference on Wearable and Implantable Body Sensor Networks (BSN), Chicago, IL, USA, 19–22 May 2019; pp. 1–4. [Google Scholar]
  18. Xiong, Y.; Tao, X. Compression garments for medical therapy and sports. Polymers 2018, 10, 663. [Google Scholar] [CrossRef] [Green Version]
  19. Samper-Escudero, J.L.; Contreras-González, A.F.; Ferre, M.; Sánchez-Urán, M.A.; Pont-Esteban, D. Efficient Multiaxial Shoulder-Motion Tracking Based on Flexible Resistive Sensors Applied to Exosuits. Soft Robot. 2020, 7, 370–385. [Google Scholar] [CrossRef]
  20. Poitras, I.; Dupuis, F.; Bielmann, M.; Campeau-Lecours, A.; Mercier, C.; Bouyer, L.J.; Roy, J.S. Validity and reliability of wearable sensors for joint angle estimation: A systematic review. Sensors 2019, 19, 1555. [Google Scholar] [CrossRef] [Green Version]
  21. Poitras, I.; Bielmann, M.; Campeau-Lecours, A.; Mercier, C.; Bouyer, L.J.; Roy, J.S. Validity of wearable sensors at the shoulder joint: Combining wireless electromyography sensors and inertial measurement units to perform physical workplace assessments. Sensors 2019, 19, 1885. [Google Scholar] [CrossRef] [Green Version]
  22. Esfahani, M.I.M.; Nussbaum, M.A. A “smart” undershirt for tracking upper body motions: Task classification and angle estimation. IEEE Sens. J. 2018, 18, 7650–7658. [Google Scholar] [CrossRef]
  23. Mengüç, Y.; Park, Y.L.; Pei, H.; Vogt, D.; Aubin, P.M.; Winchell, E.; Fluke, L.; Stirling, L.; Wood, R.J.; Walsh, C.J. Wearable soft sensing suit for human gait measurement. Int. J. Robot. Res. 2014, 33, 1748–1764. [Google Scholar] [CrossRef]
  24. Feng, Y.; Li, Y.; McCoul, D.; Qin, S.; Jin, T.; Huang, B.; Zhao, J. Dynamic Measurement of Legs Motion in Sagittal Plane Based on Soft Wearable Sensors. J. Sens. 2020, 2020, 1–10. [Google Scholar] [CrossRef]
  25. Huang, B.; Li, M.; Mei, T.; McCoul, D.; Qin, S.; Zhao, Z.; Zhao, J. Wearable stretch sensors for motion measurement of the wrist joint based on dielectric elastomers. Sensors 2017, 17, 2708. [Google Scholar] [CrossRef] [Green Version]
  26. Rosen, J.; Perry, J.C.; Manning, N.; Burns, S.; Hannaford, B. The human arm kinematics and dynamics during daily activities-toward a 7 DOF upper limb powered exoskeleton. In Proceedings of the ICAR’05 12th International Conference on Advanced Robotics, Seattle, WA, USA, 18–20 July 2005; pp. 532–539. [Google Scholar]
  27. Zanchettin, A.M.; Rocco, P.; Bascetta, L.; Symeonidis, I.; Peldschus, S. Kinematic motion analysis of the human arm during a manipulation task. In Proceedings of the ISR 2010 (41st International Symposium on Robotics) and ROBOTIK 2010 (6th German Conference on Robotics), Munich, Germany, 7–9 June 2010; pp. 1–6. [Google Scholar]
  28. Contreras-González, A.F.; Samper-Escudero, J.L.; Pont-Esteban, D.; Sáez-Sáez, F.J.; Sánchez-Urán, M.Á.; Ferre, M. Soft-Wearable Device for the Estimation of Shoulder Orientation and Gesture. In International Conference on Human Haptic Sensing and Touch Enabled Computer Applications; Springer: Berlin/Heidelberg, Germany, 2020; pp. 371–379. [Google Scholar]
  29. Reese, S.P. Angular Displacement Sensor of Compliant Material. U.S. Patent 8,941,392, 27 January 2015. [Google Scholar]
  30. Bend Labs. 2-Axis Soft Flex Sensor. Internet Draft. 2020. Available online: https://www.bendlabs.com/products/2-axis-soft-flex-sensor/ (accessed on 22 June 2020). [Google Scholar]
  31. Pont, D.; Contreras, A.F.; Samper, J.L.; Sáez, F.J.; Ferre, M.; Sánchez, M.Á.; Ruiz, R.; García, Á. ExoFlex: An Upper-Limb Cable-Driven Exosuit. In Iberian Robotics Conference; Springer: Berlin/Heidelberg, Germany, 2019; pp. 417–428. [Google Scholar]
  32. OptiTrack. OptiTrack for Movement Sciences. Internet Draft. 2020. Available online: https://optitrack.com/ (accessed on 25 June 2020).
  33. Schmidt, R.; Disselhorst-Klug, C.; Silny, J.; Rau, G. A marker-based measurement procedure for unconstrained wrist and elbow motions. J. Biomech. 1999, 32, 615–621. [Google Scholar] [CrossRef]
  34. Soslowsky, L.; Flatow, E.; Bigliani, L.; Mow, V. Articular geometry of the glenohumeral joint. Clin. Orthop. Relat. Res. 1992, 285, 181–190. [Google Scholar] [CrossRef]
  35. Jin, Y.; Glover, C.M.; Cho, H.; Araromi, O.A.; Graule, M.A.; Li, N.; Wood, R.J.; Walsh, C.J. Soft Sensing Shirt for Shoulder Kinematics Estimation. In Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France, 31 May–31 August 2020; pp. 4863–4869. [Google Scholar]
  36. Lee, H.; Cho, J.; Kim, J. Printable skin adhesive stretch sensor for measuring multi-axis human joint angles. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden, 16–21 May 2016; pp. 4975–4980. [Google Scholar]
  37. Greenspan, B.; Lobo, M.A. Design and Initial Testing of an Affordable and Accessible Smart Compression Garment to Measure Physical Activity Using Conductive Paint Stretch Sensors. Multimodal Technol. Interact. 2020, 4, 45. [Google Scholar] [CrossRef]
  38. Kim, D.; Kwon, J.; Han, S.; Park, Y.L.; Jo, S. Deep full-body motion network for a soft wearable motion sensing suit. IEEE/ASME Trans. Mechatron. 2018, 24, 56–66. [Google Scholar] [CrossRef]
  39. Maybeck, P.S. Stochastic Models, Estimation, and Control; Academic Press: Cambridge, MA, USA, 1982. [Google Scholar]
Figure 1. Different locations for the variance estimation. (a) Four soft angular displacement sensors (sADS) were placed in the recommended intermediate poses. (b) The frontal sADS was eliminated due to its low representation in the variance; it also generated obstructions and poor measurements in the horizontal gestures. The three previous poses were preserved. (c) Placement of two sADS, one in the centre of the two sADS of the top three-sensor array, and the other at the rear, generating an angle of 90° between the two planes with respect to the first.
Figure 2. Gestures that define the work space of the right arm. (a) Rest position. (b) Flexion/extension gesture corresponding to the YZ plane. (c) Horizontal abduction/adduction corresponding to the XY plane. (d) Abduction/adduction gesture corresponding to the XZ plane. (e) Depiction of the rotation of the shoulder, named δ₁.
Figure 3. Location of sensors on the compression jacket. (a) Back view: the reflective markers are shown in bright grey, together with IMU 1, the two sADS on the shoulder (upper and back) and the sADS on the elbow. (b) Front view, showing two reflective markers and IMU 2.
Figure 4. Flow diagram of the data acquisition system. On the left side, the movements of the user, the generated angles and the poses of the markers are shown; in the centre, the acquisition of data and communication between boards as well as the storage of positions of the reflective pointers; on the right side, the data analysis based on the timestamp of each file.
Figure 5. Example of similar gestures in two different takes. (a) First take shows the behaviour of the sensor without the filter. (b) Second take of a similar gesture shows the behaviour with the filter.
Figure 6. Calibration frame top view. (a) Arm pose at point B of the frame to estimate the length of the arm and forearm with a known distance $\overline{AB}$. (b) Gesture performed for the redundant calculation towards point C. (c) The tight bracelet with a magnet that guarantees the same pose of the wrist with respect to the pin on the frame.
Figure 7. Performance of the sADS angles compared with the GTS in a series test.
Table 1. Result of the estimated variance for different sADS arrays.
Type of Sensor | Number of Sensors | Variance
Single-axis resistive | 7 | 95%
Two-axis capacitive | 4 | 93%
Two-axis capacitive | 3 | 92%
Two-axis capacitive | 2 | 92%
Table 2. Merged data are obtained in two different fusions using the shoulder sADS.
Fusion | Upper sADS | Back sADS
Plane XY (α₁) | xy-axis (θ₁) | xz-axis (θ₂)
Plane XZ (ϕ₁) | xz-axis (θ₁) | xy-axis (θ₂)
Table 3. Comparison between the estimation made by the proposed algorithm with the ground truth system (GTS) and the tape measure of the arm and forearm lengths of the subjects (mm).
Subject | Arm: Tape | Arm: Estimated | Arm: GTS | Forearm: Tape | Forearm: Estimated | Forearm: GTS
1 | 300.0 | 311.52 | 310.98 | 250.0 | 246.20 | 246.83
2 | 330.0 | 331.40 | 332.08 | 270.0 | 260.44 | 261.32
3 | 345.0 | 340.78 | 339.12 | 300.0 | 291.83 | 290.55
4 | 310.0 | 312.86 | 311.98 | 260.0 | 272.08 | 273.56
5 | 334.0 | 332.27 | 332.98 | 260.0 | 258.40 | 256.56
6 | 295.0 | 305.89 | 304.63 | 250.0 | 249.32 | 249.12
7 | 330.0 | 328.63 | 326.35 | 270.0 | 277.42 | 278.77
8 | 315.0 | 309.43 | 308.66 | 255.0 | 260.18 | 260.80
9 | 340.0 | 337.82 | 339.63 | 295.0 | 302.81 | 299.68
Table 4. Performance of the proposed frame (RMSE). Comparison between measurements.
Segment | Tape vs. Estimated | Tape vs. GTS | Estimated vs. GTS
Arm | 5.96 mm | 5.87 mm | 1.31 mm
Forearm | 7.18 mm | 7.48 mm | 1.50 mm
Table 5. Orientation error in root-mean-square error (RMSE) obtained by the ground truth system (GTS) when performing the movements and gestures. Elbow orientation error was compared even in gestures that did not involve elbow movement.
Sensor | Cyclic Mov. | Free Mov. | Below the Shoulder | Over the Shoulder | Flexion/Extension | Abd./Add. | Horizontal Adduction | Rotation
Upper sADS | 6.84° | 7.33° | 1.52° | 4.18° | 10.53° | 4.22° | 3.71° | n/a
Back sADS | 4.50° | 5.67° | 2.01° | 8.39° | 2.97° | 1.93° | 9.66° | n/a
IMU | 1.93° | 2.86° | 2.96° | 1.66° | 2.30° | 1.89° | 2.23° | 1.51°
Elbow sADS | 0.43° | 0.93° | 1.73° | 1.53° | 0.54° | 0.38° | 0.78° | n/a
Mean | 3.42° | 4.19° | 2.05° | 3.94° | 4.08° | 2.10° | 4.09° | n/a
Table 6. Error for the estimated position by the shoulder sADS given in RMSE (mm).
Sensor | Elbow: Cyclic Mov. | Elbow: Free Mov. | Elbow: Below the Shoulder | Elbow: Over the Shoulder | Wrist: Cyclic Mov. | Wrist: Free Mov. | Wrist: Below the Shoulder | Wrist: Over the Shoulder
Upper sADS | 39.7 | 61.5 | 31.7 | 46.5 | 59.3 | 79.2 | 58.0 | 49.7
Back sADS | 35.4 | 43.8 | 30.3 | 58.6 | 36.3 | 48.8 | 44.3 | 82.2
Table 7. Weights used in Equation (A11) to reduce the estimation error.
Weight | Below the Shoulder | Over the Shoulder | Flexion/Extension | Abduction/Adduction | Horizontal Adduction
Upper sADS XY → aFactor | 0.4 | 2.5 | 2.3 | 1.6 | 2.3
Back sADS XZ → bFactor | 2.2 | 1.7 | 1.9 | 2.5 | 1.0
Upper sADS XZ → aFactor | 2.0 | 2.5 | 0.4 | 2.5 | 1.2
Back sADS XY → bFactor | 2.2 | 2.3 | 2.7 | 1.8 | 2.7
Table 8. Fusion RMSE for position estimation (mm).
Joint | Cyclic Mov. | Free Mov. | Below the Shoulder | Over the Shoulder
Elbow | 12.3 | 31.6 | 14.1 | 39.5
Wrist | 14.9 | 40.9 | 20.3 | 40.4
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
