Article

Making Sense of Complex Sensor Data Streams

by Rongrong Liu 1,* and Birgitta Dresp-Langley 2,*

1 ICube Lab, Robotics Department, Strasbourg University, 67085 Strasbourg, France
2 ICube Lab, UMR 7357 Centre National de la Recherche Scientifique CNRS, Strasbourg University, 67085 Strasbourg, France
* Authors to whom correspondence should be addressed.
Submission received: 6 May 2021 / Revised: 3 June 2021 / Accepted: 9 June 2021 / Published: 10 June 2021
(This article belongs to the Special Issue Real-Time Data Management and Analytics)

Abstract
This concept paper draws on our previous research on individual grip force data collected from biosensors placed at specific anatomical locations in the dominant and non-dominant hands of operators performing a robot-assisted precision grip task for minimally invasive endoscopic surgery. The specificity of the robotic system on the one hand, and that of the 2D image-guided task performed in a real-world 3D space on the other, constrain the individual hand and finger movements during task performance in a unique way. Our previous work showed task-specific characteristics of operator expertise in terms of specific grip force profiles, which we were able to detect in thousands of highly variable individual data. This concept paper focuses on two complementary data analysis strategies that make such detection possible. In contrast with other sensor data analysis strategies aimed at minimizing variance in the data, deciphering the meaning of intra- and inter-individual variance in the sensor data requires appropriate statistical analyses, as shown in the first part of this paper. We then explain how the computation of individual spatio-temporal grip force profiles permits detecting expertise-specific differences between individual users. It is concluded that the two analytic strategies are complementary and enable drawing meaning from thousands of biosensor data reflecting human performance measures, while fully taking into account their considerable inter- and intra-individual variability.

1. Introduction

Wireless technology provides the ability to communicate or control over distances without wires, cables, or any other electrical conductors, using electromagnetic waves, whose existence was first conclusively demonstrated by the German physicist Heinrich Hertz. After more than a century of continuous development, many types of wireless systems have emerged, such as Infrared, Bluetooth, Wireless-Fidelity (WIFI), Radio Frequency Identification (RFID), and the Global Positioning System (GPS). Presently, wireless technology plays a key role throughout the world, with communication applications in cities [1], public buildings [2], individual houses [3], and cars [4], and on people [5] and animals [6].
Along with advances in wireless technology and other technical innovations in sensor design, electronics, and power management, wireless wearable biosensor systems [7,8] are currently developing rapidly. These systems convert a biological response into electrical measurements using electronic circuits and transmit the detected information remotely, without cables or wires, to a data acquisition platform [9]. Unlike conventional approaches, these devices enable convenient, continuous, unobtrusive, and real-time monitoring and analysis [10] of signals, including chemical signals such as gases and biomolecules, thermal signals such as fever and hypothermia, electrophysiological signals such as brainwaves and cardiac activity, and physical signals such as pressure, motion and, as will be shown in this paper, individual grip force.
Individual grip force distributions necessary to complete functional tasks exhibit functional redundancies and depend on the type of task [11]. For example, when a cylindrical object needs to be manipulated, the middle finger plays an important role in the generation of gross force, especially when the task consists of lifting a heavy object. In spherical grasp patterns, the contribution of the ring and small fingers to the total grip force is important. Individual finger forces have also been found to differ by gender [12] and age [13], indicating changes in the processing of fine motor control tasks with increasing age, presumably caused by the difficulty late middle-aged adults have in rapidly producing any required amount of force.
This paper focuses on data analysis strategies for the specific case of a precision grip task for the control of a robot-assisted surgical platform. The specific characteristics of the hardware and software components of the robotic system are described in full detail in our previous publications [10,14,15,16,17,18]. The robotic surgical system [19] is a prototype, and data are currently available from one highly proficient expert user (about 116 thousand data points), one moderately trained user (153 thousand) and, for comparison, one complete novice (171 thousand).
In the first part, we describe how issues relating to intra- and inter-individual variance in the data need to be dealt with in the specific case of human precision grip force deployment. This involves particular statistical tools and metrics. Data variance in biosensor networks, such as the grip force sensor network in the human hand shown in Figure 1, is directly related to specific functional differences between the locations where the sensors are placed (our own work). It needs to be fully taken into account when interpreting the data in the light of specific task constraints on the one hand, and of the biomechanical constraints directly related to hand and finger movements during grip force deployment on the other. This sheds a radically different light on the analysis of variance in comparison with cases where variance in the sensor data needs to be eliminated or minimized [20].
In the second part, we illustrate a method, described in detail in the data analysis section, that permits detecting individual spatio-temporal profiles in thousands of variable grip force data by focusing on task-specific finger locations. We show how this method helps to understand expertise-specific differences in grip force deployment between a highly proficient operator, a trainee, and a complete novice, and how these anatomically specific, task-relevant grip forces evolve with time and level of training.
The remainder of the paper is organised as follows. In Section 2, the materials and methods are explained, including the experimental platform, the sensor glove, and the experimental design. In Section 3, the results are presented and the statistical analyses of time data, force data, and functionally representative sensors are reported. In Section 4, the findings are discussed and some ideas for future work are proposed.

2. Materials and Methods

2.1. Experimental Platform

The experiment was conducted on a robotic endoscope system called STRAS, which stands for Single access and Transluminal Robotic Assistant for Surgeons [19,21], designed to optimally assist surgeons in minimally invasive procedures. It is built for bi-manual intervention, with an endoscopic camera attached at the distal end of the robotic endoscope filming the users performing the task. During manipulation of the STRAS endoscope system, as shown in Figure 2, the user holds two joystick handles, one in each hand, to move and manipulate the endoscope tools while facing a monitor screen that displays the camera view.

2.2. Sensor Glove Design

A pair of specific wearable wireless sensor gloves was developed [14,15], using inbuilt Force Sensitive Resistors (FSR). Each of the small (5–10 mm diameter) FSRs was soldered to a 10 kΩ pull-down resistor to create a voltage divider. The voltage read by the analog input of the Arduino is given by Equation (1)

$V_{out} = \dfrac{R_{PD} \, V_{3.3}}{R_{PD} + R_{FSR}}$    (1)

where $R_{PD}$ is the resistance of the pull-down resistor, $R_{FSR}$ is the FSR resistance, and $V_{3.3}$ is the 3.3 V supply voltage. FSR resistances can vary from 250 Ω when subjected to 20 Newton (N) to about 10 MΩ when no force is applied at all. The voltage varies monotonically between 0 and 3.22 V as a function of the applied force, which is assumed uniform over the sensor surface. In the experiments here, applied forces did not exceed 10 N, and voltages varied within the range of [0; 1500] mV. The relation between force and voltage is almost linear within this range. It was ensured that all sensors provided similar calibration curves.
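To make the conversion concrete, the following minimal Python sketch (not the authors' code; all names are illustrative) inverts Equation (1) to recover the FSR resistance from a measured divider output voltage:

```python
# Minimal sketch (not the authors' code): invert Equation (1),
# V_out = R_PD * V_3.3 / (R_PD + R_FSR), to recover the FSR
# resistance from a measured divider output voltage.
R_PD = 10_000.0  # pull-down resistor, 10 kOhm
V_CC = 3.3       # supply voltage, volts

def fsr_resistance(v_out: float) -> float:
    """Return R_FSR in ohms for a given divider output voltage in volts."""
    if v_out <= 0.0:
        return float("inf")  # no measurable force: FSR is effectively open
    return R_PD * (V_CC - v_out) / v_out

print(fsr_resistance(1.5))  # 12000.0 ohms at the top of the observed range
```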
These FSRs were first glued inside the glove and then secured by sewing a circular piece of cloth around each of them. On each glove, shown in Figure 3, twelve anatomically relevant FSRs are employed to measure the grip force applied at specific locations on the fingers and in the palm, as illustrated in Figure 1.
The software to acquire grip force data includes two parts: one running on the Arduino Micro board embedded on the gloves, and the other running on the computer for data collection. Figure 4 shows this data acquisition process, taking Sensor 10 (S10) as an example. Powered by a Li-Po battery, the Micro board provides a regulated voltage and acquires the analog voltage output from each FSR sensor, which is merged with time stamps and sensor identification into a data package. This package is then sent wirelessly to the computer via Bluetooth at a frequency of 50 Hz, decoded by the computer software, and saved in an Excel file.
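As an illustration of the receiving side, here is a minimal Python sketch assuming a hypothetical plain-text packet format of "timestamp_ms,sensor_id,voltage_mV" per line; the actual glove protocol and serial port name may differ, and the data are written to a CSV file rather than an Excel file for simplicity:

```python
# Minimal sketch of the receiving side, assuming a hypothetical
# plain-text packet "timestamp_ms,sensor_id,voltage_mV" per line sent
# over a Bluetooth serial link at 50 Hz. Requires pyserial.
import csv
import serial  # pip install pyserial

PORT = "/dev/rfcomm0"  # assumed Bluetooth serial device name

with serial.Serial(PORT, baudrate=115200, timeout=1) as link, \
        open("grip_force_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp_ms", "sensor_id", "voltage_mV"])
    while True:  # stop with Ctrl-C
        line = link.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue  # timeout or empty read
        try:
            ts, sid, mv = line.split(",")
            writer.writerow([int(ts), sid, float(mv)])
        except ValueError:
            pass  # skip malformed packets
```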

2.3. Experimental Design

Equipped with the STRAS platform and the sensor gloves, a pick-and-drop image-guided robot-assisted task was designed for this individual grip force study. The experiment consists of manipulating the STRAS endoscope system with one hand while wearing the sensor glove. The precision grip task is entirely image-guided [10,14,15,16,17,18] and was executed by a highly proficient expert user, a moderately trained user, and a complete novice. The expert is left-handed, while the trained user and the novice are right-handed, as shown in Table 1. The left and right interfaces are identical, and the pick-and-drop task was performed with each hand for ten successive sessions.
The specificity of the precision grip task in this case may, on the one hand, be described by the fact that the degrees of freedom for hand and finger movements are unusually constrained. To operate the cylindrical handles in order to direct the surgical tools, and to open and close the grippers at the tool-ends during the surgical task, only sideways and forward/backward movements of the cylindrical handles are possible for steering the tools in all directions of the three-dimensional workspace. This is radically different from gripping cylindrical objects to lift and/or move them around directly and freely in all possible directions. On the other hand, any surgical task executed with the system, including the simulator task used for training, for which the data exploited in this paper were collected, imposes further constraints in terms of specific task steps. These need to be executed one after the other in a precise order, with the least effort, the greatest precision, and as swiftly as possible.
Essentially, the experiment consists of four critical steps. Figure 5 shows snapshots of these four steps from the video sequence captured by the endoscopic camera. As also described in Table 2, to accomplish this four-step precision grip task, the user first activates and moves the distal tool toward the object by manipulating the handles effectively. During this step, movement in depth within the 2D image plane is required. When the tool arrives at the object location, the grippers are opened to grab and lift the object firmly. In the third step, the user has to move the tool with the object to the target position, during which only lateral movement is needed. In the last step, the user opens the grippers and drops the object [22,23,24,25,26].
Since the task is entirely image-guided, the first task step is the most difficult, especially for the novice, as it requires moving the tool-tips forward in depth along an invisible z-axis in the 2D image plane. Even experts still have difficulty adjusting to this constraint, which can be quite challenging depending on the type of camera and imaging system available to the surgeon on a specific platform. In our case, accomplishing the task steps in optimal task time and with maximal precision (i.e., no tool trajectory corrections, no incidents) requires not only familiarity with the 2D image projection of the three-dimensional task workspace, but also skillful manipulation of the task-relevant (left or right) handle of the robotic system, as pointed out in our previous work [10,14,15,16,17,18].

3. Results and Analysis

As mentioned in the previous section, the pick-and-drop task was performed with each hand of the three users for ten successive sessions; the same task was thus repeated sixty times, yielding 60 time measurements. In addition, 440,412 grip force signals were collected from the twelve sensor locations on the dominant and non-dominant hands of the three users across the ten task sessions, corresponding to an average of 36,701 grip force signals per sensor, as shown in Table 3.
The distinct levels of expertise are consistently reflected by task performance parameters such as average task session times, or the number of task incidents in terms of object drops, misses, and tool-trajectory adjustments during individual performance across task sessions. The left and right system interfaces are identical, and the same task is realized with either hand. Individual grip forces are centrally controlled by the human brain and aimed at optimizing motor performance and control [27,28,29,30,31].

3.1. Time Results and Analysis

As discussed above, the user factor has three levels and the handedness factor has two. A two-way ANalysis Of VAriance (ANOVA) was conducted to assess the effects of these two factors on task execution time. As shown in Table 4, there is an extremely significant difference among the three users, while there is no significant difference between dominant and non-dominant hands. Moreover, the last row shows a significant interaction between the user and handedness factors, which means that the execution times of the different users depend on which hand they are using to accomplish the precision task.
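For readers who wish to reproduce this kind of analysis, the following minimal Python sketch (not the authors' pipeline; file and column names are hypothetical) runs a two-way ANOVA with interaction on long-format time data using pandas and statsmodels, and also computes the per-group means and standard errors reported below in Table 5:

```python
# Minimal sketch, not the authors' pipeline: two-way ANOVA on task
# execution times with user and handedness factors, plus per-group
# descriptive statistics. File and column names are hypothetical.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Long format: one row per task session, columns user, hand, time_s.
times = pd.read_csv("task_times.csv")

# Main effects and their interaction, as in Table 4.
model = ols("time_s ~ C(user) * C(hand)", data=times).fit()
print(sm.stats.anova_lm(model, typ=2))  # F and p per factor and interaction

# Per-group mean and standard error of the mean (SEM), as in Table 5.
print(times.groupby(["user", "hand"])["time_s"].agg(["mean", "sem"]))
```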
Time results across sessions are given in terms of means and their Standard Errors of the Mean (SEM) in Table 5. Since differences between means are difficult to grasp from tables alone, the time results are also represented graphically in Figure 6, where execution times with the dominant and non-dominant hands are shown separately at the top, and the total times of both hands for each session are shown at the bottom, together with the total execution times for each hand.
Based on Table 5 and Figure 6, it is clear that the expert, and the dominant hand of the intermediate user, performed better in terms of time, with smaller means and variations. Moreover, the novice was slower and less stable with his dominant hand, which indicates that, in the extremely constrained environment of this precision task, the novice was more hesitant when manipulating with his dominant hand.

3.2. Force Results and Analysis

The local grip forces deployed at each sensor location vary depending on how skilled a user has become in performing the robot-assisted simulator task accurately. In our work, grip force data were collected from three users with distinctly different levels of task expertise. As mentioned at the beginning of this section, there are many more force data than time data. To start with, the total forces applied to each sensor during the ten sessions by each hand were computed from the original raw data and are given in Table 6 as a general descriptive analysis. According to this table, different force strategies were applied by different users and hands. The novice applied a considerably higher total force compared with the other two users, especially with the dominant hand. Moreover, some sensors were not used during certain maneuvers. For example, the expert did not activate S1, S8 and S11 with his dominant hand, while S2 and S3 were not used with his non-dominant hand.
In the case of the force results, the user factor has three levels, the handedness factor two, and the sensor factor twelve. A three-way ANOVA was conducted to assess the effects of these three factors on the force results. As shown in Table 7, there are extremely significant differences in the force deployed among the three users, between the two hands, and among the twelve sensors, and an extremely significant interaction exists between each pair of factors. The corresponding means and SEM are provided in Table 8. Moreover, the average forces in the different categories are visualized in Figure 7.
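The two-way sketch above extends directly to the force data; a minimal sketch under the same assumptions (a hypothetical long-format table with one row per force sample) fits the main effects and the three two-way interactions reported in Table 7:

```python
# Minimal sketch, not the authors' pipeline: three-way ANOVA on force
# data with user, handedness and sensor factors. File and column names
# are hypothetical; the table is assumed in long format.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

forces = pd.read_csv("grip_forces.csv")  # columns: user, hand, sensor, force_mV

# Main effects plus the three two-way interactions, as in Table 7.
formula = ("force_mV ~ C(user) + C(hand) + C(sensor)"
           " + C(user):C(hand) + C(user):C(sensor) + C(hand):C(sensor)")
model = ols(formula, data=forces).fit()
print(sm.stats.anova_lm(model, typ=2))
```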

3.3. Functionally Representative Sensors Analysis

In this part, the grip force data from three specific sensors, namely S5, S6, and S7, shown in Figure 8, are analyzed further, as they are functionally representative according to earlier studies [30,31]. More specifically, S5, located on the middle phalanx of the middle finger, mostly contributes to gross force deployment, such as lifting heavy objects, but is of little use for precision tasks. By contrast, S7, on the middle phalanx of the small finger, is critically important in the fine grip force control manipulation studied here. Finally, S6, located on the middle phalanx of the ring finger, is among the least important in grip force control across a variety of tasks.
Based on the preceding analyses in this section, the most distinct performance difference was between the dominant hands of the expert and the novice. Therefore, a two-way ANOVA was conducted on the raw grip force data from these two hands in their first and last task sessions, as summarized in Table 9. The statistical comparison reveals a significant interaction between the two-level user and session factors for all three sensors considered here. To provide a more direct comparison, individual spatio-temporal grip force profiles are plotted in terms of the average peak amplitude for every one hundred signals for sensors S5, S6 and S7 in Figure 9, together with the relative durations of each of the four task steps, indicated by the colored lines.
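A minimal sketch of how such a profile can be computed is given below; it assumes the hypothetical long-format table from the previous sketches and averages the local maxima within each consecutive block of one hundred samples. All names are illustrative, not the authors' code:

```python
# Minimal sketch: average peak amplitude per block of one hundred
# consecutive signals for one sensor, one user, one hand.
import numpy as np
import pandas as pd
from scipy.signal import find_peaks

def avg_peak_per_block(signal: np.ndarray, block: int = 100) -> np.ndarray:
    """Mean amplitude of the local maxima in each consecutive block."""
    out = []
    for start in range(0, len(signal) - block + 1, block):
        window = signal[start:start + block]
        peaks, _ = find_peaks(window)  # indices of local maxima
        # Fall back to the block maximum if no interior peak is found.
        out.append(window[peaks].mean() if peaks.size else window.max())
    return np.array(out)

# Hypothetical usage: profile of S7 for the expert's dominant hand.
forces = pd.read_csv("grip_forces.csv")
s7 = forces.query("sensor == 'S7' and user == 'expert' and hand == 'dominant'")
profile = avg_peak_per_block(s7["force_mV"].to_numpy())
```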
One major difference between skilled or proficient operators and beginners concerns the proportion of gross grip force deployed by the middle finger. Novice operators deploy far too much unnecessary, task-irrelevant gross grip force, whereas the expert has learnt to skillfully minimize it [18]. On the other hand, precision grip forces, which are mostly deployed by the small finger and are particularly important in surgical tasks, are generally insufficiently deployed by novices [18]. The ring finger plays no major role in grip force control, and the differences between the ring finger grip force profiles of novices and experts can be expected to be minimal.
The spatio-temporal profile analysis here shows that the novice takes more than twice as long as the expert to accomplish the task, but by the end he achieves a 30% time gain, indicating a considerable temporal training effect. This effect mostly concerns the first critical step of the pick-and-drop task in Figure 9 and becomes clear only in the light of the specific analysis provided here. The functional interpretation of this effect relates to the specificity of the tool movement away from the body in the surgeon's peri-personal space required by this first task step. In this specific image-guided task-user system, grip forces and hand movements are constrained by the limited degrees of freedom of the robotic system on the one hand, and, on the other, the perceptual recovery of physically missing depth information [18], here along a virtual z-axis in the 2D image plane, is necessary. This difficulty results in longer task times and imprecise tool movements, as shown in our previous work [10,14,15,16,17,18].

4. Discussion

In this paper, we conducted spatio-temporal statistical analyses on grip force data collected from three different individuals wearing wireless wearable sensor gloves while manipulating an endoscope platform to accomplish a four-step pick-and-drop precision task. As shown in the previous sections, different users apply significantly different grip force strategies depending on their level of training and task expertise. Grip force monitoring can be run in real time during task execution, which could generate useful information for research on clinical decision making. For example, it could help prevent risks in robot-assisted surgery [32], where excessive grip force may cause tissue damage. In the future, if we have the opportunity to acquire more data, especially from novices, further studies of the training effect and of the development of force patterns could be conducted, which will hopefully deliver insights for the training of junior surgeons [33], the development of exoskeletons [34], the design of rehabilitation robots [35], and the implementation of the tactile internet [36,37,38]. Moreover, we are building a new glove system with more advanced grip sensors [39], which are smaller and more flexible. Additionally, there is a clear need for more research on dynamic grip force measures that take the hand-and-wrist complex into account in dynamic force measurements [40,41]. The hand-and-wrist complex is particularly important in laparoscopic and endoscopic surgical tasks such as the one in this study. The angle of hand-wrist movements directly influences hand and finger grip forces as a function of the diameter of the tool being grasped and manipulated. The grip forces also depend on hand size [42]. Further insights into these functional relationships should be useful for the design of handles that require gripping in specific directions, to reduce the effort needed and to minimize surgeons' fatigue and exertion levels.

Author Contributions

Conceptualization, R.L. and B.D.-L.; methodology, R.L. and B.D.-L.; formal analysis, R.L. and B.D.-L.; investigation, R.L. and B.D.-L.; resources, R.L. and B.D.-L.; data curation, B.D.-L.; writing—original draft preparation, R.L. and B.D.-L.; writing—review and editing, B.D.-L.; visualization, R.L. and B.D.-L.; supervision, B.D.-L.; project administration, B.D.-L.; funding acquisition, B.D.-L. All authors have read and agreed to the published version of the manuscript.

Funding

This research work is part of a project funded by the University of Strasbourg's Initiative D'EXcellence (IDEX).

Data Availability Statement

The raw data from our studies are accessible online by visiting our previously published work as cited.

Acknowledgments

The authors would like to thank Amine M. Falek, Florent Nageotte and Philippe Zanne, for hardware development and data acquisition. The support of the CNRS is also gratefully acknowledged.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
WIFI: Wireless-Fidelity
RFID: Radio Frequency Identification
GPS: Global Positioning System
STRAS: Single access and Transluminal Robotic Assistant for Surgeons
FSR: Force Sensitive Resistor
ANOVA: Analysis of Variance
SEM: Standard Error of the Mean

References

  1. Yaqoob, I.; Hashem, I.A.T.; Mehmood, Y.; Gani, A.; Mokhtar, S.; Guizani, S. Enabling communication technologies for smart cities. IEEE Commun. Mag. 2017, 55, 112–120.
  2. Demirbas, M. Wireless sensor networks for monitoring of large public buildings. Comput. Netw. 2005, 46, 605–634.
  3. Tulenkov, A.; Parkhomenko, A.; Sokolyanskii, A.; Stepanenko, A.; Zalyubovskiy, Y. The features of wireless technologies application for Smart House systems. In Proceedings of the 2018 IEEE 4th International Symposium on Wireless Systems within the International Conferences on Intelligent Data Acquisition and Advanced Computing Systems (IDAACS-SWS), Lviv, Ukraine, 20–21 September 2018; pp. 1–5.
  4. Bi, Z.; Keoleian, G.A.; Lin, Z.; Moore, M.R.; Chen, K.; Song, L.; Zhao, Z. Life cycle assessment and tempo-spatial optimization of deploying dynamic wireless charging technology for electric cars. Transp. Res. Part C Emerg. Technol. 2019, 100, 53–67.
  5. Morris, J.; Mueller, J.; Jones, M.L.; Lippencott, B. Wireless Technology Use and Disability: Results from a National Survey. 2014. Available online: http://scholarworks.csun.edu/handle/10211.3/121967 (accessed on 20 April 2021).
  6. Panicker, J.G.; Azman, M.; Kashyap, R. A LoRa wireless mesh network for wide-area animal tracking. In Proceedings of the 2019 IEEE International Conference on Electrical, Computer and Communication Technologies (ICECCT), Coimbatore, India, 20–22 February 2019; pp. 1–5.
  7. Pantelopoulos, A.; Bourbakis, N.G. A survey on wearable sensor-based systems for health monitoring and prognosis. IEEE Trans. Syst. Man Cybern. Part C 2009, 40, 1–12.
  8. Kim, J.; Campbell, A.S.; de Ávila, B.E.F.; Wang, J. Wearable biosensors for healthcare monitoring. Nat. Biotechnol. 2019, 37, 389–406.
  9. Arefin, M.; Redouté, J.M.; Yuce, M. Wireless biosensors for POC medical applications. In Medical Biosensors for Point of Care (POC) Applications; Elsevier: Amsterdam, The Netherlands, 2017; pp. 151–180.
  10. Dresp-Langley, B. Wearable Sensors for Individual Grip Force Profiling. arXiv 2020, arXiv:2011.05863.
  11. Pylatiuk, C.; Kargov, A.; Schulz, S.; Döderlein, L. Distribution of grip force in three different functional prehension patterns. J. Med. Eng. Technol. 2006, 30, 176–182.
  12. Leyk, D.; Gorges, W.; Ridder, D.; Wunderlich, M.; Rüther, T.; Sievert, A.; Essfeld, D. Hand-grip strength of young men, women and highly trained female athletes. Eur. J. Appl. Physiol. 2007, 99, 415–421.
  13. Vieluf, S.; Godde, B.; Reuter, E.M.; Voelcker-Rehage, C. Age-related differences in finger force control are characterized by reduced force production. Exp. Brain Res. 2013, 224, 107–117.
  14. Batmaz, A.U.; Falek, M.A.; Zorn, L.; Nageotte, F.; Zanne, P.; de Mathelin, M.; Dresp-Langley, B. Novice and expert haptic behaviours while using a robot controlled surgery system. In Proceedings of the 2017 13th IASTED International Conference on Biomedical Engineering (BioMed), Innsbruck, Austria, 20–21 February 2017; pp. 94–99.
  15. Dresp-Langley, B.; Nageotte, F.; Zanne, P.; de Mathelin, M. Correlating grip force signals from multiple sensors highlights prehensile control strategies in a complex task-user system. Bioengineering 2020, 7, 143.
  16. de Mathelin, M.; Nageotte, F.; Zanne, P.; Dresp-Langley, B. Sensors for expert grip force profiling: Towards benchmarking manual control of a robotic device for surgical tool movements. Sensors 2019, 19, 4575.
  17. Liu, R.; Nageotte, F.; Zanne, P.; de Mathelin, M.; Dresp-Langley, B. Wearable Sensors for Spatio-Temporal Grip Force Profiling. arXiv 2021, arXiv:2101.06479.
  18. Liu, R.; Nageotte, F.; Zanne, P.; de Mathelin, M.; Dresp-Langley, B. Wearable Wireless Biosensors for Spatiotemporal Grip Force Profiling in Real Time. In Engineering Proceedings; Multidisciplinary Digital Publishing Institute: Basel, Switzerland, 2020; Volume 2, p. 45.
  19. De Donno, A.; Zorn, L.; Zanne, P.; Nageotte, F.; de Mathelin, M. Introducing STRAS: A new flexible robotic system for minimally invasive surgery. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; pp. 1213–1220.
  20. Mahmood, A.; Shi, K.; Khatoon, S.; Xiao, M. Data mining techniques for wireless sensor networks: A survey. Int. J. Distrib. Sens. Netw. 2013, 9, 406316.
  21. Zorn, L.; Nageotte, F.; Zanne, P.; Legner, A.; Dallemagne, B.; Marescaux, J.; de Mathelin, M. A novel telemanipulated robotic assistant for surgical endoscopy: Preclinical application to ESD. IEEE Trans. Biomed. Eng. 2017, 65, 797–808.
  22. Batmaz, A.U.; de Mathelin, M.; Dresp-Langley, B. Getting nowhere fast: Trade-off between speed and precision in training to execute image-guided hand-tool movements. BMC Psychol. 2016, 4, 1–19.
  23. Dresp, B. Local brightness mechanisms sketch out surfaces but do not fill them in: Psychophysical evidence in the Kanizsa square. Percept. Psychophys. 1992, 52, 562–570.
  24. Batmaz, A.U.; de Mathelin, M.; Dresp-Langley, B. Seeing virtual while acting real: Visual display and strategy effects on the time and precision of eye-hand coordination. PLoS ONE 2017, 12, e0183789.
  25. Batmaz, A.U.; de Mathelin, M.; Dresp-Langley, B. Effects of 2D and 3D image views on hand movement trajectories in the surgeon's peri-personal space in a computer controlled simulator environment. Cogent Med. 2018, 5, 1426232.
  26. Dresp-Langley, B. Towards Expert-Based Speed–Precision Control in Early Simulator Training for Novice Surgeons. Information 2018, 9, 316.
  27. Johansson, R.S.; Cole, K.J. Sensory-motor coordination during grasping and manipulative actions. Curr. Opin. Neurobiol. 1992, 2, 815–823.
  28. Zatsiorsky, V.M.; Latash, M.L. Multifinger prehension: An overview. J. Mot. Behav. 2008, 40, 446–476.
  29. Kinoshita, H.; Murase, T.; Bandou, T. Grip posture and forces during holding cylindrical objects with circular grips. Ergonomics 1996, 39, 1163–1176.
  30. Kinoshita, H.; Kawai, S.; Ikuta, K. Contributions and co-ordination of individual fingers in multiple finger prehension. Ergonomics 1995, 38, 1212–1230.
  31. Latash, M.L.; Zatsiorsky, V.M. Multi-finger prehension: Control of a redundant mechanical system. In Progress in Motor Control; Springer: Berlin/Heidelberg, Germany, 2009; pp. 597–618.
  32. Abiri, A.; Pensa, J.; Tao, A.; Ma, J.; Juo, Y.Y.; Askari, S.J.; Bisley, J.; Rosen, J.; Dutson, E.P.; Grundfest, W.S. Multi-modal haptic feedback for grip force reduction in robotic surgery. Sci. Rep. 2019, 9, 1–10.
  33. Cundy, T.P.; Thangaraj, E.; Rafii-Tari, H.; Payne, C.J.; Azzie, G.; Sodergren, M.H.; Yang, G.Z.; Darzi, A. Force-Sensing Enhanced Simulation Environment (ForSense) for laparoscopic surgery training and assessment. Surgery 2015, 157, 723–731.
  34. Glowinski, S.; Obst, M.; Majdanik, S.; Potocka-Banaś, B. Dynamic Model of a Humanoid Exoskeleton of a Lower Limb with Hydraulic Actuators. Sensors 2021, 21, 3432.
  35. Chen, M.; Ho, S.; Zhou, H.F.; Pang, P.; Hu, X.; Ng, D.; Tong, K. Interactive rehabilitation robot for hand function training. In Proceedings of the 2009 IEEE International Conference on Rehabilitation Robotics, Kyoto, Japan, 23–26 June 2009; pp. 777–780.
  36. Dohler, M.; Mahmoodi, T.; Lema, M.A.; Condoluci, M.; Sardis, F.; Antonakoglou, K.; Aghvami, H. Internet of skills, where robotics meets AI, 5G and the Tactile Internet. In Proceedings of the 2017 European Conference on Networks and Communications (EuCNC), Oulu, Finland, 12–15 June 2017; pp. 1–5.
  37. Fitzek, F.H.; Li, S.C.; Speidel, S.; Strufe, T.; Simsek, M.; Reisslein, M. Tactile Internet: With Human-in-the-Loop; Academic Press: Cambridge, MA, USA, 2021.
  38. Gupta, R.; Tanwar, S.; Tyagi, S.; Kumar, N. Tactile internet and its applications in 5G era: A comprehensive review. Int. J. Commun. Syst. 2019, 32, e3981.
  39. Sundaram, S.; Kellnhofer, P.; Li, Y.; Zhu, J.Y.; Torralba, A.; Matusik, W. Learning the signatures of the human grasp using a scalable tactile glove. Nature 2019, 569, 698–702.
  40. Morse, J.L.; Jung, M.C.; Bashford, G.R.; Hallbeck, M.S. Maximal dynamic grip force and wrist torque: The effects of gender, exertion direction, angular velocity, and wrist angle. Appl. Ergon. 2006, 37, 737–742.
  41. Dumont, C.E.; Popovic, M.R.; Keller, T.; Sheikh, R. Dynamic force-sharing in multi-digit task. Clin. Biomech. 2006, 21, 138–146.
  42. Edgren, C.S.; Radwin, R.G.; Irwin, C.B. Grip force vectors for varying handle diameters and hand sizes. Hum. Factors 2004, 46, 244–251.
Figure 1. Sensor locations [17].
Figure 2. STRAS endoscope manipulation.
Figure 3. Glove surface.
Figure 4. Diagram of data acquisition system [16].
Figure 5. Snapshot views of the four successive steps [15].
Figure 6. Execution time for three users with dominant and non-dominant hands among ten sessions.
Figure 7. Force results for three users with dominant and non-dominant hands among ten sessions.
Figure 8. Three functionally representative sensors [17].
Figure 9. Average peak of three sensors on dominant hand of expert and novice for the first and last session [18].
Table 1. User information.

User | Dominant Hand
Proficient expert | Left
Intermediate user | Right
Complete novice | Right
Table 2. Four-step pick-and-drop task.

Step | Description
1 | Activate and move tool towards object location
2 | Open and close grippers to grasp and lift object
3 | Move tool with object to target location
4 | Open grippers to drop object in box
Table 3. Number of grip force signals for each sensor.

User | Dominant | Non-Dominant
Expert | 4442 | 5244
Intermediate user | 5974 | 6764
Novice | 7780 | 6497
Table 4. Results from the two-way ANOVA on time data as a function of user and handedness.

Source of Variation | Degrees of Freedom | F | P
User | 2 | 15.65 | <0.001
Handedness | 1 | 0.09 | Not significant
User × Handedness | 2 | 4.13 | <0.05
Table 5. Mean and SEM of task execution times across sessions.

User | Handedness | Mean (s) | SEM (s)
Expert | Dominant | 8.88 | 0.36
Expert | Non-dominant | 10.49 | 0.49
Intermediate | Dominant | 11.95 | 0.49
Intermediate | Non-dominant | 13.53 | 0.66
Novice | Dominant | 15.56 | 1.60
Novice | Non-dominant | 12.99 | 0.75
Table 6. Total force across sessions for each sensor (V).

Sensor | Expert_D | Expert_N | Interm_D | Interm_N | Novice_D | Novice_N
1 | 0 | 3.40 | 0 | 0.00 | 0 | 0
2 | 6.23 | 0 | 9.16 | 58.67 | 193.23 | 447.29
3 | 10.96 | 0 | 0 | 0 | 5328.26 | 0
4 | 9.03 | 46.90 | 37.50 | 0.60 | 6.07 | 0
5 | 437.13 | 1811.11 | 2901.79 | 283.75 | 5946.81 | 1926.19
6 | 2009.06 | 1895.47 | 3327.50 | 3520.78 | 3915.37 | 6910.31
7 | 2607.76 | 115.71 | 60.63 | 3638.08 | 664.06 | 3420.98
8 | 0 | 487.50 | 1064.15 | 38.27 | 5022.85 | 1512.46
9 | 2.50 | 534.02 | 786.74 | 0 | 8838.14 | 0.66
10 | 2106.27 | 900.44 | 489.66 | 499.17 | 5062.52 | 3246.43
11 | 0 | 3966.70 | 0 | 0 | 6842.42 | 0
12 | 5.15 | 2242.13 | 1593.11 | 0 | 6585.59 | 2.71
Total | 7194.10 | 12,003.39 | 10,270.23 | 8039.33 | 48,405.31 | 17,467.03
Table 7. Results from the three-way ANOVA on force data as a function of user, handedness and sensor.

Source of Variation | Degrees of Freedom | F | P
User | 2 | 107.72 | <0.001
Handedness | 1 | 38.27 | <0.001
Sensor | 11 | 41.88 | <0.001
User × Handedness | 2 | 47.83 | <0.001
User × Sensor | 22 | 7.41 | <0.001
Handedness × Sensor | 11 | 9.85 | <0.001
Table 8. Force mean and SEM for all sensors.

Factor | Level | Mean (mV) | SEM (mV)
User | Expert | 161.96 | 15.85
User | Intermediate | 120.29 | 13.10
User | Novice | 370.55 | 25.49
Handedness | Dominant | 263.74 | 17.93
Handedness | Non-dominant | 171.46 | 14.48
Sensor | S1 | 0.12 | 0.07
Sensor | S2 | 17.60 | 3.60
Sensor | S3 | 115.70 | 34.08
Sensor | S4 | 3.05 | 0.54
Sensor | S5 | 337.63 | 32.69
Sensor | S6 | 575.04 | 30.21
Sensor | S7 | 294.61 | 33.76
Sensor | S8 | 193.40 | 31.19
Sensor | S9 | 223.82 | 54.57
Sensor | S10 | 320.96 | 34.51
Sensor | S11 | 267.99 | 56.54
Sensor | S12 | 261.28 | 44.41
Table 9. Two-way ANOVA on three sensors on the dominant hands of the expert and the novice for the first and last sessions (mV).

Sensor | Session | Expert (Mean/SEM) | Novice (Mean/SEM) | Interaction
S5 | First | 240.37/4.56 | 790.00/3.02 | F(1,3120) = 169.39; p < 0.001
S5 | Last | 48.32/0.36 | 691.72/2.19 |
S6 | First | 575.63/4.51 | 504.12/2.42 | F(1,3120) = 394.24; p < 0.001
S6 | Last | 473.98/5.17 | 540.30/2.23 |
S7 | First | 594.02/3.41 | 110.82/0.75 | F(1,3120) = 260.72; p < 0.001
S7 | Last | 608.51/2.38 | 72.90/0.61 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
