Review

Progress in the Triboelectric Human–Machine Interfaces (HMIs)-Moving from Smart Gloves to AI/Haptic Enabled HMI in the 5G/IoT Era

1 Department of Electrical & Computer Engineering, National University of Singapore, Singapore 117576, Singapore
2 Center for Intelligent Sensors and MEMS (CISM), National University of Singapore, Singapore 117608, Singapore
3 Singapore Institute of Manufacturing Technology and National University of Singapore (SIMTech-NUS) Joint Laboratory on Large-Area Flexible Hybrid Electronics, National University of Singapore, Singapore 117576, Singapore
4 NUS Graduate School—Integrative Sciences and Engineering Program (ISEP), National University of Singapore, Singapore 119077, Singapore
* Author to whom correspondence should be addressed.
Submission received: 28 August 2021 / Revised: 12 September 2021 / Accepted: 15 September 2021 / Published: 19 September 2021
(This article belongs to the Special Issue Recent Advances in Nanogenerators)

Abstract

Entering the 5G and internet of things (IoT) era, human–machine interfaces (HMIs) capable of providing humans with more intuitive interaction with the digitalized world have experienced flourishing development in the past few years. Although advanced sensing techniques based on complementary metal-oxide-semiconductor (CMOS) or microelectromechanical system (MEMS) solutions, e.g., camera, microphone, inertial measurement unit (IMU), etc., and flexible solutions, e.g., stretchable conductor, optical fiber, etc., have been widely utilized as sensing components for wearable/non-wearable HMI development, the relatively high power consumption of these sensors remains a concern, especially in wearable/portable scenarios. Recent progress on triboelectric nanogenerator (TENG) self-powered sensors provides a new possibility for realizing low-power/self-sustainable HMIs by directly converting biomechanical energies into valuable sensory information. Leveraging the advantages of wide material choices and diversified structural designs, TENGs have been successfully developed into various forms of HMIs, including gloves, glasses, touchpads, exoskeletons, electronic skin, etc., for diverse applications, e.g., collaborative operation, personal healthcare, robot perception, smart home, etc. With the evolving artificial intelligence (AI) and haptic feedback technologies, more advanced HMIs could be realized towards intelligent and immersive human–machine interactions. Hence, in this review, we systematically introduce current TENG HMIs in terms of different application scenarios, i.e., wearable, robot-related and smart home, and the prospective future development enabled by AI/haptic-feedback technology. Discussion on implementing self-sustainable/zero-power/passive HMIs in this 5G/IoT era and our perspectives are also provided.

1. Introduction

With the rapid development of the Internet of things (IoT) and 5G communication technology in recent years, human–machine interfaces (HMIs) have gradually evolved from traditional computer peripherals, e.g., the keyboard, mouse, and joystick as illustrated in Figure 1, to more intuitive interfaces that directly collect signals originating from the human body [1,2,3], such as voice and basic body motions, providing users with a more intuitive and easier interaction with computers and intelligent robots in the applications of healthcare, rehabilitation, industrial automation, smart home, virtual reality (VR) game control, etc. [3,4,5]. Current commercialized advanced HMIs include non-wearable solutions based on voice and vision recognition [6,7,8], and wearable solutions that use inertial measurement units (IMUs) [9,10], i.e., microelectromechanical system (MEMS) based accelerometers and gyroscopes, for somatosensory information collection. However, these kinds of HMIs require complicated sensory information and high-performance signal collection/processing units, resulting in high power consumption in the system. Additionally, the MEMS-based sensing components are commonly bulky and rigid [11], making them relatively incompatible with wearable scenarios. To address these issues, stretchable and flexible HMI solutions with minimalistic sensor designs have emerged recently, including sensors based on resistive [12,13,14,15,16,17,18,19,20], capacitive [21,22,23,24], and optical fiber [25,26,27] sensing mechanisms, etc. Although the signal complexity and processing cost are greatly reduced to save energy and enhance the timeliness of the system, these sensors themselves still require a small amount of driving energy, and the power consumption may become large considering the massive number of sensor nodes in a sensor network. Moreover, repetitive charging is inconvenient, especially for wearable or portable HMIs. Hence, self-powered sensors based on piezoelectric [28,29] and triboelectric [30,31] nanogenerator technologies have been actively developed to build low-power/self-sustainable human–machine interactive systems.
The triboelectric nanogenerator (TENG), first reported by Prof. Zhong Lin Wang in 2012 [48], has been widely developed as an energy harvester for mechanical energy scavenging, ranging from natural wind energy [49,50,51,52,53] and blue energy [54,55,56,57,58,59] to the human body's biomechanical energy [37,43,60,61,62,63,64], thanks to its exceptional merits of good output performance, wide material choices, good scalability, simple fabrication and low cost. Since the kinetic energies generated from daily human activities, e.g., hand motion, joint rotation, foot motion, etc., contain valuable information about the corresponding motions, such motion-induced energies can be collected by nanogenerators towards a fully self-powered sensing strategy for human–machine interaction as well as health status monitoring [65]. Compared with piezoelectric-based sensors, which are commonly difficult to customize due to the limitation of materials and the complexity of the fabrication process [66,67,68,69], TENGs offer wide choices of stretchable and flexible materials, e.g., fabric, silicone rubber, plastic thin film, etc., and versatile operation modes, i.e., contact-separation mode, lateral-sliding mode, single-electrode mode and freestanding triboelectric-layer mode [70,71]. Therefore, TENGs have been successfully designed into various structures for different interactions (Figure 1), such as touchpad interfaces [35,36,37,38,41,72], auditory-based interfaces [39,73,74], 3D motion manipulators [33,40,42], etc., and can be further designed as self-powered wearable HMIs, e.g., electronic skin (e-skin) [43,75,76,77], data gloves [32,44], wearable bands [45,46], intelligent socks [78,79], breath-driven masks [80], etc., for advanced robotic manipulation, IoT control, VR game control/rehabilitation, personal identification and advanced sports analysis, showing the wide application prospects of triboelectric technology in the HMI area.
The new era of artificial intelligence (AI) provides a new possibility to enhance the functionalities of HMIs via machine learning (ML) enabled data analytics [81,82], where the subtle features hidden behind the real-time signal spectrum can be extracted automatically towards more advanced human–machine interactions, e.g., gesture recognition [83], voice recognition [84], pose estimation [85], personal identification [86], object classification [87,88], etc. Owing to these benefits, combining TENGs with ML-enabled analytics represents a promising research direction for developing HMIs with enhanced intelligence and low power consumption, which has attracted great attention in the past few years [73,89,90,91,92,93]. Moreover, integrating TENGs with other sensing mechanisms, such as piezoresistive [76,94], pyroelectric [95,96,97,98,99], etc., to implement a multimodal sensing system capable of perceiving different sensory information simultaneously, i.e., tactile, strain, temperature, etc., is also a good strategy to broaden the applications of TENG-based HMIs. In addition, to build a complete human–machine interaction system, the haptic feedback function enabled by mechanical actuators [100,101], i.e., wire actuators, pneumatic actuators, vibration motors, etc., is indispensable to provide users with a more immersive experience for specific applications, e.g., robotic collaborative operation [102,103], VR game manipulation [104,105,106,107,108], etc., and deserves to be further integrated into current TENG-based HMI solutions to boost the capability of information interpretation. Furthermore, by integrating self-powered TENG sensors with energy harvesters or passive wireless techniques, e.g., near-field communication (NFC) [109], surface acoustic wave (SAW) [110], etc., battery-free systems can also be achieved towards fully self-sustainable/zero-power HMI terminals under the future IoT framework.
In this review, we systematically introduce the recent progress in TENG-based HMIs in the following sections: (1) glove-based HMIs for advanced manipulation, gesture recognition and tactile sensing; (2) wearable HMIs for other biomechanical signal collection, e.g., eye motion, facial expressions, voice, posture, etc.; self-powered HMIs for (3) robotic perception and (4) smart home applications; (5) ML-enabled intelligent HMIs; and the possible future research directions enabled by (6) haptic-feedback technology and towards (7) self-sustainable/zero-power/passive HMI terminals. Finally, current issues and the potential development trends for TENG-based HMIs are also provided to guide future research in this 5G/IoT era.

2. Glove-Based HMIs

Nearly all conventional HMIs are inseparable from the fine operation of our hands: the keyboard needs ten-finger tapping, the mouse needs to be moved and clicked by the hand, the gamepad needs the fingers to press buttons and operate the joystick, etc. Compared with other parts of the body, the hands and fingers occupy a disproportionately large share of the sensory and motor nerve areas [111,112], enabling each finger to bend and stretch independently to a certain extent and to spread apart or draw together laterally. This degree of flexibility enables them to complete quite complex gestures to achieve diversified operations. A glove, as a common wearable item, is therefore quite suitable to be further designed into a glove-based HMI by integrating sensors for highly sensitive finger motion tracking. One of the most mature sensing techniques for data gloves is to use IMUs [113,114], consisting of accelerometers and gyroscopes. However, the rigid sensing elements, complex data format and relatively high energy consumption remain concerns. Other flexible solutions based on resistive [13,14,15,115,116,117], optical fiber [25,27], etc., sensing also reveal their own drawbacks, e.g., temperature effects, limited sensing range, etc. Thus, the emerging triboelectric sensing technology, with the advantages of diversified material selection, extremely simplified design and self-powered characteristics, provides a new research direction for designing the next generation of low-power data gloves [118,119,120,121,122,123,124].
He et al. proposed a glove-based HMI using a TENG textile as the sensing unit with a minimalist design, as shown in Figure 2a [125]. The conducting polymer poly(3,4-ethylenedioxythiophene):poly(styrenesulfonate) (PEDOT:PSS), owing to its good physical and chemical stability, was chosen as the coating material to fabricate the highly stretchable electrodes as well as the positive triboelectric parts in this fabric-based contact-separation-mode TENG sensor. Additionally, a layer of silicone rubber thin film was coated on the textile glove, serving as the negative triboelectric contacting layer. There are two kinds of sensor configurations in this design, where the arch-shaped sensor is utilized to measure the finger bending motions, while the sensors mounted at the sides of the fingers detect contacts between the index finger and its adjacent fingers. Based on these two configurations, the movement of the index finger in all directions can be recognized to realize intuitive control of an aircraft minigame in cyberspace and car/drone movement in real 3D space. Moreover, the function of a mouse can also be mimicked by this glove-based HMI for web surfing and alphabet writing, providing a simpler, power-compatible interactive method in daily life. However, in practical applications of such a textile glove, the humidity in the environment or the sweat generated from the human body may negatively affect the triboelectric output [126,127]. To solve this issue, Wen et al. reported a more advanced design in which a facile carbon nanotubes/thermoplastic elastomer (CNTs/TPE) coating approach renders the triboelectric textile superhydrophobic for performance improvement, as shown in Figure 2b [128]. The anti-sweat capability enables the TENG sensor to retain 80% of its voltage output even after 1 h of exercise. By leveraging machine learning technology, recognition of complex gestures could be realized with minimal sweat effect, which was successfully demonstrated for real-time VR/AR control applications, i.e., shooting games, baseball pitching and floral arrangement.
The two abovementioned works are both based on arch-shaped TENG strain sensors [129,130], meaning that a large air gap needs to be reserved between the two contact layers and the sensing range is relatively limited: an obvious sensor response only occurs at the moment of contact and separation of the friction layers. To address this problem, Zhou et al. proposed a TENG strain sensor based on a unique yarn structure as illustrated in Figure 2c [131]. The core of the sensing unit is composed of a conductive yarn coiled around a rubber microfiber, with the entire body sheathed by a polydimethylsiloxane (PDMS) sleeve. Varying degrees of deformation result in a continuous change in the contact area between the PDMS sleeve and the coiled conductive yarn, endowing the sensor with good linearity and sensitivity within a large strain range (20–90%). After integrating a wireless printed circuit board (PCB) for signal collection, processing and transmission, a wearable sign-to-speech translation system could be achieved with a multi-class support vector machine (SVM) machine-learning algorithm, whose overall accuracy could be maintained above 98.63% with a fast response time (<1 s), showing a cost-effective approach for assisted communication between signers and non-signers, as well as the prospect of TENG-based HMIs in the field of healthcare.
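As a rough illustration of how such a multi-class SVM stage can sit on top of the glove signals, the following Python sketch trains an RBF-kernel SVM on per-gesture feature vectors. The feature choice, array sizes and random placeholder data are assumptions for illustration only, not the authors' actual pipeline.

```python
# Minimal sketch of a multi-class SVM gesture classifier (assumed pipeline).
# Real feature vectors (e.g., per-channel peak amplitude, pulse width) would
# be extracted from the yarn TENG signals; random data is used as a stand-in.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_gestures, samples_per_gesture, n_features = 10, 50, 16     # hypothetical sizes
X = rng.normal(size=(n_gestures * samples_per_gesture, n_features))
y = np.repeat(np.arange(n_gestures), samples_per_gesture)    # gesture labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_train, y_train)
print("gesture accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```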
To further enrich the function of the glove-based HMI, Zhu et al. integrated TENG-based finger bending sensors, palm sliding sensors, and piezoelectric mechanical stimulators onto one 3D-printed glove, realizing multidirectional bending sensing, sliding event detection, and haptic stimulation simultaneously for augmented AR/VR experiences (Figure 2d) [132]. By attaching multiple elastomer-based TENG tactile sensors onto different joints of each finger, the motions of each phalanx with multiple degrees of freedom (DOFs) can be perceived, where these sensors provide more subtle and useful features regarding finger bending compared with other common solutions that install one sensor node per finger [136,137,138,139]. Meanwhile, the sensory information of the normal and shear force can also be collected by the TENG palm sliding sensor to realize diversified sensing, especially for grasping-related tasks, revealing a new possibility of a multifunctional HMI solution based on TENGs for VR entertainment and training applications.
Though these works based on contact-separation-mode TENG strain sensors have shown the great potential of TENG-enabled glove-based HMIs as effective gesture interfaces, limitations remain and need to be further addressed. One of the main issues is that most of these studies use the amplitudes of the generated output peaks to judge the bending degree [120,140], which is unstable and easily influenced by varying environmental factors, e.g., humidity. Due to the fast decline of the stimuli-induced electrical states caused by electrostatic equilibrium, the generated pulse-like signals can only reflect the momentary motion of finger bending. Other valuable information, e.g., the bending speed, the degree at a certain moment during the entire bending movement, etc., is lost in such a signal format. One possible solution is to use a grating-sliding-mode TENG to better quantify the bending degree/speed by counting the number of generated output peaks. A joint motion TENG quantization sensor for the robotic collaborative operation application was developed by Pu et al. as illustrated in Figure 2e [133]. When the slider is driven by the finger to slide forward/backward across the well-designed interdigitated electrodes, an alternating output signal consisting of a series of periodic narrow pulses is generated, where the bending degree and speed can be easily distinguished from the pulse number and width, respectively. The minimum resolution of the TENG joint sensor can reach 3.8° and could be further improved with finer grating segments. With this measurement method, real-time continuous robot bending control can be realized with high precision, demonstrating a more stable and reliable TENG-based strain/displacement sensor with strong tolerance to environmental interference.
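The pulse-counting readout can be captured in a few lines. The sketch below, assuming a sampling rate, peak threshold and synthetic waveform chosen purely for illustration, converts detected pulses into a bending angle and speed using the 3.8° per-pulse resolution quoted above.

```python
# Sketch: estimating bending angle and speed by counting grating-sliding TENG
# pulses. Sampling rate, threshold and the synthetic waveform are assumptions;
# the 3.8 degrees-per-pulse resolution follows the figure quoted in the text.
import numpy as np
from scipy.signal import find_peaks

FS = 1000.0               # sampling rate in Hz (assumed)
DEG_PER_PULSE = 3.8       # angular resolution per grating segment

def angle_and_speed(signal, fs=FS):
    peaks, _ = find_peaks(signal, height=0.5, distance=int(0.01 * fs))
    angle = DEG_PER_PULSE * len(peaks)                        # accumulated angle
    if len(peaks) > 1:
        duration = (peaks[-1] - peaks[0]) / fs                # time spanned by pulses
        speed = DEG_PER_PULSE * (len(peaks) - 1) / duration   # deg/s
    else:
        speed = 0.0
    return angle, speed

# Synthetic test: 12 narrow pulses spread over ~0.55 s -> about 45.6 degrees.
t = np.arange(0, 1.0, 1 / FS)
sig = np.zeros_like(t)
for k in range(12):
    sig[int((0.10 + 0.05 * k) * FS)] = 1.0
print(angle_and_speed(sig))
```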
Besides the strain sensors for finger bending monitoring, tactile sensory information also plays a key role in glove-based HMIs, mimicking the biological perception system of human skin to provide more complete anthropomorphic feedback in contact force detection, roughness recognition, as well as temperature sensing applications [141,142,143]. Due to their high sensitivity and fast response to tiny stimuli, TENGs have been frequently investigated to simulate the fast adapting (FA) sensory cells of the skin that respond to dynamic touch and vibration [144,145], and can be further integrated with other traditional sensors based on resistance or capacitance that mimic the slow adapting (SA) cells owing to their signal maintenance capability, to form a multimodal tactile sensory system [76,146]. As depicted in Figure 2f, an electronic skin that simultaneously perceives the lateral and vertical movements of the fingertip during grasping tasks was reported by Chen et al. [94]. In this design, the carbon nanotube-polydimethylsiloxane (CNT-PDMS) electrode layer works as a freestanding TENG sensor for roughness differentiation according to the output peaks generated when the device slides across the object surface, where rougher objects commonly contribute more peaks during the sliding motion. The porous CNT-PDMS layer serves as the static pressure sensor based on the piezoresistive mechanism, making the tactile HMI also capable of real-time status monitoring, e.g., holding or releasing, during the grasping/tapping operation process. A similar but more advanced multifunctional tactile sensor was proposed by Wang et al. as shown in Figure 2g [134]. The main difference between this device and previous tactile designs lies in the functionality of the TENG layer. The electrification layer made of hydrophobic polytetrafluoroethylene (PTFE) film is utilized here for material identification based on the varying electron affinities of different materials [147], and 10 different flat materials were successfully identified with a simple lookup-table algorithm. Moreover, apart from the pressure sensing ability brought by the piezoresistive property, the sponge-like graphene/polydimethylsiloxane (PDMS) composite also shows thermoelectric effects, which can be used for detecting the temperature of contacted objects with a high resolution of 1 kelvin, revealing the possibility to further enhance the diversity of functions for TENG-integrated tactile HMIs.
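A lookup-table readout of this kind is conceptually simple; the sketch below illustrates the idea with hypothetical calibration voltages (in practice, the values would come from calibrating the PTFE layer against known materials).

```python
# Sketch of lookup-table material identification from the triboelectric peak
# output against a PTFE electrification layer. The calibration voltages below
# are placeholders, not measured values from the cited work.
CALIBRATION_V = {
    "nylon": 4.2, "paper": 3.1, "wood": 2.4,
    "PET": 1.8, "PDMS": 1.1, "PTFE": 0.2,
}

def identify_material(peak_voltage: float) -> str:
    # Nearest-entry lookup; a practical system would also reject readings
    # that are far from every calibrated value.
    return min(CALIBRATION_V, key=lambda m: abs(CALIBRATION_V[m] - peak_voltage))

print(identify_material(2.5))   # -> "wood" under these placeholder values
```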
Limited by the pulse-like output signals mentioned above, the TENG-based tactile sensing units developed in most works are used for dynamic touch/vibration sensing and cannot detect continuous force variation due to the fast shift of electrical states. In this case, combining them with resistive/capacitive sensors becomes necessary, and a fully self-powered sensing system cannot be achieved. In fact, the shift of electrical states can be suppressed by using a high-impedance readout circuit to maintain the triboelectric output [148,149]; however, a complicated sensing system with an amplifying circuit for small current collection is then needed. To address this issue, Dong et al. proposed a strategy of using robust nanophotonic aluminum nitride (AlN) modulators to read the TENG output, as shown in Figure 2h [135]. The TENG sensor can work in the open-circuit condition with negligible charge flow due to the electrically capacitive nature of AlN modulators, and the stimuli-induced triboelectric voltage can be transformed into the AlN modulators' optical output based on the electro-optic Pockels effect. Owing to the negligible charge flow and the high-speed optical information carrier, continuous force sensing with good linearity and stability was successfully achieved by the TENG tactile sensor with nanophotonic readout circuits, demonstrating the potential to replace resistive/capacitive sensors as SA sensing units in anthropomorphic skin and to realize a fully TENG-based self-powered multifunctional tactile HMI.

3. Other Wearable HMIs

In addition to the glove-based HMIs, owing to the advantages of high output, light weight, high flexibility/stretchability and applicability to various structural designs [150,151], TENGs have also been developed into sundry biomechanical sensors for other biosignal collection, e.g., eye motion, facial expressions, voice, posture, etc., serving as self-powered wearable HMIs that bring great convenience to people [152,153,154], especially those with disabilities, in the era of information and IoT.
Among these mechanical motions, eye blinking has been proven to be a new, simple and effective triggering method for handicapped people to realize convenient electrical appliance control for smart healthcare/home purposes [155,156,157]. A TENG-based micromotion sensor for eye blink motion monitoring with high sensitivity was reported by Pu et al. as shown in Figure 3a [158]. This sensor works in single-electrode mode and has a multilayered structure, where a fluorinated ethylene propylene (FEP) thin film and a natural latex thin film serve as the tribo-layers, with a thin acrylic annulus as the spacer to reserve the necessary space for the contact and separation process. This tiny sensor can be easily mounted on the arms of glasses by fixators to capture the mechanical micromotion of the skin around the eyes, and functions as an intuitive HMI to control electrical appliances, e.g., a table lamp, electric fan and doorbell, via the simple trigger signal generated by eye blinking. By using a wireless module for data transmission, a hands-free typing system based on adjusting the number of blinks within a period was also successfully demonstrated, which may bring great convenience to daily life, especially for the disabled or for people whose hands are fully occupied while working. Similarly, a non-attached electrode-dielectric TENG sensor for eye blinking sensing was reported by Anaya et al. as illustrated in Figure 3b [159]. This sensor is fully made of soft materials, i.e., PEDOT:PSS-coated conductive textile and silicone rubber, and shows good wearing comfort when placed on the lateral skin tissue of the eye to detect the orbicularis oculi muscle motion. Due to the near-field transmission of signals based on non-contact electrostatic induction, the output electrode can be mounted on the temple of the eyeglasses without a wire connected to the main sensor unit, which significantly improves the simplicity of the whole system. By analyzing the trigger signal induced by eye blinking, diversified hands-free human–machine interaction applications, including cursor control, car/drone control, etc., could be realized to assist people with mobility impairment. Additionally, the developed HMI could also be used for eye fatigue monitoring to evaluate the tiredness of a driver and give proper warnings to avoid danger.
Apart from eye movements, fluctuations induced by the movements of other facial muscles, e.g., the masseter muscle, are also good candidates to be translated into control commands, serving as communication-aid HMIs for the disabled [160]. Inspired by the croaking behavior of frogs, Zhou et al. reported a bionic TENG-based sensor for masseter muscle motion monitoring as illustrated in Figure 3c [161]. By imitating the oral structure and acoustic capsule, flexible PDMS elastomer was made into a sensing membrane and a deformable vibrating membrane, with an air layer as the spacer, to amplify the small fluctuations of the masseter muscle into a significant movement of the vibrating membrane, owing to the varying deformations of films with different radii under the same volume change. TENG technology was then integrated to convert the vibration of the film into an electrical signal, which could be further utilized in a Morse code communication system, a hands-free typing tool, as well as an intelligent authentication system, to achieve efficient collaboration between the disabled and the digitalized world.
Another biosignal that can be effectively utilized for human–machine interaction is the human voice, which has attracted great attention recently due to the rapid development of voice recognition based on artificial intelligence technology [162,163,164]. TENG-based acoustic devices have also been extensively investigated as self-powered microphones [39,73,74,165,166,167,168]. Although these acoustic devices show good performance and functionality, most of them are still relatively rigid, which hinders their application in wearable device platforms that can be comfortably integrated with human skin. To address this issue, Kang et al. developed a skin-attachable microphone with high transparency and adhesion by using hybrid freestanding nanomembranes (NMs) combined with a micropatterned PDMS film and a holey PDMS film in a sandwich structure, as illustrated in Figure 3d [169]. The holey PDMS film effectively enhances the vibration of the freestanding NM compared with a planar PDMS film, resulting in a larger output voltage and excellent sensitivity. The time-dependent waveform of a speech recorded by the wearable TENG microphone was also demonstrated, which closely matched the original sound waveform, proving the good acoustic sensing capability of the developed microphone, especially when attached to a person's neck. A personal voice security system was then successfully established to recognize users' identities with high accuracy by perceiving the collected voiceprint, showing its potential as an HMI for biometric purposes.
In addition, a tactile sensing patch is another common form of wearable HMI, collecting finger touching/sliding movements to realize simple manipulation commands [174,175,176,177,178,179]. Currently, TENG tactile HMIs commonly consist of a large number of sensing pixels, where each pixel is connected to an independent output, resulting in complex readout circuits and output signals [180,181,182,183,184,185,186,187]. In order to simplify the outputs to effectively reduce the difficulty and cost of data collection and processing, some novel solutions with minimalistic electrode designs have been demonstrated recently [188,189,190,191,192,193]. A two-dimensional TENG patch with a 5 × 5 pixel matrix for finger trajectory sensing was developed by Chen et al. as shown in Figure 3e [170]. With only four edge electrodes, the accurate position of the finger sliding or touching along the x and y axes of the patch surface can be obtained with a minimum resolution of ~1.6 mm based on the output ratio of the two pairs of opposite electrodes, which greatly reduces the total number of output channels for multi-pixel sensing. By utilizing another one-dimensional TENG patch to detect the position along the z-axis, the real-time three-dimensional manipulation of a robotic arm was demonstrated and could be applied in various complicated operations, e.g., welding, handling, spraying, etc. A more advanced design was proposed by Shi et al., as depicted in Figure 3f [171], where a bio-inspired spider-net-coding TENG interface was developed for multi-directional sliding sensing with only one output electrode. By adjusting the electrode widths and positions along different directions, the output signal patterns can be differentiated according to the relative amplitudes and positions of the generated peaks in the time domain and can be further translated into binary codes for more straightforward signal processing. Such a coding method endows the minimalistic TENG tactile interface with good reliability and robustness, and makes it applicable to diversified human–machine interaction applications, e.g., drone control, security codes, etc.
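The ratio-based position readout can be illustrated with a short sketch; the patch dimensions and signal values below are assumptions, and a real device would first calibrate the electrode responses before applying such an interpolation.

```python
# Sketch of the four-edge-electrode position readout: the touch coordinate
# along each axis is interpolated from the ratio of the two opposite electrode
# outputs. Patch size and voltages are illustrative assumptions.
PATCH_MM = 80.0   # assumed square patch side length (mm)

def touch_position(v_left, v_right, v_bottom, v_top, size=PATCH_MM):
    # A stronger signal on one electrode means the touch is closer to it,
    # so each coordinate is the signal-weighted position along that axis.
    x = size * v_right / (v_left + v_right)
    y = size * v_top / (v_bottom + v_top)
    return x, y

print(touch_position(v_left=2.0, v_right=6.0, v_bottom=4.0, v_top=4.0))
# -> (60.0, 40.0): nearer the right edge, vertically centred
```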
Though some of the abovementioned HMIs have shown the possibility of realizing robotic control by defining specific commands according to the sensor signals, the number of commands is limited by the sensor design and data format, and as a result, diversified operations for multiple tasks may not be achievable. Additionally, some customized interaction commands also require a certain learning cost for the users. To realize a more intuitive HMI for parallel manipulation with enhanced degrees of freedom in advanced industrial automation or virtual reality interactions, the sensory information of the human pose is of great significance [194,195,196,197]. Zhu et al. proposed a customized exoskeleton enabled by triboelectric bidirectional sensors for joint motion monitoring of the upper limbs, as depicted in Figure 3g [172]. With the well-designed grating patterns and the bistable switch integrated into the sliding-mode TENG rotational sensor, the rotating degree, direction, and speed can be obtained simultaneously to accurately reflect the real-time status of shoulder rotation, wrist twisting, as well as finger bending with a minimum resolution of 4°, and can be further utilized to collaboratively control robotic arms and virtual characters in both real and cyber space. Moreover, a ping-pong/boxing game was demonstrated to verify the capability of the control system for complex and coherent movements, revealing its potential for virtual sports training and rehabilitation applications. Apart from the movements of the upper limbs, spinal bending is also essential towards more complete pose monitoring. A badge-reel-like stretch sensor based on a similar grating-structured TENG was reported by Li et al. as shown in Figure 3h [173]. By analyzing the number of generated peaks, a high sensitivity of 8 V mm−1 and a minimum resolution of 0.6 mm with good robustness and low hysteresis can be achieved for this wearable badge reel, which can be used to monitor changes in a patient's spinal shape when serving as a rehabilitation brace. In addition, such spinal information is valuable when combined with other limb motions towards whole-body movement detection, which can be widely used for human motion capture and reconstruction in the 3D animation/game industry.

4. Robotic-Related HMIs

With the gradual rollout of AI technology around the world, intelligent robots will play a more important role in our society and will gradually replace humans in labor-intensive or dangerous tasks, from the service, manufacturing and medical industries to daily-life assistance, as well as future space exploration [198]. As the medium for perceiving the external world and enhancing interactions with humans, sensors based on TENGs have also been extensively investigated to mimic bionic sensory systems for robots' tactile sensing [199,200,201,202,203], gesture/motion monitoring [204,205], gait analysis [206], etc., thanks to the good compatibility brought by TENGs' wide material choices.
As shown in Figure 4a, a typical TENG tactile sensor for robot perception was reported by Yao et al. [207]. With the novel interlocking microstructures enabled by the surface morphology of natural plants, and the tiny polytetrafluoroethylene (PTFE) burrs fabricated on the tribo-layers, the pressure sensing sensitivity was effectively enhanced 14-fold compared with sensors using flat tribo-surfaces. Owing to its high flexibility, the developed sensor can be easily attached to a robotic hand to measure the pressure distribution, as well as finger bending, during handshaking with humans. Additionally, the capability of the sensor for surface roughness and object hardness recognition was also demonstrated, showing its potential for more advanced robotic dexterous operation and human–machine interactions.
In addition to the perception of external tactile stimulation, auditory information is also a straightforward communication channel for robots to receive feedback from the outside world and interact with humans [208,209]. A TENG-based auditory system for social robotics was reported by Guo et al. as illustrated in Figure 4b [210]. The developed tiny auditory sensor shows an ultrahigh sensitivity of 110 mV/dB based on the triboelectricity generated by contact-separation motions of the Kapton and FEP layers during air-flow-induced vibrations. With the special design of the annular or sectional inner-boundary architecture, a broadband response from 100 to 5000 Hz could be achieved, which almost covers the frequency range of the human voice. After being integrated onto a smart robot, the auditory sensor was successfully utilized to capture music sound waveforms with high quality, which can be further used for high-accuracy voice recognition in human–robot interaction applications.
During the parallel control process of robots, sensing the robot's pose and motions is also critical, as this can be used as feedback information to achieve more precise control and status monitoring [211,212]. As depicted in Figure 4c, Wang et al. reported a self-powered angle sensor that can be mounted on robotic arms for high-resolution angular monitoring [213]. Two rotary sliding TENGs are integrated with a difference in the overlaps of their electrodes, forming a detectable phase difference between the two output electrodes for rotating direction detection, i.e., clockwise or anti-clockwise. By counting the generated pulse number and calculating the corresponding elapsed time, the overall rotation angle and the angular velocity can be obtained, respectively, with a high resolution (2.03 nanoradians), high sensitivity (5.16 V/0.01°), and good signal-to-noise ratio (98.68 dB). A robotic arm equipped with this angle sensor was successfully controlled to reproduce traditional Chinese calligraphy, proving the effectiveness of the collected signals for accurately reflecting the movement trajectory of the robot.
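In the spirit of a quadrature decoder, the two phase-shifted channels can be decoded as sketched below; the per-pulse angle, peak threshold and synthetic channel data are illustrative assumptions rather than the cited implementation.

```python
# Sketch of direction/angle/velocity decoding from two phase-shifted TENG
# channels, similar in spirit to a quadrature encoder readout. The per-pulse
# angle and threshold are assumptions, not the cited sensor's specifications.
import numpy as np
from scipy.signal import find_peaks

DEG_PER_PULSE = 0.01      # assumed angular step per pulse

def decode_rotation(ch_a, ch_b, fs):
    peaks_a, _ = find_peaks(ch_a, height=0.5)
    peaks_b, _ = find_peaks(ch_b, height=0.5)
    if len(peaks_a) == 0 or len(peaks_b) == 0:
        return 0.0, 0.0
    direction = 1.0 if peaks_a[0] < peaks_b[0] else -1.0   # which channel leads
    angle = direction * DEG_PER_PULSE * len(peaks_a)        # accumulated angle
    if len(peaks_a) > 1:
        duration = (peaks_a[-1] - peaks_a[0]) / fs
        velocity = angle / duration                          # deg/s
    else:
        velocity = 0.0
    return angle, velocity

# Two synthetic channels where channel A leads channel B -> positive rotation.
fs = 1000.0
ch_a = np.zeros(1000)
ch_b = np.zeros(1000)
ch_a[[100, 200, 300, 400]] = 1.0
ch_b[[150, 250, 350, 450]] = 1.0
print(decode_rotation(ch_a, ch_b, fs))   # small positive angle and velocity
```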
Thanks to the merits of good flexibility and multi-degree deformation [214,215,216], soft robots [217,218,219] made of soft materials, e.g., silicone rubber, thermoplastic polyurethane (TPU), etc., can make conformal contact with external objects and environments. This makes them applicable to various scenarios without the need for specific designs for different product lines, as required by the widely used rigid robotic manipulators, thereby greatly reducing costs. Considering that many commonly used TENG and soft robot materials have similar Young's moduli, flexible TENG-based sensors have frequently been developed recently to realize tactile/deformation perception for soft robots with good compatibility [220,221,222,223].
A TENG robotic skin for soft robot tactile perception was proposed by Lai et al. as shown in Figure 4d [224]. The sensing skin is made of silicone rubber with triangular micro-prisms patterned on the surface to enhance the pressure sensitivity (9.54 V kPa−1). With a high stretchability of 100%, the sensor can easily be integrated into a soft gripper to sense the different motions during grasping tasks, including approaching, grabbing, lifting, lowering and dropping, according to the variation of the open-circuit voltage at different stages. Self-perception of the muscle motions of a robotic crawler could also be achieved by integrating multiple tribo-skins, showing its scalability for large-area soft robot tactile sensing. Besides the tactile sensing skin, Jin et al. also integrated a gear-structured TENG bending sensor into a soft gripper to form a multifunctional sensory system that detects tactile and deformation-related information simultaneously, as indicated in Figure 4e [225]. With the electrodes distributed along the tactile sensor patch, the accurate position of the external stimulus on the sensor surface can be extracted based on the output ratio of different channels. Moreover, continuous bending monitoring of the pneumatic finger can be realized by counting the generated peak numbers of the TENG bending sensor, with a minimum resolution of about 12°. By fusing the data from these two types of sensors, more valuable information, including grasping position, contact area, object shape and size, can be obtained for grasping tasks, providing the possibility for the soft manipulator to implement high-accuracy object recognition with the help of machine learning analysis.
However, as mentioned in the tactile sensor part, the pulse-like output signals make TENGs more sensitive to external dynamic stimuli, and real-time static status monitoring, i.e., deformation or pressure, of soft robots has to be achieved by fusing them with other sensing mechanisms, e.g., capacitive, resistive, photonic, etc., towards a multifunctional sensory system [229,230]. As shown in Figure 4f, Bu et al. proposed a triboelectric-photonic smart skin for robots' tactile and gesture sensing [226]. By doping aggregation-induced emission (AIE) powder into the flexible silicone rubber substrate, with a grating-structured metal film for exposure area adjustment, the photoluminescence and photocurrent become tunable for continuous tensile measurement within a lateral stretching range of 0–160%. Due to the high electron affinity of the silicone rubber, a triboelectric output can also be generated when external stimuli occur, which can be used for vertical static pressure detection with a maximum sensitivity of 34 mV Pa−1. After being integrated onto a robotic hand, precise joint bending and touch pressure monitoring can be achieved simultaneously, demonstrating the applicability of the triboelectric-photonic fused sensory system for soft robot-related HMI applications. Another hybridizing strategy based on triboelectric and potentiometric sensing for soft robot perception was reported by Wu et al. as illustrated in Figure 4g [227]. Inspired by the slow adapting (SA) and fast adapting (FA) capabilities of human skin, the potentiometric sensing mechanism, which is sensitive to static or slowly varying stimuli, is utilized to detect the compressive strain induced by the external force, while the triboelectric mechanism, suitable for dynamic stimuli sensing, provides instantaneous signal information at the beginning and ending moments of the stimulation. With this multifunctional sensing capability brought by the complementary effects of these two mechanisms, more valuable information, including the pressure and duration during the process of approaching, touching, holding and releasing, can be obtained towards more detailed object manipulation monitoring for soft robotic grippers.
Apart from these humanoid robots, animal-like exploration robots have also become a hot topic recently. By sending these robots to natural or harsh environments where people cannot go, valuable sensory data can be gathered or physical tasks accomplished for environmental monitoring or space exploration purposes [231,232,233,234]. To endow robots with the capability to respond to complex environmental situations, e.g., avoiding obstacles, various advanced sensing technologies based on piezoresistive, piezoelectric, optical and magnetic mechanisms have been investigated [235,236,237,238]. Meanwhile, thanks to the advantages of light weight and low power consumption, TENGs have become a new strategy for developing self-sustainable exploration robots [230]. As shown in Figure 4h, inspired by the hair-based sensory system that animals use to explore the environment, An et al. developed a TENG-enabled self-powered biomimetic whisker mechanoreceptor for robotic tactile sensing [228]. The sensor consists of a fluorinated ethylene propylene (FEP) layer serving as the animal whisker, covered by a biomimetic hair follicle made of two metal electrodes. When the whisker swings between the two electrodes, the potential distribution changes due to triboelectrification and electrostatic induction, which can be used to reflect the deflection direction and amplitude based on the sign and magnitude of the transferred charges, respectively. When integrated into a quadruped robot, the biomimetic whisker can help the robot detect the movements of surrounding objects by analyzing the amplitude and frequency of the vibrational signal. The real-time pressure applied on its feet can also be obtained to reflect the gait and ground environment information, which can effectively help it traverse complex roads in harsh environments for exploration applications.

5. HMIs for Smart Home Applications

The rapid development of numerous smart electronics enables a large variety of applications in the ambient environment to realize the smart home, with intelligent monitoring and response systems for healthcare monitoring, elderly/children care, fall detection, body motion monitoring, automation, and security [239,240]. The current mainstream technologies for smart home applications include camera-based image recognition and commercial humidity/temperature sensors [241]. Although these technologies are continuously developing, drawbacks still exist, such as video-based privacy concerns, large power consumption, bulky volume, and rigidity. Considering the fast development of energy harvesting/self-powered sensing technology (triboelectric, etc.), low-power HMIs can be realized with high convenience, low cost, and self-powered sensing ability to reduce the overall power consumption and extend the lifetime of the system [5,242,243]. Additionally, because of their energy harvesting capabilities, these HMIs can also scavenge ambient energy and convert it into electricity for potential wireless communication.
As a good substitute for mechanical switches, flexible touchpads with multiple arrays/units have been investigated frequently using TENG technology to achieve self-powered HMIs for interacting with household appliances [244,245,246,247]. For these multi-array touchpads, there are two common ways of wire connection. One is connecting each unit to an output channel, which leads to a messy wiring layout with complex processing circuits. Another is to build cross nodes of X-axial channels and Y-axial channels, by which the total number of output channels is greatly reduced, but the issue of crosstalk between the intersected channels cannot be avoided. To solve these problems, Pu et al. developed a subdivision-structured 3D touchpad as illustrated in Figure 5a [248]. With the novel subdivision structure, the overlapping area between the intersected electrodes is significantly reduced, which can effectively suppress the crosstalk problem and improve the position sensing resolution. Additionally, a pressure sensing function was also realized by integrating another three-layer TENG unit for differentiating touches from presses. This sensing array was further designed into an anti-peek built-in code lock, where the position and pressure information can be combined to form a more complex access password compared with methods based only on number location, effectively improving the safety factor of the authorization system in the smart home.
Another smart method to simplify the output channels of touchpads is to use binary codes to define more functions with as few electrodes as possible. As shown in Figure 5b, Qiu et al. reported a sliding-mode TENG-enabled control disk interface based on a binary coding mechanism [249]. Two sensing electrodes with specific patterns are attached to the panel, where one serves as the binary bit "1" and the other as the binary bit "0". Due to the variation in the width and location of these two electrodes along different sliding directions, eight sensing combinations can be achieved based on the 3-bit binary code, e.g., "001", "010", etc., for smart appliance control. Additionally, the sliding speed can also be utilized to adjust the light brightness or fan speed. After integrating a solar cell to form a hybrid energy harvester, self-sustainable capability could be achieved by scavenging both light energy and the mechanical energy from hand tapping. Moreover, this control disk can also be designed into a password input interface with more than 262,000 potential combinations for smart home authentication applications.
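The decoding step essentially reduces to ordering the pulses from the "1" electrode and the "0" electrode in time, as in the sketch below; the peak times and the command mapping are hypothetical placeholders rather than the cited device's actual assignments.

```python
# Sketch: converting the pulse sequence from the "1" electrode and the "0"
# electrode of a coded sliding interface into a 3-bit command. Peak times are
# assumed to be pre-extracted; the command mapping is a hypothetical example.
COMMANDS = {"001": "light on", "010": "light off", "100": "fan on"}

def decode_code(ones_peak_times, zeros_peak_times):
    # Tag every pulse with its bit value, then read the bits in time order.
    events = [(t, "1") for t in ones_peak_times] + \
             [(t, "0") for t in zeros_peak_times]
    bits = "".join(bit for _, bit in sorted(events))
    return COMMANDS.get(bits, "unknown code " + bits)

print(decode_code(ones_peak_times=[0.30], zeros_peak_times=[0.10, 0.52]))
# pulses arrive as 0, 1, 0 -> "010" -> "light off" under this mapping
```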
Apart from the password input panel, the barcode is also a widely used information carrier for personal identification/authentication in smart home applications [255,256], and can be developed with TENG technology towards a self-powered identification system. As depicted in Figure 5c, a triboelectric-based transparent secret code was proposed by Yuan et al. [250]. The information is hidden in the patterned indium tin oxide (ITO) electrodes, with a fluorinated ethylene propylene (FEP) film covered on top serving as the tribo-layer. The elongated design of the ITO stripes can effectively enlarge the contacting area within a short sliding time, and the different lengths contribute to variations in the output amplitudes, which can be further translated into the binary information of "1" or "0" with a reasonable threshold value by a sliding-check or roll-to-roll reader for security defense purposes. However, this peak-height-based sensing method is sliding-velocity dependent, and a non-uniform sliding speed may cause errors in the peak amplitudes, as well as in the corresponding identified information. To solve this issue, Chen et al. proposed using a reference barcode component to improve the reliability of the identification system [251]. As illustrated in Figure 5d, the information card is double-sided, where one side is patterned with standard barcode electrodes with equal intervals as the reference component, and the other side is patterned with the information barcode that is aligned, swiped and measured simultaneously with the reference one. By comparing the number and positions of the generated positive and negative peaks from the two sides, the coded information can be easily converted into the digital information of "1" and "0" with high accuracy even under non-uniform sliding by human hands, demonstrating a more reliable coding method for self-powered access systems in practical usage.
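The benefit of the reference track is that each reference pulse defines a bit slot whose width scales with the local sliding speed; a minimal sketch of this idea, with placeholder timings and an assumed matching tolerance, is given below.

```python
# Sketch of the reference-track readout: each reference pulse defines one bit
# slot, and a bit is read as "1" if the information track produced a pulse
# near that slot, "0" otherwise. Timings and the tolerance are illustrative
# assumptions; tying the tolerance to the local slot spacing is what makes the
# readout insensitive to non-uniform sliding speed.
def decode_with_reference(ref_times, info_times, tolerance=0.25):
    bits = []
    for i, t_ref in enumerate(ref_times):
        # Estimate the local slot width from the neighbouring reference pulses.
        if i + 1 < len(ref_times):
            nxt = ref_times[i + 1]
        else:
            nxt = t_ref + (t_ref - ref_times[i - 1])
        slot = abs(nxt - t_ref)
        hit = any(abs(t - t_ref) < tolerance * slot for t in info_times)
        bits.append("1" if hit else "0")
    return "".join(bits)

# Non-uniform swipe: reference pulses arrive progressively more slowly.
ref = [0.10, 0.22, 0.38, 0.60, 0.90]
info = [0.11, 0.39, 0.88]          # pulses only in slots 1, 3 and 5
print(decode_with_reference(ref, info))   # -> "10101"
```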
In this fast-paced era, people are under increasing pressure from work and life, and more and more people suffer from sleep disorders, which may further increase the risk of other health problems, e.g., obesity, heart disease, and diabetes [257]. To realize long-term sleep behavior monitoring for sleep quality assessment, much work has been done based on real-time pressure sensing [258,259]. However, most of these technologies are still restricted by issues of low sensitivity, high fabrication cost, and non-washability, limiting their practical application for large-area sleep monitoring in daily usage. Recently, the fast development of TENG-based textile technology has shown the great potential of TENGs to be further designed into smart clothes for wearable or household scenarios with high sensitivity and low cost [260,261,262]. Utilizing this technology, a large-scale and washable smart textile based on TENG arrays was developed by Lin et al. as shown in Figure 5e [252]. The proposed bedsheet has three layers, where a layer of wave-shaped PET film is sandwiched between two layers of Ag-coated conductive fabric. When external stimuli occur, the applied pressure enlarges the contact area between the PET layer and the two conductive layers, resulting in an electrical potential change and generating the outputs. With this optimized design, the smart textile shows a good sensitivity of 0.77 V Pa−1 with a fast response time (<80 ms), as well as high durability and stability even after being washed in tap water. By connecting multiple TENG units to form a sensing array, information on the body's posture, position and pressure distribution over an entire night can be collected, providing a reliable dataset for analyzing sleep quality. Additionally, the smart bedsheet can also serve as a warning system to prevent the elderly from falling off the bed, demonstrating a new possibility to realize real-time remote healthcare services in the smart home.
To avoid the privacy concerns introduced by cameras, floor-embedded sensors can be implemented to extract the abundant sensory information associated with human activities in the smart home, such as indoor position and gait-based individual identity [263,264,265]. Thanks to the advantages of low cost, easy fabrication and the wide choice of triboelectric materials, TENG-based sensors commonly show good scalability and are compatible with large-scale manufacturing, making them suitable to be further designed into floor-based HMIs [266,267,268]. A self-powered identity recognition carpet system enabled by TENG-based e-textiles for safeguarding an entrance was reported by Dong et al. as illustrated in Figure 5f [253]. The carpet consists of 128 black and white squares of the same size, where the black blocks are covered with TENG fabrics serving as the sensing regions, and the white blocks are used to reduce interference between adjacent sensing regions. According to the output peaks generated in these black sensing blocks, the walking trajectory of a visitor can be mapped with high accuracy and good stability, which can then be compared with the correct password path to validate the authentication. However, in this design, each sensing block is connected to an independent output channel, which means a complex wire connection and a high signal processing cost. To simplify the output channels of the multi-array sensing system, a more advanced design of a smart mat was reported by Shi et al. as shown in Figure 5g [254]. Six distinct electrode patterns with varying coverage rates, i.e., 0% to 100% with 20% intervals, were designed and fabricated by screen printing, acting as sensing arrays for this floor mat. Because different electrode areas contribute different amounts of induced charge, the arrays are self-distinguishable and can be connected in an interval-parallel manner to form a 3 × 4 mat array with minimal two-electrode outputs. In this way, indoor positioning and activity monitoring can be achieved with minimal output terminals and minimized system complexity, which also benefits the backend signal processing and data analysis. Furthermore, with deep learning enabled data analytics, the identity information associated with gait patterns can be extracted from the output signals, and a high recognition accuracy of 96% could be achieved for 10 persons based on their specific walking gaits, enabling diversified applications in the smart home such as position sensing, activity/healthcare monitoring, and security.
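A minimal sketch of the self-distinguishable readout, assuming a calibrated peak amplitude per coverage rate (the voltages below are placeholders, not measured values), shows how a single parallel-connected channel can tell the six patterns apart.

```python
# Sketch: attributing a step to one of the six coverage-rate patterns sharing
# a single parallel-connected output, using the peak amplitude alone. The
# calibration amplitudes are illustrative assumptions, not measured values.
import numpy as np

COVERAGE_RATES = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
CAL_AMPLITUDE_V = np.array([0.0, 1.0, 2.1, 3.0, 4.2, 5.0])   # assumed calibration

def which_pattern(peak_voltage: float) -> float:
    """Return the coverage rate whose calibrated amplitude best matches the reading."""
    idx = int(np.argmin(np.abs(CAL_AMPLITUDE_V - peak_voltage)))
    return COVERAGE_RATES[idx]

# A step producing a 2.3 V peak is attributed to the 40 %-coverage pattern.
print(which_pattern(2.3))
```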

6. ML-Enabled Advanced HMIs

In terms of complex data analysis, the new trend of AI-enabled machine learning has shown a new direction for enhancing the functionalities of sensors [1,269,270]. Due to the powerful feature extraction capabilities of machine learning, more comprehensive/detailed sensory information can be utilized to realize diversified applications, e.g., gesture/pose estimation, voice recognition, object recognition, etc. [81,87,164], for advanced human–machine interactions as illustrated in Figure 6.
Recently, several AI-enabled TENG-based HMIs have been successfully developed [89,92,276,277,278] based on algorithms such as the support vector machine (SVM), neural network (NN), etc., where the subtle features hidden in the triboelectric waveform, including contact sequence, impact vibration, etc., have been proven to effectively enhance the recognition capability of the intelligent sensory system. Compared with state-of-the-art works [279,280,281] that use a large number of resistive/capacitive sensor nodes for high-accuracy ML analysis, the minimalistic approach of TENGs shows comparable performance with significantly lower power consumption. In this section, some typical examples of TENG-based intelligent HMIs for diversified applications are reviewed.
For cybersecurity applications, keystroke-dynamics-enabled authentication systems have been proven to be an effective approach to enhance the security level based on people's typing attributes, with non-invasive monitoring characteristics [282,283,284]. A TENG-based smart keyboard for keystroke dynamics monitoring was developed by Wu et al. as illustrated in Figure 6a [271]. There are a total of 16 silicone-based keys with high flexibility and stretchability forming a multichannel keypad array, where each key consists of a contact-separation-mode TENG to convert the typing behavior into electrical signals, and a shield electrode to minimize environmental interference. By using an analog-to-digital converter (ADC) to collect the open-circuit-voltage signals, keystroke-related features, i.e., typing latencies, hold time and signal magnitudes, can be acquired simultaneously with specific signal processing, e.g., denoising, baseline elimination, etc. Following principal component analysis (PCA) for feature dimensionality reduction, a multi-class SVM classifier was utilized to recognize the identities of 5 users based on the established dataset (150 sets of data for each user), and a high accuracy of 98.7% could be achieved, showing the feasibility of keystroke-dynamics-enabled authentication systems for practical usage.
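A compact sketch of this feature-to-classifier chain (standardization, PCA, then a multi-class SVM) is given below; the synthetic features and the number of retained PCA components are placeholders standing in for the processed keypad signals rather than the published setup.

```python
# Sketch of a keystroke-dynamics pipeline: per-keystroke features (latencies,
# hold times, peak magnitudes) -> PCA -> multi-class SVM. Synthetic data and
# the retained component count are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n_users, sets_per_user, n_features = 5, 150, 48   # user/set counts follow the text
X = np.vstack([rng.normal(loc=u, scale=1.0, size=(sets_per_user, n_features))
               for u in range(n_users)])
y = np.repeat(np.arange(n_users), sets_per_user)

pipeline = make_pipeline(StandardScaler(),
                         PCA(n_components=10),     # feature dimensionality reduction
                         SVC(kernel="rbf", gamma="scale"))
print("cross-validated accuracy:", cross_val_score(pipeline, X, y, cv=5).mean())
```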
In addition to keystroke dynamics, identification based on gait analysis is also a promising technology for biometric authentication applications [46,285]. With the help of artificial intelligence, complex personal information regarding the identity, health status and real-time activity of users can be delivered simultaneously by analyzing the gait patterns acquired from a floor- or sock-based sensory system. Zhang et al. proposed deep-learning-enabled TENG socks for gait analysis as depicted in Figure 6b [272]. A textile-based TENG pressure sensor, consisting of a silicone rubber film with patterned frustum structures as the negative tribo-layer, a nitrile thin film as the positive tribo-layer and two conductive textiles as output electrodes, was fabricated and integrated onto a smart sock for gait monitoring with high sensitivity (0.4 V kPa−1) and a large sensing range (>200 kPa). With an optimized four-layer one-dimensional (1D) convolutional neural network (CNN) model for automatic feature extraction from the original walking spectra, a high recognition accuracy of 96% could be achieved for 5 participants with varying weights, and could still be maintained above 93.5% when the number of people increased to 13, making it applicable for most indoor scenarios, e.g., home or office. By combining this intelligent sock with an IoT module for wireless communication, an artificial intelligence of things (AIoT)-enabled two-stage recognition platform was established at the cloud server to realize family member identification and real-time indoor activity monitoring, i.e., running, walking and jumping, simultaneously, revealing its potential for future smart home applications.
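For reference, a minimal PyTorch sketch of a four-convolutional-layer 1D CNN of this kind is shown below; the channel counts, kernel sizes, window length and class count are assumptions rather than the authors' exact architecture.

```python
# Minimal PyTorch sketch of a four-layer 1D-CNN classifier for gait windows.
# Layer sizes, window length and class count are illustrative assumptions.
import torch
import torch.nn as nn

class GaitCNN(nn.Module):
    def __init__(self, n_channels=1, n_classes=13):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(2),
            nn.Conv1d(64, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):                      # x: (batch, channels, samples)
        return self.classifier(self.features(x).squeeze(-1))

model = GaitCNN()
dummy_batch = torch.randn(8, 1, 500)           # 8 one-channel gait windows
print(model(dummy_batch).shape)                # torch.Size([8, 13])
```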
With AI for comprehensive sensory information extraction and autonomous learning, sophisticated hand gestures can be discriminated by glove-based HMIs for advanced control or sign language interpretation applications [286,287]. Though several works have demonstrated the feasibility of TENG-based sign language perception platforms with minimalistic design and high recognition accuracy [131,139,288], most of them are limited to identifying a few discrete and simple words or letters and are not suitable for real-time sentence recognition. To deal with these issues, Wen et al. developed a more advanced TENG-enabled sign language recognition system with an improved glove design and training strategies, as shown in Figure 6c [273]. Apart from the strain sensors mounted on each finger for finger bending monitoring, additional sensors are placed on the wrists, fingertips and palms of the signers to capture subtle features of more comprehensive sign language gestures. By utilizing a 5-layer CNN for feature extraction, accuracies of 91.3% and 93.5% could be achieved for 50 words and 20 complete sentences, respectively, based on a non-segmentation method. To further extend its potential to new/never-seen sentence recognition, a segmentation strategy was used, where the signal spectrum of a whole sentence was first split into word units and then, based on the correlation between the word units and the sentence, the original sentence's information could be reconstructed by the AI framework with an accuracy of 85.58%. With this novel learning approach, new/never-seen sentences could also be identified with an average accuracy of 86.67%, showing a new methodology to effectively expand the dataset and improve the practicality of the sign language recognition system.
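In a highly simplified form, such a segmentation strategy can be pictured as first locating word-like active segments in the sentence-level recording and then classifying each segment with a word-level model; the energy-threshold segmentation sketched below is a conceptual illustration only and differs from the correlation-based reconstruction actually used in [273].

```python
# Conceptual sketch only: split a sentence-level glove recording into word-like
# active segments using an energy threshold, then classify each segment with a
# word-level model supplied by the caller. Sampling rate and threshold are assumptions.
import numpy as np

def segment_words(signal, fs=100, win=0.2, thresh=0.1):
    """Return (start, end) sample indices of active segments in a 1D envelope."""
    hop = int(win * fs)
    energy = np.array([np.mean(np.abs(signal[i:i + hop]))
                       for i in range(0, len(signal), hop)])
    segments, start = [], None
    for k, active in enumerate(energy > thresh):
        if active and start is None:
            start = k * hop
        elif not active and start is not None:
            segments.append((start, k * hop)); start = None
    if start is not None:
        segments.append((start, len(signal)))
    return segments

def recognize_sentence(signal, word_classifier):
    """Classify each word unit and join the predictions into a sentence string."""
    return " ".join(word_classifier(signal[s:e]) for s, e in segment_words(signal))

# Synthetic example: two bursts of activity -> two candidate word units.
sig = np.concatenate([np.zeros(100), 0.5 * np.ones(150),
                      np.zeros(100), 0.5 * np.ones(120), np.zeros(80)])
print(segment_words(sig))
```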
As mentioned in the section on robotics-related HMIs, TENG-based sensors have been frequently investigated for developing low-power-consumption robot perception systems due to their high flexibility and self-powered nature. However, most of the abovementioned works mainly focus on simple manual analysis of the collected sensory information [221,223], e.g., deformation, tactile, etc. To realize more advanced interactive functions, e.g., pose estimation, surface roughness perception and object recognition, AI-enabled analytics are needed to capture more valuable features and endow a self-discrimination ability [225]. A TENG-enhanced smart soft robotic manipulator for AIoT-enabled virtual shop applications was reported by Sun et al. as shown in Figure 6d [274]. A TENG bending sensor consisting of a rotating gear was utilized to monitor the real-time deformation of the pneumatic finger according to the number of peaks generated during the stretching process, and a TENG tactile sensor with a distributed electrode design was used for contact position and area detection. With the aid of a 1D CNN ML algorithm for data processing, an intelligent robotic gripper that fuses these two kinds of sensory information could be easily achieved, realizing a high recognition accuracy of 97.143% for 28 grasped objects with different shapes and sizes. Moreover, temperature distribution information could also be acquired by a poly(vinylidene fluoride) (PVDF) pyroelectric temperature sensor, towards a comprehensive and fully self-powered perception system. This AI-enhanced smart gripper with multifunctional sensing capability was further applied in a digital-twin-based virtual shop to provide users with real-time feedback information about the goods, as well as a more immersive online shopping experience, showing great potential for advanced human–machine interactions in unmanned working spaces.
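As a minimal illustration of the peak-counting principle used by such grating-type bending sensors (the calibration constant and synthetic signal below are placeholders, not values from [274]):

```python
# Minimal sketch, assuming the grating-type bending sensor produces one voltage
# peak per fixed angular increment during a stretch; calibration is a placeholder.
import numpy as np
from scipy.signal import find_peaks

def bending_angle(voltage, degrees_per_peak=5.0, min_height=0.2):
    """Estimate finger bending by counting TENG output peaks during one stretch."""
    peaks, _ = find_peaks(voltage, height=min_height, distance=10)
    return len(peaks) * degrees_per_peak

t = np.linspace(0, 1, 500)
v = np.sin(2 * np.pi * 8 * t)          # synthetic burst with 8 positive peaks
print(bending_angle(v))                # ~40 degrees with the assumed calibration
```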
Due to environmental interference, sensor performance can vary with the ambient conditions, e.g., the humidity effect on triboelectric sensors [127], which may degrade the stability of the established recognition system. For the TENG HMIs based on the grating-sliding or output-ratio sensing mechanisms mentioned above, although the output amplitude is influenced by varying environmental conditions, the generated peak number or the output ratio of the distributed electrodes does not change, providing strong resistance to environmental interference. Hence, ML results obtained from such output data show better robustness against environmental factors. In contrast, for ML results obtained from the real-time output spectrum, the variation in voltage amplitude under different environmental conditions will inevitably affect the recognition results. However, if the data are collected under different environmental conditions and combined into a more generalized dataset, i.e., each category contains data captured under different environmental conditions, the influence of environmental factors on accuracy can be mitigated to a certain extent thanks to the more generalized trained model. Similarly, the influence of different human behavioral habits can also be reduced by enhancing the generalization ability of the dataset, i.e., collecting data from different users so that individual differences are taken into account, as sketched below.
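A minimal sketch of this dataset-generalization idea is given below, where each gesture class contains samples recorded under two simulated humidity conditions before training; the amplitude-offset model, classifier and data are illustrative assumptions only.

```python
# Minimal sketch: pool samples of each gesture class recorded under different
# (simulated) humidity conditions so the trained model is less sensitive to drift.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def record(n_per_class, amplitude_drop):
    """Placeholder recordings: 5 gesture classes, 16 features, condition-dependent scaling."""
    y = np.repeat(np.arange(5), n_per_class)
    X = rng.normal(loc=y[:, None], scale=1.0, size=(len(y), 16)) * (1.0 - amplitude_drop)
    return X, y

X_dry, y_dry = record(100, amplitude_drop=0.0)   # e.g., low relative humidity
X_wet, y_wet = record(100, amplitude_drop=0.4)   # e.g., high humidity, lower output

# Generalized dataset: every class contains data captured in each environment.
X = np.vstack([X_dry, X_wet]); y = np.concatenate([y_dry, y_wet])
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)
print("mixed-condition accuracy:", SVC().fit(Xtr, ytr).score(Xte, yte))
```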
Another possible solution for this issue is to fuse sensors based on different mechanisms to build a more robust sensory system by utilizing the complementary effects of different sensors [289,290,291]. In Figure 6e, Wang et al. proposed a bioinspired deep-learning-based data fusion architecture that integrates vision data and somatosensory data for high-accuracy gesture recognition in harsh/dark environments [275], where the somatosensory information contains complementary features for maintaining recognition accuracy, especially when images are noisy, under-exposed or over-exposed. To ensure high compatibility with visual information, a highly transparent stretchable strain sensor using single-walled carbon nanotubes (SWCNTs) was developed, which can collect high-quality somatosensory data without affecting the visual information. With the visual data captured by commercial cameras, a bioinspired somatosensory-visual (BSV) associated architecture was then proposed for multimodal data fusion, mimicking the biological visual and somatosensory interactions in the multisensory neurons of the human brain. Using this approach for human gesture recognition, a high accuracy of 100% can be maintained even with low-quality images, proving the complementary effect of the somatosensory information on vision-based gesture recognition. Additionally, this architecture has also been demonstrated for more accurate robot manipulation, where 98.3% accuracy could be achieved with sufficient illumination and 96.7% accuracy could still be maintained in a dark environment, illustrating its applicability to human–machine interactions in harsh environmental scenarios. Such a data fusion strategy provides a good development direction for future TENG-based HMIs with better stability and applicability.
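Without reproducing the BSV architecture of [275], the general idea of fusing a vision branch with a somatosensory branch can be sketched as a two-branch network whose features are concatenated before classification; all layer sizes, input shapes and the class count below are assumptions.

```python
# Minimal sketch (assumed layout, not the BSV architecture of [275]): a two-branch
# network fusing an image with a 1D strain-sensor sequence by feature concatenation.
import torch
import torch.nn as nn

class FusionNet(nn.Module):
    def __init__(self, n_classes=10):
        super().__init__()
        self.vision = nn.Sequential(          # small CNN on 64x64 grayscale frames
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(4),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.somato = nn.Sequential(          # small 1D CNN on 5-channel strain sequences
            nn.Conv1d(5, 16, 5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
        )
        self.head = nn.Linear(16 + 16, n_classes)

    def forward(self, img, strain):
        return self.head(torch.cat([self.vision(img), self.somato(strain)], dim=1))

out = FusionNet()(torch.randn(4, 1, 64, 64), torch.randn(4, 5, 200))
print(out.shape)                              # torch.Size([4, 10])
```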

7. Haptic-Feedback Enabled HMIs

The completeness of an HMI system relies not only on dexterous sensing units that can monitor various human physiological signals and motions, but also on a specialized feedback module that can provide the necessary stimulations to assist cognition of the manipulation status, so that a control-feedback loop can be established [292]. Noticeably, the consistency between the artificially stimulated sensation and the real sensation perceived by humans is key to improving the user experience and the value of the feedback information. Therefore, much of the feedback research also focuses on biomimetic stimulators, including kinesthetic feedback, which can reflect the spatial movement of different body parts, and cutaneous feedback, which is in charge of delivering tactile and thermal stimulations to the skin's various receptors [293,294,295,296].
Considering conformability to the human body, wire-driven and pneumatic actuators, as commonly used actuation techniques, have also been adopted in feedback systems to achieve kinesthetic actuation [297,298]. Kang et al. reported a polymer-based soft wearable exo-glove using tendon-driven feedback, as shown in Figure 7a [299]. The main functional part consists of a body with stretchable finger modules and thimbles connected to the corresponding finger modules. The actuation wires are routed through sheaths embedded in the main body and linked to the motor box. This device can help people with hand disabilities recover the capability of grasping. Motor-based vibrators are frequently used in mobile phones and joysticks as feedback units. Yu et al. developed a skin-integrated wireless haptic interface for VR and remote interactions based on an electromagnetic vibrator array (Figure 7b) [300]. This elastomer-packaged device can be directly attached to the human skin and is powered wirelessly via an NFC coil. The design of serpentine Cu connectors ensures stable performance under strain.
Dielectric elastomer actuators (DEAs) have become another popular research direction in recent years. The electric-field-induced deformation of a DEA can be shaped into vibration, stretching, bending and other modes through different structural designs [312]. In Figure 7c, Wang et al. presented a hydraulically amplified self-healing electrostatic actuator that can mimic muscle contraction motions under an applied electric field [301]. The actuator is made of rectangular polymer shells filled with a liquid dielectric, and the electrodes are coated at both ends. Hence, the applied voltage induces a Maxwell stress that causes the electrodes to zip together, and the shell then contracts due to the hydraulic pressure. This actuator achieved a maximum linear contraction of about 24% at a loading of 0.2 N. On the other hand, as shown in Figure 7d, a feel-through low-voltage DEA, made of multilayer PDMS sandwiched between single-walled carbon nanotube (SWCNT) electrodes, was demonstrated for fingertip haptic feedback by Ji et al. [302]. The combination of three active layers is only 18 µm thick, which ensures the mechanoreceptors of the skin remain sensitive to external stimuli. When a voltage is applied, the surface area of the elastomer increases or decreases to stretch or compress the skin for feedback. Similarly, Pyo et al. presented a pyramid-microstructured DEA layer, where an applied AC voltage delivers a vibrational stimulus (Figure 7e) [303]. Interestingly, Hwang et al. developed a light-driven, low-power vibrotactile actuator using thin poly(3,4-ethylenedioxythiophene) doped with p-toluenesulfonate (PEDOT-Tos) and PET films, as illustrated in Figure 7f [304]. The irradiation of near-infrared (NIR) light from an LED initiates a thermoelastic bending deformation due to the mismatch of thermal expansion coefficients. Therefore, with the assistance of an LED array, a programmable vibrotactile feedback system was demonstrated.
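For the electrostatically driven modes described above, the driving force can be estimated from the commonly used DEA (Maxwell) pressure relation, p = ε0εr(V/d)², where ε0 is the vacuum permittivity, εr the relative permittivity of the elastomer, V the applied voltage and d the film thickness. Since the pressure scales with (V/d)², thinner active layers reach a given actuation pressure at proportionally lower voltages, which helps explain why ultrathin stacks such as the 18 µm device above can operate at reduced driving voltages; the exact electromechanical models used in the cited works may of course differ.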
There is also research focusing on electrostatic force-based feedback, where an applied voltage causes two electrodes at different electrical potentials to attract each other. In Figure 7g, Hinchet et al. designed a high-force-density electrostatic clutch for a VR feedback glove [305]. The clutch is simply made of conductive textile coated with a poly(vinylidene fluoride-trifluoroethylene-chlorotrifluoroethylene) (P(VDF-TrFE-CTFE))-based high-friction insulation layer. The proposed device can generate frictional shear stress up to 21 N cm−2 at 300 V, which can be used to block finger motion during VR events. Meanwhile, Leroy et al. presented a multimode electrostatic actuator with hydraulically amplified haptic feedback, as illustrated in Figure 7h [306]. The liquid dielectric is encapsulated within a chamber formed by the flexible membrane and the central stretchable membrane, and the electrode pairs are located around the center. The voltage-induced electrostatic force attracts the electrodes and squeezes the outer region of the chamber, and hence the central membrane expands under hydraulic pressure. By controlling the electrode designs and activation sequences, the device can realize multimode haptic feedback.
To enhance the feedback functionality of pneumatic actuators with a minimal number of air inlets, various techniques have also been integrated to tune the activation region of the device. Besse et al. reported a large reconfigurable pneumatic haptic array with the aid of a flexible shape memory polymer (SMP) membrane, as depicted in Figure 7i [307]. A heater changes the Young's modulus of the SMP membrane and allows it to deform under positive or negative air pressure. As a result, the array design of chambers and membranes enables a dense, reconfigurable tactile system. On the other hand, Qu et al. developed a refreshable braille display system based on pneumatic actuation and a TENG-based DEA (Figure 7j) [308]. A similar air chamber with a membrane design was adopted. As mentioned earlier, the DEA can be stretched under the applied TENG voltage; together with the air pressure, the DEA membrane is then raised to form a braille dot.
Moving forward, TENGs have also been directly used as feedback units via their generated voltage. As shown in Figure 7k, Shi et al. designed a TENG array with ball electrodes to build a feedback system for VR [309]. When the slider slides across the TENG array, an electrical output is generated via the triboelectrification effect. With the ball electrodes in direct contact with human skin, the TENG electrostatic discharge can be delivered as an electrical virtual tactile stimulation. Hence, the sliding trajectory can be sensed on the skin as a feedback function.
Thermal sensation, as another important function of human skin, can bring a new dimension of feedback to enrich the reconstruction of the environment in cyber or remote space. The collection of temperature information is also critical to prevent potential damage during remote control. Oh et al. developed a multimodal sensing and feedback glove with resistive strain sensors, vibrators, and thermal feedback units (Figure 7l) [310]. Using the direct ink writing technique, eGaIn liquid metal is printed into a meandered shape to create units that provide both strain sensing and, when powered, thermal feedback. Moving forward, achieving both heating and cooling capabilities is the next challenge for wearable and flexible feedback systems. As illustrated in Figure 7m, a skin-like thermo-haptic device with thermoelectric units was developed by Lee et al. [311]. This device consists of serpentine Cu electrodes, a thermally conductive elastomer, and n-type/p-type thermoelectric pellets. Based on the Peltier effect, this flexible thermo-haptic skin can alternate between heating and cooling modes with a temperature difference of 15 °C.
In general, haptic feedback technologies can drastically improve how information is interpreted, going beyond the passive reception of digitized data. Equipping users with wearable feedback systems not only paves the way for immersive interaction but also improves manipulation efficiency via rapid cognition of the target environment [313,314]. Research into more feedback parameters, such as smell and airflow, will eventually establish a fully biomimetic feedback system.

8. Towards Self-Sustainable/Zero-Power/Passive HMI Terminals in the 5G/IoT Era

In this information age, the expansion of the Internet of things comes not only from mobile phones, tablets, and computers, but also from millions of other smart devices [315] that are connected to the Internet through wireless communication technologies, e.g., Wi-Fi, Bluetooth, etc. Though sensors based on self-powered mechanisms [154,316], i.e., triboelectric and piezoelectric, in IoT devices may not need a power supply, the signal transmission modules still entail high power consumption, especially when there are massive numbers of sensor nodes under the IoT framework. In addition, as the traditional power supply for portable devices, batteries suffer from drawbacks of environmental contamination and limited lifespan, as well as the annoying replacement or recharging process. To effectively extend the lifespan of portable or remote devices, energy harvesters, which can convert biomechanical energy or ambient waste energy into electricity based on electromagnetic, piezoelectric, and triboelectric mechanisms, have been frequently investigated recently to replace batteries as the power supply units towards self-sustainable IoT systems/HMIs [317,318,319].
As shown in Figure 8a, an intelligent walking stick, consisting of a top-press TENG for contact point sensing, a rotational TENG for gait abnormality detection and a rotational electromagnetic generator (EMG) for energy harvesting, was reported by Guo et al. [320]. With deep-learning-enabled data analytics, complicated sensing functions including disability evaluation, identity recognition and activity status identification could be implemented through the TENG sensor outputs. Additionally, the linear-to-rotary structure enables the integrated EMG to harvest ultra-low-frequency biomechanical energy from human motions and provide an average output power of 27.5 mW under 1 Hz stimuli. By stacking two EMGs for higher power output and using customized circuits for power management, a self-sustainable system capable of locomotion tracing and temperature/humidity monitoring was successfully achieved, where a temperature/humidity sensing module could be driven continuously under 1 Hz operation, and a GPS and wireless module could work for 6 s after every 90 s of charging. Moreover, by transmitting the TENG sensor signal to an AI cloud/server via the 5G network for further analysis, real-time monitoring of the user's location and well-being status could be achieved in outdoor environments, showing the feasibility of sustainable HMIs enabled by energy harvesters for future IoT applications.
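A back-of-envelope energy budget helps explain this duty-cycled operation; the power-management efficiency and the supportable burst power below are rough assumptions for illustration, not figures reported in [320].

```python
# Back-of-envelope duty-cycle budget for the walking-stick system (illustrative only;
# conversion/storage losses and the actual GPS-module power are assumptions).
P_HARVEST = 2 * 27.5e-3      # W, two stacked EMGs at ~1 Hz walking (from [320])
T_CHARGE = 90.0              # s, charging phase between transmissions
T_ACTIVE = 6.0               # s, GPS + wireless module on-time per cycle
EFFICIENCY = 0.6             # assumed overall power-management efficiency

energy_per_cycle = P_HARVEST * T_CHARGE * EFFICIENCY   # J stored per 90 s of walking
max_burst_power = energy_per_cycle / T_ACTIVE          # W the 6 s burst can draw
print(f"stored per cycle: {energy_per_cycle:.2f} J, "
      f"supportable burst power: {max_burst_power * 1e3:.0f} mW")
# -> roughly 3 J per cycle, i.e., a burst of around 0.5 W is sustainable
```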
However, current sustainable HMIs enabled by integrated energy harvesting units are not suitable for continuous wireless monitoring or fast information exchange due to the inevitable charging phase [61,324,325,326]. The additional energy harvesters or power management circuits also increase the size and cost of the whole device, reducing the applicability for wearable scenarios. To address these issues, some works utilize the high-voltage transient of the TENG to realize wireless coupling and achieve completely zero power consumption at the sensor terminal with a minimalistic design [327,328]. As shown in Figure 8b [321], Mallineni et al. reported a wireless TENG tactile patch consisting of a graphene polylactic acid (gPLA) nanocomposite and Teflon serving as the tribo-layers, which can generate a high electric field in the surrounding air, enabling ~3 m wireless sensing thanks to the extremely high output voltage (>2 kV) under simple mechanical stimuli, e.g., hand tapping. By connecting a customized signal processing circuit to the receiver, the hand tapping signal could be collected wirelessly in real time without a power supply or even a signal transmitter on the sensor side, realizing a truly zero-power sensing terminal. Such a self-powered HMI could function as a controller for diversified smart home applications, e.g., activating lights, displays, photo frames, or even security systems via Morse code based passwords, showing great advantages in terms of size and cost when compared with conventional HMIs that need bulky power supply components or wireless circuit modules. However, a wireless sensing solution that uses the signal amplitude as the sensing parameter is easily affected by the ambient environment, so a more advanced self-powered wireless network, which detects the frequency of the TENG output via a mechanical-switch-enabled frequency-boosting strategy, was proposed by Wen et al. as shown in Figure 8c [322]. With the enhanced frequency brought by the switch-induced instantaneous discharging, the TENG output signal could be easily transmitted wirelessly through a pair of coils, serving as a reliable reference for force calibration with high sensitivity (434.7 Hz N−1) and a large sensing range thanks to the stable resonant frequency. By changing the wire connection of the TENG sensor layers, i.e., in series or parallel, or adjusting the capacitors connected to different pixels in a sensor array, a tunable frequency could be achieved for complicated manipulation, e.g., multi-degree-of-freedom 2D car control or 3D VR drone control, revealing the feasibility of zero-power HMIs based on TENG-enabled frequency-shift wireless sensing for IoT applications.
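On the receiver side, such a frequency-coded signal can in principle be decoded by locating the dominant spectral peak and converting its shift from the no-load resonance into force via the reported sensitivity; the sampling rate, assumed resonant frequency and simulated signal in the sketch below are illustrative assumptions, not parameters of the system in [322].

```python
# Minimal sketch of decoding a frequency-coded TENG signal at the receiver coil:
# find the dominant spectral peak, then convert the frequency shift into force.
import numpy as np

FS = 100_000                 # Hz, receiver sampling rate (assumed)
SENSITIVITY = 434.7          # Hz per N, reported in [322]
F0 = 20_000                  # Hz, assumed no-load resonant frequency

def dominant_frequency(samples, fs=FS):
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    return np.fft.rfftfreq(len(samples), 1 / fs)[np.argmax(spectrum)]

def force_from_frequency(f_meas, f0=F0, sensitivity=SENSITIVITY):
    return (f_meas - f0) / sensitivity     # N

t = np.arange(0, 0.05, 1 / FS)
rx = np.sin(2 * np.pi * (F0 + 2 * SENSITIVITY) * t)   # simulated response under ~2 N load
f = dominant_frequency(rx)
print(f"{f:.0f} Hz -> {force_from_frequency(f):.2f} N")
```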
Another strategy to realize battery-free sensor nodes with continuous sensing capability under the IoT framework is to use passive wireless technologies, including near-field communication (NFC) [329,330], radio frequency identification (RFID) [331,332], and surface acoustic wave (SAW) sensors [333,334]. Compared with the abovementioned wireless transmission methods based on TENG-output-enabled electric/magnetic coupling, these passive wireless approaches commonly show better stability in varying environmental circumstances, as well as smaller wireless sensing and receiving units. A passive wireless TENG sensor using a surface acoustic wave resonator (SAWR) was reported by Tan et al. as illustrated in Figure 8d [323]. By connecting a TENG force sensor to a SAWR, the frequency and amplitude of the SAWR's response signal can be modulated by the TENG output, so that the radio frequency (RF) signal carries the tactile-related sensory information, which can then be extracted by the RF reader through demodulation. All the energy in this wireless transmission system is provided by the remote RF reader, making the sensor terminal a completely passive sensing node capable of continuous monitoring with a transmission distance larger than 2 m and a high sensitivity of 23.75 kHz V−1. This system could also act as a passive controller for trigger signal detection and Morse code based wireless manipulation, or as a wireless matrix keyboard with resonators working at different center frequencies, showing the promising prospects of distributed sensing and human–machine interaction in this 5G/IoT era.

9. Conclusions and Prospects

With the advantages of wide material choices and diversified structural designs, TENG-based sensing technology has been frequently investigated for low-power HMIs in this 5G/IoT era. In this review, we systematically summarize the key technologies and progress of TENG-based HMIs in terms of different application scenarios, including wearable HMIs, robotic-related HMIs and HMIs for smart home applications. By using ML analytics for automatic feature extraction, advanced interactions, such as biometric authentication, gait analysis, gesture recognition and object recognition, can be implemented to enhance the functionalities of TENG-based HMIs towards intelligent interactive systems in the age of AI. The state-of-the-art haptic feedback technologies have also been reviewed and discussed, yielding a promising research direction for developing self-driven immersive HMIs when TENG technology is brought in. Furthermore, the integration of self-powered TENG sensors with energy harvesters or passive wireless transmission technology also makes it possible to realize battery-free/self-sustainable HMI terminals.
Despite the notable progress in developing TENG-based HMIs for diversified applications, there are still challenges to be solved for the current solutions. First, the peak-like output signals make TENG-based sensors suitable only for sensing dynamic stimuli; they cannot detect continuous (quasi-static) variation owing to the rapid shift of the electrical states. Though methods based on grating-sliding mode TENGs have been frequently reported [133,172], the resolution is limited by the size and spacing of the grating, and it is difficult to reach a very high level without advanced fabrication processes [335], e.g., MEMS processing, screen printing, micromachining, etc., which implies high fabrication cost. Other methods based on high-impedance readout circuits [149] or nanophotonic modulators [135,336] need complicated measurement systems, which are unlikely to be made into portable sensing devices for daily usage. Therefore, a reliable and convenient way to achieve continuous sensing capability for TENG-based HMIs that is compatible with IoT mobile platforms is needed. Second, though energy harvesting or passive wireless transmission technology shows the feasibility of realizing battery-free/self-sustainable HMI terminals with TENG sensors, limitations still exist, e.g., intermittent operation of energy-harvester-integrated systems due to the charging phase, and short transmission distance for electric/magnetic coupling or passive wireless transmission. Further investigation of battery-free/self-sustainable systems with continuous operation capability and long wireless transmission distance is still desired. Third, the durability of TENG-based HMIs for long-term usage and the robustness of the sensor performance in varying environmental conditions are also major concerns, considering the inevitable wear of the friction layers and the triboelectric charge loss under high humidity, which may cause output instability and pose great challenges for sensor collaboration. Moreover, TENG sensors can be fused with other sensing mechanisms towards multimodal sensor fusion with AI analysis [337] for multifunctional purposes or more robust performance under varying environmental conditions. In this regard, the seamless implementation of low-power/self-sustainable TENG-based HMIs will shed light on the harmonious coexistence of humans and machines in the future IoT era, along with immersive and efficient interactions in numerous scenarios.

Author Contributions

Conceptualization, Z.S., M.Z. and C.L.; investigation, Z.S. and M.Z.; resources, Z.S. and M.Z.; writing—review and editing, Z.S., M.Z. and C.L.; supervision, C.L.; funding acquisition, C.L. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Advanced Research and Technology Innovation Centre (ARTIC), the National University of Singapore under Grant (project number: R-261-518-009-720); “Intelligent monitoring system based on smart wearable sensors and artificial technology for the treatment of adolescent idiopathic scoliosis”, the “Smart sensors and artificial intelligence (AI) for health” seed grant (R-263-501-017133) at NUS Institute for Health Innovation & Technology (NUS iHealthtech); the Collaborative Research Project under the SIMTech-NUS Joint Laboratory, “SIMTech-NUS Joint Lab on Large-area Flexible Hybrid Electronics”; and National Research Funding—Competitive Research Program (NRF-CRP) (R-719-000-001-281).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zhu, M.; He, T.; Lee, C. Technologies toward next Generation Human Machine Interfaces: From Machine Learning Enhanced Tactile Sensing to Neuromorphic Sensory Systems. Appl. Phys. Rev. 2020, 7, 031305. [Google Scholar] [CrossRef]
  2. Yin, R.; Wang, D.; Zhao, S.; Lou, Z.; Shen, G. Wearable Sensors-Enabled Human–Machine Interaction Systems: From Design to Application. Adv. Funct. Mater. 2021, 31, 2008936. [Google Scholar] [CrossRef]
  3. Wang, H.; Ma, X.; Hao, Y. Electronic Devices for Human-Machine Interfaces. Adv. Mater. Interfaces 2017, 4, 1600709. [Google Scholar] [CrossRef]
  4. Arab Hassani, F.; Shi, Q.; Wen, F.; He, T.; Haroun, A.; Yang, Y.; Feng, Y.; Lee, C. Smart Materials for Smart Healthcare– Moving from Sensors and Actuators to Self-Sustained Nanoenergy Nanosystems. Smart Mater. Med. 2020, 1, 92–124. [Google Scholar] [CrossRef]
  5. Dong, B.; Shi, Q.; Yang, Y.; Wen, F.; Zhang, Z.; Lee, C. Technology Evolution from Self-Powered Sensors to AIoT Enabled Smart Homes. Nano Energy 2021, 79, 105414. [Google Scholar] [CrossRef]
  6. Principi, E.; Squartini, S.; Bonfigli, R.; Ferroni, G.; Piazza, F. An Integrated System for Voice Command Recognition and Emergency Detection Based on Audio Signals. Expert Syst. Appl. 2015, 42, 5668–5683. [Google Scholar] [CrossRef]
  7. Rautaray, S.S.; Agrawal, A. Vision Based Hand Gesture Recognition for Human Computer Interaction: A Survey. Artif. Intell. Rev. 2015, 43, 1–54. [Google Scholar] [CrossRef]
  8. Ionescu, D.; Suse, V.; Gadea, C.; Solomon, B.; Ionescu, B.; Islam, S. A New Infrared 3D Camera for Gesture Control. In Proceedings of the 2013 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), Minneapolis, MN, USA, 6–9 May 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 629–634. [Google Scholar]
  9. Ivanov, A.V.; Zhilenkov, A.A. The Use of IMU MEMS-Sensors for Designing of Motion Capture System for Control of Robotic Objects. In Proceedings of the 2018 IEEE Conference of Russian Young Researchers in Electrical and Electronic Engineering (EIConRus), Moscow, Russia, 29 January–1 February 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 890–893. [Google Scholar]
  10. Kim, M.; Cho, J.; Lee, S.; Jung, Y. IMU Sensor-Based Hand Gesture Recognition for Human-Machine Interfaces. Sensors 2019, 19, 3827. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  11. Zhu, J.; Liu, X.; Shi, Q.; He, T.; Sun, Z.; Guo, X.; Liu, W.; Sulaiman, O.B.; Dong, B.; Lee, C. Development Trends and Perspectives of Future Sensors and MEMS/NEMS. Micromachines 2020, 11, 7. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  12. Zhu, H.; Wang, X.; Liang, J.; Lv, H.; Tong, H.; Ma, L.; Hu, Y.; Zhu, G.; Zhang, T.; Tie, Z.; et al. Versatile Electronic Skins for Motion Detection of Joints Enabled by Aligned Few-Walled Carbon Nanotubes in Flexible Polymer Composites. Adv. Funct. Mater. 2017, 27, 1606604. [Google Scholar] [CrossRef]
  13. Amjadi, M.; Pichitpajongkit, A.; Lee, S.; Ryu, S.; Park, I. Highly Stretchable and Sensitive Strain Sensor Based on Silver Nanowire–Elastomer Nanocomposite. ACS Nano 2014, 8, 5154–5163. [Google Scholar] [CrossRef]
  14. Dejace, L.; Laubeuf, N.; Furfaro, I.; Lacour, S.P. Gallium-Based Thin Films for Wearable Human Motion Sensors. Adv. Intell. Syst. 2019, 1, 1900079. [Google Scholar] [CrossRef] [Green Version]
  15. Gao, Y.; Ota, H.; Schaler, E.W.; Chen, K.; Zhao, A.; Gao, W.; Fahad, H.M.; Leng, Y.; Zheng, A.; Xiong, F.; et al. Wearable Microfluidic Diaphragm Pressure Sensor for Health and Tactile Touch Monitoring. Adv. Mater. 2017, 29, 1701985. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  16. Kang, T.-H.; Chang, H.; Choi, D.; Kim, S.; Moon, J.; Lim, J.A.; Lee, K.-Y.; Yi, H. Hydrogel-Templated Transfer-Printing of Conductive Nanonetworks for Wearable Sensors on Topographic Flexible Substrates. Nano Lett. 2019, 19, 3684–3691. [Google Scholar] [CrossRef] [PubMed]
  17. Cao, Y.; Li, T.; Gu, Y.; Luo, H.; Wang, S.; Zhang, T. Fingerprint-Inspired Flexible Tactile Sensor for Accurately Discerning Surface Texture. Small 2018, 14, 1703902. [Google Scholar] [CrossRef] [PubMed]
  18. Kenry, J.C.Y.; Yu, J.; Shang, M.; Loh, K.P.; Lim, C.T. Highly Flexible Graphene Oxide Nanosuspension Liquid-Based Microfluidic Tactile Sensor. Small 2016, 12, 1593–1604. [CrossRef]
  19. Wang, S.-S.; Liu, H.-B.; Kan, X.-N.; Wang, L.; Chen, Y.-H.; Su, B.; Li, Y.-L.; Jiang, L. Superlyophilicity-Facilitated Synthesis Reaction at the Microscale: Ordered Graphdiyne Stripe Arrays. Small 2017, 13, 1602265. [Google Scholar] [CrossRef]
  20. Wang, Y.; Wang, Y.; Yang, Y. Graphene-Polymer Nanocomposite-Based Redox-Induced Electricity for Flexible Self-Powered Strain Sensors. Adv. Energy Mater. 2018, 8, 1800961. [Google Scholar] [CrossRef]
  21. Tutika, R.; Kmiec, S.; Haque, A.B.M.T.; Martin, S.W.; Bartlett, M.D. Liquid Metal–Elastomer Soft Composites with Independently Controllable and Highly Tunable Droplet Size and Volume Loading. ACS Appl. Mater. Interfaces 2019, 11, 17873–17883. [Google Scholar] [CrossRef] [Green Version]
  22. Navaraj, W.; Dahiya, R. Fingerprint-Enhanced Capacitive-Piezoelectric Flexible Sensing Skin to Discriminate Static and Dynamic Tactile Stimuli. Adv. Intell. Syst. 2019, 1, 1900051. [Google Scholar] [CrossRef]
  23. Lee, J.; Kwon, H.; Seo, J.; Shin, S.; Koo, J.H.; Pang, C.; Son, S.; Kim, J.H.; Jang, Y.H.; Kim, D.E.; et al. Conductive Fiber-Based Ultrasensitive Textile Pressure Sensor for Wearable Electronics. Adv. Mater. 2015, 27, 2433–2439. [Google Scholar] [CrossRef]
  24. Wu, R.; Ma, L.; Hou, C.; Meng, Z.; Guo, W.; Yu, W.; Yu, R.; Hu, F.; Liu, X.Y. Silk Composite Electronic Textile Sensor for High Space Precision 2D Combo Temperature–Pressure Sensing. Small 2019, 15, 1901558. [Google Scholar] [CrossRef] [PubMed]
  25. Leber, A.; Cholst, B.; Sandt, J.; Vogel, N.; Kolle, M. Stretchable Thermoplastic Elastomer Optical Fibers for Sensing of Extreme Deformations. Adv. Funct. Mater. 2019, 29, 1802629. [Google Scholar] [CrossRef]
  26. Guo, J.; Liu, X.; Jiang, N.; Yetisen, A.K.; Yuk, H.; Yang, C.; Khademhosseini, A.; Zhao, X.; Yun, S.-H. Highly Stretchable, Strain Sensing Hydrogel Optical Fibers. Adv. Mater. 2016, 28, 10244–10249. [Google Scholar] [CrossRef]
  27. Bai, H.; Li, S.; Barreiros, J.; Tu, Y.; Pollock, C.R.; Shepherd, R.F. Stretchable Distributed Fiber-Optic Sensors. Science 2020, 370, 848–852. [Google Scholar] [CrossRef]
  28. Zhou, H.; Zhang, Y.; Qiu, Y.; Wu, H.; Qin, W.; Liao, Y.; Yu, Q.; Cheng, H. Stretchable Piezoelectric Energy Harvesters and Self-Powered Sensors for Wearable and Implantable Devices. Biosens. Bioelectron. 2020, 168, 112569. [Google Scholar] [CrossRef]
  29. Park, K.-I.; Jeong, C.K.; Kim, N.K.; Lee, K.J. Stretchable Piezoelectric Nanocomposite Generator. Nano Converg. 2016, 3, 12. [Google Scholar] [CrossRef] [Green Version]
  30. Ding, W.; Wang, A.C.; Wu, C.; Guo, H.; Wang, Z.L. Human-Machine Interfacing Enabled by Triboelectric Nanogenerators and Tribotronics. Adv. Mater. Technol. 2019, 4, 1800487. [Google Scholar] [CrossRef] [Green Version]
  31. Pu, X.; An, S.; Tang, Q.; Guo, H.; Hu, C. Wearable Triboelectric Sensors for Biomedical Monitoring and Human-Machine Interface. iScience 2021, 24, 102027. [Google Scholar] [CrossRef] [PubMed]
  32. Chen, S.; Pang, Y.; Cao, Y.; Tan, X.; Cao, C. Soft Robotic Manipulation System Capable of Stiffness Variation and Dexterous Operation for Safe Human–Machine Interactions. Adv. Mater. Technol. 2021, 6, 2100084. [Google Scholar] [CrossRef]
  33. Hou, C.; Geng, J.; Yang, Z.; Tang, T.; Sun, Y.; Wang, F.; Liu, H.; Chen, T.; Sun, L. A Delta-Parallel-Inspired Human Machine Interface by Using Self-Powered Triboelectric Nanogenerator Toward 3D and VR/AR Manipulations. Adv. Mater. Technol. 2021, 6, 2000912. [Google Scholar] [CrossRef]
  34. Luo, J.; Wang, Z.; Xu, L.; Wang, A.C.; Han, K.; Jiang, T.; Lai, Q.; Bai, Y.; Tang, W.; Fan, F.R.; et al. Flexible and Durable Wood-Based Triboelectric Nanogenerators for Self-Powered Sensing in Athletic Big Data Analytics. Nat. Commun. 2019, 10, 5147. [Google Scholar] [CrossRef] [Green Version]
  35. Huang, J.; Yang, X.; Yu, J.; Han, J.; Jia, C.; Ding, M.; Sun, J.; Cao, X.; Sun, Q.; Wang, Z.L. A Universal and Arbitrary Tactile Interactive System Based on Self-Powered Optical Communication. Nano Energy 2020, 69, 104419. [Google Scholar] [CrossRef]
  36. He, Q.; Wu, Y.; Feng, Z.; Sun, C.; Fan, W.; Zhou, Z.; Meng, K.; Fan, E.; Yang, J. Triboelectric Vibration Sensor for a Human-Machine Interface Built on Ubiquitous Surfaces. Nano Energy 2019, 59, 689–696. [Google Scholar] [CrossRef]
  37. Yang, Y.; Zhang, H.; Lin, Z.-H.; Zhou, Y.S.; Jing, Q.; Su, Y.; Yang, J.; Chen, J.; Hu, C.; Wang, Z.L. Human Skin Based Triboelectric Nanogenerators for Harvesting Biomechanical Energy and as Self-Powered Active Tactile Sensor System. ACS Nano 2013, 7, 9213–9222. [Google Scholar] [CrossRef] [Green Version]
  38. Chen, S.; Jiang, J.; Xu, F.; Gong, S. Crepe Cellulose Paper and Nitrocellulose Membrane-Based Triboelectric Nanogenerators for Energy Harvesting and Self-Powered Human-Machine Interaction. Nano Energy 2019, 61, 69–77. [Google Scholar] [CrossRef]
  39. Fan, X.; Chen, J.; Yang, J.; Bai, P.; Li, Z.; Wang, Z.L. Ultrathin, Rollable, Paper-Based Triboelectric Nanogenerator for Acoustic Energy Harvesting and Self-Powered Sound Recording. ACS Nano 2015, 9, 4236–4243. [Google Scholar] [CrossRef]
  40. Chen, T.; Zhao, M.; Shi, Q.; Yang, Z.; Liu, H.; Sun, L.; Ouyang, J.; Lee, C. Novel Augmented Reality Interface Using a Self-Powered Triboelectric Based Virtual Reality 3D-Control Sensor. Nano Energy 2018, 51, 162–172. [Google Scholar] [CrossRef]
  41. Tang, Y.; Zhou, H.; Sun, X.; Diao, N.; Wang, J.; Zhang, B.; Qin, C.; Liang, E.; Mao, Y. Triboelectric Touch-Free Screen Sensor for Noncontact Gesture Recognizing. Adv. Funct. Mater. 2020, 30, 1907893. [Google Scholar] [CrossRef]
  42. Chen, T.; Shi, Q.; Zhu, M.; He, T.; Yang, Z.; Liu, H.; Sun, L.; Yang, L.; Lee, C. Intuitive-Augmented Human-Machine Multidimensional Nano-Manipulation Terminal Using Triboelectric Stretchable Strip Sensors Based on Minimalist Design. Nano Energy 2019, 60, 440–448. [Google Scholar] [CrossRef]
  43. Pu, X.; Liu, M.; Chen, X.; Sun, J.; Du, C.; Zhang, Y.; Zhai, J.; Hu, W.; Wang, Z.L. Ultrastretchable, Transparent Triboelectric Nanogenerator as Electronic Skin for Biomechanical Energy Harvesting and Tactile Sensing. Sci. Adv. 2017, 3, e1700015. [Google Scholar] [CrossRef]
  44. Ha, M.; Lim, S.; Cho, S.; Lee, Y.; Na, S.; Baig, C.; Ko, H. Skin-Inspired Hierarchical Polymer Architectures with Gradient Stiffness for Spacer-Free, Ultrathin, and Highly Sensitive Triboelectric Sensors. ACS Nano 2018, 12, 3964–3974. [Google Scholar] [CrossRef] [PubMed]
  45. Yoon, H.; Lee, D.; Kim, Y.; Jeon, S.; Jung, J.; Kwak, S.S.; Kim, J.; Kim, S.; Kim, Y.; Kim, S. Mechanoreceptor-Inspired Dynamic Mechanical Stimuli Perception Based on Switchable Ionic Polarization. Adv. Funct. Mater. 2021, 31, 2100649. [Google Scholar] [CrossRef]
  46. Han, Y.; Yi, F.; Jiang, C.; Dai, K.; Xu, Y.; Wang, X.; You, Z. Self-Powered Gait Pattern-Based Identity Recognition by a Soft and Stretchable Triboelectric Band. Nano Energy 2019, 56, 516–523. [Google Scholar] [CrossRef]
  47. Hua, Q.; Sun, J.; Liu, H.; Bao, R.; Yu, R.; Zhai, J.; Pan, C.; Wang, Z.L. Skin-Inspired Highly Stretchable and Conformable Matrix Networks for Multifunctional Sensing. Nat. Commun. 2018, 9, 244. [Google Scholar] [CrossRef]
  48. Fan, F.-R.; Tian, Z.-Q.; Lin Wang, Z. Flexible Triboelectric Generator. Nano Energy 2012, 1, 328–334. [Google Scholar] [CrossRef]
  49. Bae, J.; Lee, J.; Kim, S.; Ha, J.; Lee, B.-S.; Park, Y.; Choong, C.; Kim, J.-B.; Wang, Z.L.; Kim, H.-Y.; et al. Flutter-Driven Triboelectrification for Harvesting Wind Energy. Nat. Commun. 2014, 5, 4929. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  50. Yang, Y.; Zhu, G.; Zhang, H.; Chen, J.; Zhong, X.; Lin, Z.-H.; Su, Y.; Bai, P.; Wen, X.; Wang, Z.L. Triboelectric Nanogenerator for Harvesting Wind Energy and as Self-Powered Wind Vector Sensor System. ACS Nano 2013, 7, 9461–9468. [Google Scholar] [CrossRef]
  51. Xie, Y.; Wang, S.; Lin, L.; Jing, Q.; Lin, Z.-H.; Niu, S.; Wu, Z.; Wang, Z.L. Rotary Triboelectric Nanogenerator Based on a Hybridized Mechanism for Harvesting Wind Energy. ACS Nano 2013, 7, 7119–7125. [Google Scholar] [CrossRef]
  52. Chen, B.; Yang, Y.; Wang, Z.L. Scavenging Wind Energy by Triboelectric Nanogenerators. Adv. Energy Mater. 2018, 8, 1702649. [Google Scholar] [CrossRef]
  53. Jiang, Q.; Chen, B.; Zhang, K.; Yang, Y. Ag Nanoparticle-Based Triboelectric Nanogenerator To Scavenge Wind Energy for a Self-Charging Power Unit. ACS Appl. Mater. Interfaces 2017, 9, 43716–43723. [Google Scholar] [CrossRef] [PubMed]
  54. Wang, Z.L.; Jiang, T.; Xu, L. Toward the Blue Energy Dream by Triboelectric Nanogenerator Networks. Nano Energy 2017, 39, 9–23. [Google Scholar] [CrossRef]
  55. Chen, J.; Yang, J.; Li, Z.; Fan, X.; Zi, Y.; Jing, Q.; Guo, H.; Wen, Z.; Pradel, K.C.; Niu, S.; et al. Networks of Triboelectric Nanogenerators for Harvesting Water Wave Energy: A Potential Approach toward Blue Energy. ACS Nano 2015, 9, 3324–3331. [Google Scholar] [CrossRef]
  56. Liu, L.; Shi, Q.; Lee, C. A Novel Hybridized Blue Energy Harvester Aiming at All-Weather IoT Applications. Nano Energy 2020, 76, 105052. [Google Scholar] [CrossRef]
  57. Liu, L.; Shi, Q.; Ho, J.S.; Lee, C. Study of Thin Film Blue Energy Harvester Based on Triboelectric Nanogenerator and Seashore IoT Applications. Nano Energy 2019, 66, 104167. [Google Scholar] [CrossRef]
  58. Hou, C.; Chen, T.; Li, Y.; Huang, M.; Shi, Q.; Liu, H.; Sun, L.; Lee, C. A Rotational Pendulum Based Electromagnetic/Triboelectric Hybrid-Generator for Ultra-Low-Frequency Vibrations Aiming at Human Motion and Blue Energy Applications. Nano Energy 2019, 63, 103871. [Google Scholar] [CrossRef]
  59. Chen, X.; Gao, L.; Chen, J.; Lu, S.; Zhou, H.; Wang, T.; Wang, A.; Zhang, Z.; Guo, S.; Mu, X.; et al. A Chaotic Pendulum Triboelectric-Electromagnetic Hybridized Nanogenerator for Wave Energy Scavenging and Self-Powered Wireless Sensing System. Nano Energy 2020, 69, 104440. [Google Scholar] [CrossRef]
  60. Liu, L.; Shi, Q.; Lee, C. A Hybridized Electromagnetic-Triboelectric Nanogenerator Designed for Scavenging Biomechanical Energy in Human Balance Control. Nano Res. 2021, 12, 1–9. [Google Scholar]
  61. Niu, S.; Wang, X.; Yi, F.; Zhou, Y.S.; Wang, Z.L. A Universal Self-Charging System Driven by Random Biomechanical Energy for Sustainable Operation of Mobile Electronics. Nat. Commun. 2015, 6, 8975. [Google Scholar] [CrossRef]
  62. Wang, J.; Li, S.; Yi, F.; Zi, Y.; Lin, J.; Wang, X.; Xu, Y.; Wang, Z.L. Sustainably Powering Wearable Electronics Solely by Biomechanical Energy. Nat. Commun. 2016, 7, 12744. [Google Scholar] [CrossRef] [Green Version]
  63. He, T.; Guo, X.; Lee, C. Flourishing Energy Harvesters for Future Body Sensor Network: From Single to Multiple Energy Sources. iScience 2021, 24, 101934. [Google Scholar] [CrossRef]
  64. Zhang, K.; Wang, X.; Yang, Y.; Wang, Z.L. Hybridized Electromagnetic–Triboelectric Nanogenerator for Scavenging Biomechanical Energy for Sustainably Powering Wearable Electronics. ACS Nano 2015, 9, 3521–3529. [Google Scholar] [CrossRef]
  65. Khalifa, S.; Lan, G.; Hassan, M.; Seneviratne, A.; Das, S.K. HARKE: Human Activity Recognition from Kinetic Energy Harvesting Data in Wearable Devices. IEEE Trans. Mob. Comput. 2018, 17, 1353–1368. [Google Scholar] [CrossRef]
  66. Zhang, Q.; Jiang, T.; Ho, D.; Qin, S.; Yang, X.; Cho, J.H.; Sun, Q.; Wang, Z.L. Transparent and Self-Powered Multistage Sensation Matrix for Mechanosensation Application. ACS Nano 2018, 12, 254–262. [Google Scholar] [CrossRef]
  67. Chen, Z.; Wang, Z.; Li, X.; Lin, Y.; Luo, N.; Long, M.; Zhao, N.; Xu, J.-B. Flexible Piezoelectric-Induced Pressure Sensors for Static Measurements Based on Nanowires/Graphene Heterostructures. ACS Nano 2017, 11, 4507–4513. [Google Scholar] [CrossRef] [PubMed]
  68. Chun, K.-Y.; Son, Y.J.; Jeon, E.-S.; Lee, S.; Han, C.-S. A Self-Powered Sensor Mimicking Slow- and Fast-Adapting Cutaneous Mechanoreceptors. Adv. Mater. 2018, 30, 1706299. [Google Scholar] [CrossRef]
  69. Ha, M.; Lim, S.; Park, J.; Um, D.-S.; Lee, Y.; Ko, H. Bioinspired Interlocked and Hierarchical Design of ZnO Nanowire Arrays for Static and Dynamic Pressure-Sensitive Electronic Skins. Adv. Funct. Mater. 2015, 25, 2841–2849. [Google Scholar] [CrossRef]
  70. Shi, Q.; He, T.; Lee, C. More than Energy Harvesting—Combining Triboelectric Nanogenerator and Flexible Electronics Technology for Enabling Novel Micro-/Nano-Systems. Nano Energy 2019, 57, 851–871. [Google Scholar] [CrossRef]
  71. Zhu, J.; Zhu, M.; Shi, Q.; Wen, F.; Liu, L.; Dong, B.; Haroun, A.; Yang, Y.; Vachon, P.; Guo, X.; et al. Progress in TENG Technology—A Journey from Energy Harvesting to Nanoenergy and Nanosystem. EcoMat 2020, 2, 1–45. [Google Scholar] [CrossRef]
  72. Yang, Y.; Zhou, Y.S.; Zhang, H.; Liu, Y.; Lee, S.; Wang, Z.L. A Single-Electrode Based Triboelectric Nanogenerator as Self-Powered Tracking System. Adv. Mater. 2013, 25, 6594–6601. [Google Scholar] [CrossRef]
  73. Ji, X.; Zhao, T.; Zhao, X.; Lu, X.; Li, T. Triboelectric Nanogenerator Based Smart Electronics via Machine Learning. Adv. Mater. Technol. 2020, 5, 1900921. [Google Scholar] [CrossRef]
  74. Yang, J.; Chen, J.; Liu, Y.; Yang, W.; Su, Y.; Wang, Z.L. Triboelectrification-Based Organic Film Nanogenerator for Acoustic Energy Harvesting and Self-Powered Active Acoustic Sensing. ACS Nano 2014, 8, 2649–2657. [Google Scholar] [CrossRef] [PubMed]
  75. Peng, X.; Dong, K.; Ye, C.; Jiang, Y.; Zhai, S.; Cheng, R.; Liu, D.; Gao, X.; Wang, J.; Wang, Z.L. A Breathable, Biodegradable, Antibacterial, and Self-Powered Electronic Skin Based on All-Nanofiber Triboelectric Nanogenerators. Sci. Adv. 2020, 6, eaba9624. [Google Scholar] [CrossRef]
  76. Chun, S.; Son, W.; Kim, H.; Lim, S.K.; Pang, C.; Choi, C. Self-Powered Pressure- and Vibration-Sensitive Tactile Sensors for Learning Technique-Based Neural Finger Skin. Nano Lett. 2019, 19, 3305–3312. [Google Scholar] [CrossRef] [PubMed]
  77. Xiang, S.; Liu, D.; Jiang, C.; Zhou, W.; Ling, D.; Zheng, W.; Sun, X.; Li, X.; Mao, Y.; Shan, C. Liquid-Metal-Based Dynamic Thermoregulating and Self-Powered Electronic Skin. Adv. Funct. Mater. 2021, 31, 2100940. [Google Scholar] [CrossRef]
  78. Zhu, M.; Shi, Q.; He, T.; Yi, Z.; Ma, Y.; Yang, B.; Chen, T.; Lee, C. Self-Powered and Self-Functional Cotton Sock Using Piezoelectric and Triboelectric Hybrid Mechanism for Healthcare and Sports Monitoring. ACS Nano 2019, 13, 1940–1952. [Google Scholar] [CrossRef] [PubMed]
  79. Zhang, P.; Zhang, Z.; Cai, J. A Foot Pressure Sensor Based on Triboelectric Nanogenerator for Human Motion Monitoring. Microsyst. Technol. 2021, 27, 3507–3512. [Google Scholar] [CrossRef]
  80. Zhang, B.; Tang, Y.; Dai, R.; Wang, H.; Sun, X.; Qin, C.; Pan, Z.; Liang, E.; Mao, Y. Breath-Based Human–Machine Interaction System Using Triboelectric Nanogenerator. Nano Energy 2019, 64, 103953. [Google Scholar] [CrossRef]
  81. Nweke, H.F.; Teh, Y.W.; Al-garadi, M.A.; Alo, U.R. Deep Learning Algorithms for Human Activity Recognition Using Mobile and Wearable Sensor Networks: State of the Art and Research Challenges. Expert Syst. Appl. 2018, 105, 233–261. [Google Scholar] [CrossRef]
  82. Zhou, Y.; Shen, M.; Cui, X.; Shao, Y.; Li, L.; Zhang, Y. Triboelectric Nanogenerator Based Self-Powered Sensor for Artificial Intelligence. Nano Energy 2021, 84, 105887. [Google Scholar] [CrossRef]
  83. Berman, S.; Stern, H. Sensors for Gesture Recognition Systems. IEEE Trans. Syst. Man, Cybern. Part C (Appl. Rev.) 2012, 42, 277–290. [Google Scholar] [CrossRef]
  84. Alhamada, A.I.; Khalifa, O.O.; Abdalla, A.H. Deep Learning for Environmentally Robust Speech Recognition. AIP Conf. Proc. 2020, 2306, 020025. [Google Scholar]
  85. Moeslund, T.B.; Hilton, A.; Krüger, V. A Survey of Advances in Vision-Based Human Motion Capture and Analysis. Comput. Vis. Image Underst. 2006, 104, 90–126. [Google Scholar] [CrossRef]
  86. Sundararajan, K.; Woodard, D.L. Deep Learning for Biometrics. ACM Comput. Surv. 2018, 51, 1–34. [Google Scholar] [CrossRef]
  87. Hughes, J.; Spielberg, A.; Chounlakone, M.; Chang, G.; Matusik, W.; Rus, D. A Simple, Inexpensive, Wearable Glove with Hybrid Resistive-Pressure Sensors for Computational Sensing, Proprioception, and Task Identification. Adv. Intell. Syst. 2020, 2, 2000002. [Google Scholar] [CrossRef]
  88. Li, G.; Liu, S.; Wang, L.; Zhu, R. Skin-Inspired Quadruple Tactile Sensors Integrated on a Robot Hand Enable Object Recognition. Sci. Robot. 2020, 5, eabc8134. [Google Scholar] [CrossRef]
  89. Zhang, H.; Cheng, Q.; Lu, X.; Wang, W.; Wang, Z.L.; Sun, C. Detection of Driving Actions on Steering Wheel Using Triboelectric Nanogenerator via Machine Learning. Nano Energy 2021, 79, 105455. [Google Scholar] [CrossRef]
  90. Zhang, W.; Deng, L.; Yang, L.; Yang, P.; Diao, D.; Wang, P.; Wang, Z.L. Multilanguage-Handwriting Self-Powered Recognition Based on Triboelectric Nanogenerator Enabled Machine Learning. Nano Energy 2020, 77, 105174. [Google Scholar] [CrossRef]
  91. Zhang, W.; Wang, P.; Sun, K.; Wang, C.; Diao, D. Intelligently Detecting and Identifying Liquids Leakage Combining Triboelectric Nanogenerator Based Self-Powered Sensor with Machine Learning. Nano Energy 2019, 56, 277–285. [Google Scholar] [CrossRef]
  92. Syu, M.H.; Guan, Y.J.; Lo, W.C.; Fuh, Y.K. Biomimetic and Porous Nanofiber-Based Hybrid Sensor for Multifunctional Pressure Sensing and Human Gesture Identification via Deep Learning Method. Nano Energy 2020, 76, 105029. [Google Scholar] [CrossRef]
  93. Ji, X.; Fang, P.; Xu, B.; Xie, K.; Yue, H.; Luo, X.; Wang, Z.; Zhao, X.; Shi, P. Biohybrid Triboelectric Nanogenerator for Label-Free Pharmacological Fingerprinting in Cardiomyocytes. Nano Lett. 2020, 20, 4043–4050. [Google Scholar] [CrossRef]
  94. Chen, H.; Miao, L.; Su, Z.; Song, Y.; Han, M.; Chen, X.; Cheng, X.; Chen, D.; Zhang, H. Fingertip-Inspired Electronic Skin Based on Triboelectric Sliding Sensing and Porous Piezoresistive Pressure Detection. Nano Energy 2017, 40, 65–72. [Google Scholar] [CrossRef]
  95. Song, K.; Zhao, R.; Wang, Z.L.; Yang, Y. Conjuncted Pyro-Piezoelectric Effect for Self-Powered Simultaneous Temperature and Pressure Sensing. Adv. Mater. 2019, 31, 1902831. [Google Scholar] [CrossRef]
  96. Wang, S.; Wang, Z.L.; Yang, Y. A One-Structure-Based Hybridized Nanogenerator for Scavenging Mechanical and Thermal Energies by Triboelectric-Piezoelectric-Pyroelectric Effects. Adv. Mater. 2016, 28, 2881–2887. [Google Scholar] [CrossRef] [PubMed]
  97. Yang, Y.; Guo, W.; Pradel, K.C.; Zhu, G.; Zhou, Y.; Zhang, Y.; Hu, Y.; Lin, L.; Wang, Z.L. Pyroelectric Nanogenerators for Harvesting Thermoelectric Energy. Nano Lett. 2012, 12, 2833–2838. [Google Scholar] [CrossRef] [PubMed]
  98. Yang, Y.; Jung, J.H.; Yun, B.K.; Zhang, F.; Pradel, K.C.; Guo, W.; Wang, Z.L. Flexible Pyroelectric Nanogenerators Using a Composite Structure of Lead-Free KNbO3 Nanowires. Adv. Mater. 2012, 24, 5357–5362. [Google Scholar] [CrossRef] [PubMed]
  99. Zhang, K.; Wang, Y.; Wang, Z.L.; Yang, Y. Standard and Figure-of-Merit for Quantifying the Performance of Pyroelectric Nanogenerators. Nano Energy 2019, 55, 534–540. [Google Scholar] [CrossRef]
  100. Demain, S.; Metcalf, C.D.; Merrett, G.V.; Zheng, D.; Cunningham, S. A Narrative Review on Haptic Devices: Relating the Physiology and Psychophysical Properties of the Hand to Devices for Rehabilitation in Central Nervous System Disorders. Disabil. Rehabil. Assist. Technol. 2013, 8, 181–189. [Google Scholar] [CrossRef] [Green Version]
  101. Shahid, T.; Gouwanda, D.; Nurzaman, S.G.; Gopalai, A.A. Moving toward Soft Robotics: A Decade Review of the Design of Hand Exoskeletons. Biomimetics 2018, 3, 17. [Google Scholar] [CrossRef] [Green Version]
  102. Zubrycki, I.; Granosik, G. Novel Haptic Device Using Jamming Principle for Providing Kinaesthetic Feedback in Glove-Based Control Interface. J. Intell. Robot. Syst. 2017, 85, 413–429. [Google Scholar] [CrossRef] [Green Version]
  103. Low, J.H.; Lee, W.W.; Khin, P.M.; Thakor, N.V.; Kukreja, S.L.; Ren, H.L.; Yeow, C.H. Hybrid Tele-Manipulation System Using a Sensorized 3-D-Printed Soft Robotic Gripper and a Soft Fabric-Based Haptic Glove. IEEE Robot. Autom. Lett. 2017, 2, 880–887. [Google Scholar] [CrossRef]
  104. Martinez, J.; Garcia, A.; Oliver, M.; Molina, J.P.; Gonzalez, P. Identifying Virtual 3D Geometric Shapes with a Vibrotactile Glove. IEEE Comput. Graph. Appl. 2016, 36, 42–51. [Google Scholar] [CrossRef]
  105. Baldi, T.L.; Scheggi, S.; Meli, L.; Mohammadi, M.; Prattichizzo, D. GESTO: A Glove for Enhanced Sensing and Touching Based on Inertial and Magnetic Sensors for Hand Tracking and Cutaneous Feedback. IEEE Trans. Human-Machine Syst. 2017, 47, 1066–1076. [Google Scholar] [CrossRef]
  106. Liu, H.; Zhang, Z.; Xie, X.; Zhu, Y.; Liu, Y.; Wang, Y.; Zhu, S.-C. High-Fidelity Grasping in Virtual Reality Using a Glove-Based System. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 5180–5186. [Google Scholar]
  107. Hinchet, R.; Vechev, V.; Shea, H.; Hilliges, O. DextrES: Wearable Haptic Feedback for Grasping in VR via a Thin Form-Factor Electrostatic Brake. In Proceedings of the 31st Annual ACM Symposium on User Interface Software and Technology, Berlin, Germany, 14–18 October 2018; ACM: New York, NY, USA, 2018; pp. 901–912. [Google Scholar]
  108. Yem, V.; Kajimoto, H. A Fingertip Glove with Motor Rotational Acceleration Enables Stiffness Perception When Grasping a Virtual Object. In Proceedings of the 20th International Conference on Human Interface and the Management of Information, Las Vegas, NV, USA, 15–20 July 2018; Springer: Cham, Switzerland, 2018; pp. 463–473. [Google Scholar]
  109. Pickering, K.L.; Efendy, M.G.A.; Le, T.M. A Review of Recent Developments in Natural Fibre Composites and Their Mechanical Performance. Compos. Part A Appl. Sci. Manuf. 2016, 83, 98–112. [Google Scholar] [CrossRef] [Green Version]
  110. Voiculescu, I.; Nordin, A.N. Acoustic Wave Based MEMS Devices for Biosensing Applications. Biosens. Bioelectron. 2012, 33, 1–9. [Google Scholar] [CrossRef]
  111. Cazala, F.; Vienney, N.; Stoléru, S. The Cortical Sensory Representation of Genitalia in Women and Men: A Systematic Review. Socioaffective Neurosci. Psychol. 2015, 5, 26428. [Google Scholar] [CrossRef] [Green Version]
  112. Schott, G.D. Penfield’s Homunculus: A Note on Cerebral Cartography. J. Neurol. Neurosurg. Psychiatry 1993, 56, 329–333. [Google Scholar] [CrossRef] [Green Version]
  113. Kim, J.-H.; Thang, N.D.; Kim, T.-S. 3-D Hand Motion Tracking and Gesture Recognition Using a Data Glove. In Proceedings of the 2009 IEEE International Symposium on Industrial Electronics, Seoul, Korea, 5–8 July 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 1013–1018. [Google Scholar]
  114. Lei, L.; Dashun, Q. Design of Data-Glove and Chinese Sign Language Recognition System Based on ARM9. In Proceedings of the 2015 12th IEEE International Conference on Electronic Measurement & Instruments (ICEMI), Qingdao, China, 16–18 July 2015; IEEE: Piscataway, NJ, USA, 2015; Volume 3, pp. 1130–1134. [Google Scholar]
  115. Suzuki, K.; Yataka, K.; Okumiya, Y.; Sakakibara, S.; Sako, K.; Mimura, H.; Inoue, Y. Rapid-Response, Widely Stretchable Sensor of Aligned MWCNT/Elastomer Composites for Human Motion Detection. ACS Sensors 2016, 1, 817–825. [Google Scholar] [CrossRef]
  116. Wei, P.; Yang, X.; Cao, Z.; Guo, X.; Jiang, H.; Chen, Y.; Morikado, M.; Qiu, X.; Yu, D. Flexible and Stretchable Electronic Skin with High Durability and Shock Resistance via Embedded 3D Printing Technology for Human Activity Monitoring and Personal Healthcare. Adv. Mater. Technol. 2019, 4, 1900315. [Google Scholar] [CrossRef]
  117. Kim, S.; Oh, J.; Jeong, D.; Bae, J. Direct Wiring of Eutectic Gallium–Indium to a Metal Electrode for Soft Sensor Systems. ACS Appl. Mater. Interfaces 2019, 11, 20557–20565. [Google Scholar] [CrossRef]
  118. Shi, Q.; Dong, B.; He, T.; Sun, Z.; Zhu, J.; Zhang, Z.; Lee, C. Progress in Wearable Electronics/Photonics—Moving toward the Era of Artificial Intelligence and Internet of Things. InfoMat 2020, 2, 1131–1162. [Google Scholar] [CrossRef]
  119. Lu, C.; Chen, J.; Jiang, T.; Gu, G.; Tang, W.; Wang, Z.L. A Stretchable, Flexible Triboelectric Nanogenerator for Self-Powered Real-Time Motion Monitoring. Adv. Mater. Technol. 2018, 3, 1800021. [Google Scholar] [CrossRef]
  120. Dong, K.; Deng, J.; Ding, W.; Wang, A.C.; Wang, P.; Cheng, C.; Wang, Y.-C.; Jin, L.; Gu, B.; Sun, B.; et al. Versatile Core-Sheath Yarn for Sustainable Biomechanical Energy Harvesting and Real-Time Human-Interactive Sensing. Adv. Energy Mater. 2018, 8, 1801114. [Google Scholar] [CrossRef]
  121. Shi, Q.; Wang, H.; Wang, T.; Lee, C. Self-Powered Liquid Triboelectric Microfluidic Sensor for Pressure Sensing and Finger Motion Monitoring Applications. Nano Energy 2016, 30, 450–459. [Google Scholar] [CrossRef]
  122. Qin, K.; Chen, C.; Pu, X.; Tang, Q.; He, W.; Liu, Y.; Zeng, Q.; Liu, G.; Guo, H.; Hu, C. Magnetic Array Assisted Triboelectric Nanogenerator Sensor for Real-Time Gesture Interaction. Nano Micro Lett. 2021, 13, 51. [Google Scholar] [CrossRef]
  123. Xie, L.; Chen, X.; Wen, Z.; Yang, Y.; Shi, J.; Chen, C.; Peng, M.; Liu, Y.; Sun, X. Spiral Steel Wire Based Fiber-Shaped Stretchable and Tailorable Triboelectric Nanogenerator for Wearable Power Source and Active Gesture Sensor. Nano Micro Lett. 2019, 11, 39. [Google Scholar] [CrossRef] [Green Version]
  124. Zhang, M.; Gao, T.; Wang, J.; Liao, J.; Qiu, Y.; Xue, H.; Shi, Z.; Xiong, Z.; Chen, L. Single BaTiO3 Nanowires-Polymer Fiber Based Nanogenerator. Nano Energy 2015, 11, 510–517. [Google Scholar] [CrossRef]
  125. He, T.; Sun, Z.; Shi, Q.; Zhu, M.; Anaya, D.V.; Xu, M.; Chen, T.; Yuce, M.R.; Thean, A.V.-Y.; Lee, C. Self-Powered Glove-Based Intuitive Interface for Diversified Control Applications in Real/Cyber Space. Nano Energy 2019, 58, 641–651. [Google Scholar] [CrossRef]
  126. Nguyen, V.; Yang, R. Effect of Humidity and Pressure on the Triboelectric Nanogenerator. Nano Energy 2013, 2, 604–608. [Google Scholar] [CrossRef]
  127. Nguyen, V.; Zhu, R.; Yang, R. Environmental Effects on Nanogenerators. Nano Energy 2015, 14, 49–61. [Google Scholar] [CrossRef] [Green Version]
  128. Wen, F.; Sun, Z.; He, T.; Shi, Q.; Zhu, M.; Zhang, Z.; Li, L.; Zhang, T.; Lee, C. Machine Learning Glove Using Self-Powered Conductive Superhydrophobic Triboelectric Textile for Gesture Recognition in VR/AR Applications. Adv. Sci. 2020, 7, 2000261. [Google Scholar] [CrossRef]
  129. He, T.; Shi, Q.; Wang, H.; Wen, F.; Chen, T.; Ouyang, J.; Lee, C. Beyond Energy Harvesting—Multi-Functional Triboelectric Nanosensors on a Textile. Nano Energy 2019, 57, 338–352. [Google Scholar] [CrossRef]
  130. Dhakar, L.; Pitchappa, P.; Tay, F.E.H.; Lee, C. An Intelligent Skin Based Self-Powered Finger Motion Sensor Integrated with Triboelectric Nanogenerator. Nano Energy 2016, 19, 532–540. [Google Scholar] [CrossRef]
  131. Zhou, Z.; Chen, K.; Li, X.; Zhang, S.; Wu, Y.; Zhou, Y.; Meng, K.; Sun, C.; He, Q.; Fan, W.; et al. Sign-to-Speech Translation Using Machine-Learning-Assisted Stretchable Sensor Arrays. Nat. Electron. 2020, 3, 571–578. [Google Scholar] [CrossRef]
  132. Zhu, M.; Sun, Z.; Zhang, Z.; Shi, Q.; He, T.; Liu, H.; Chen, T.; Lee, C. Haptic-Feedback Smart Glove as a Creative Human-Machine Interface (HMI) for Virtual/Augmented Reality Applications. Sci. Adv. 2020, 6, eaaz8693. [Google Scholar] [CrossRef] [PubMed]
  133. Pu, X.; Guo, H.; Tang, Q.; Chen, J.; Feng, L.; Liu, G.; Wang, X.; Xi, Y.; Hu, C.; Wang, Z.L. Rotation Sensing and Gesture Control of a Robot Joint via Triboelectric Quantization Sensor. Nano Energy 2018, 54, 453–460. [Google Scholar] [CrossRef]
  134. Wang, Y.; Wu, H.; Xu, L.; Zhang, H.; Yang, Y.; Wang, Z.L. Hierarchically Patterned Self-Powered Sensors for Multifunctional Tactile Sensing. Sci. Adv. 2020, 6, eabb9083. [Google Scholar] [CrossRef]
  135. Dong, B.; Yang, Y.; Shi, Q.; Xu, S.; Sun, Z.; Zhu, S.; Zhang, Z.; Kwong, D.-L.; Zhou, G.; Ang, K.-W.; et al. Wearable Triboelectric–Human–Machine Interface (THMI) Using Robust Nanophotonic Readout. ACS Nano 2020, 14, 8915–8930. [Google Scholar] [CrossRef] [PubMed]
  136. He, Q.; Wu, Y.; Feng, Z.; Fan, W.; Lin, Z.; Sun, C.; Zhou, Z.; Meng, K.; Wu, W.; Yang, J. An All-Textile Triboelectric Sensor for Wearable Teleoperated Human–Machine Interaction. J. Mater. Chem. A 2019, 7, 26804–26811. [Google Scholar] [CrossRef]
  137. Liao, J.; Zou, Y.; Jiang, D.; Liu, Z.; Qu, X.; Li, Z.; Liu, R.; Fan, Y.; Shi, B.; Li, Z.; et al. Nestable Arched Triboelectric Nanogenerator for Large Deflection Biomechanical Sensing and Energy Harvesting. Nano Energy 2020, 69, 104417. [Google Scholar] [CrossRef]
  138. Lai, Y.; Lu, H.; Wu, H.; Zhang, D.; Yang, J.; Ma, J.; Shamsi, M.; Vallem, V.; Dickey, M.D. Elastic Multifunctional Liquid–Metal Fibers for Harvesting Mechanical and Electromagnetic Energy and as Self-Powered Sensors. Adv. Energy Mater. 2021, 11, 2100411. [Google Scholar] [CrossRef]
  139. Maharjan, P.; Bhatta, T.; Salauddin, M.; Rasel, M.S.; Rahman, M.T.; Rana, S.M.S.; Park, J.Y. A Human Skin-Inspired Self-Powered Flex Sensor with Thermally Embossed Microstructured Triboelectric Layers for Sign Language Interpretation. Nano Energy 2020, 76, 105071. [Google Scholar] [CrossRef]
  140. Yi, F.; Wang, X.; Niu, S.; Li, S.; Yin, Y.; Dai, K.; Zhang, G.; Lin, L.; Wen, Z.; Guo, H.; et al. A Highly Shape-Adaptive, Stretchable Design Based on Conductive Liquid for Energy Harvesting and Self-Powered Biomechanical Monitoring. Sci. Adv. 2016, 2, e1501624. [Google Scholar] [CrossRef] [Green Version]
  141. Wang, X.; Dong, L.; Zhang, H.; Yu, R.; Pan, C.; Wang, Z.L. Recent Progress in Electronic Skin. Adv. Sci. 2015, 2, 1500169. [Google Scholar] [CrossRef]
  142. Yang, J.C.; Mun, J.; Kwon, S.Y.; Park, S.; Bao, Z.; Park, S. Electronic Skin: Recent Progress and Future Prospects for Skin-Attachable Devices for Health Monitoring, Robotics, and Prosthetics. Adv. Mater. 2019, 31, 1904765. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  143. Lee, Y.; Park, J.; Choe, A.; Cho, S.; Kim, J.; Ko, H. Mimicking Human and Biological Skins for Multifunctional Skin Electronics. Adv. Funct. Mater. 2020, 30, 1904523. [Google Scholar] [CrossRef]
  144. Tao, J.; Bao, R.; Wang, X.; Peng, Y.; Li, J.; Fu, S.; Pan, C.; Wang, Z.L. Self-Powered Tactile Sensor Array Systems Based on the Triboelectric Effect. Adv. Funct. Mater. 2019, 29, 1806379. [Google Scholar] [CrossRef]
  145. Chen, H.; Song, Y.; Cheng, X.; Zhang, H. Self-Powered Electronic Skin Based on the Triboelectric Generator. Nano Energy 2019, 56, 252–268. [Google Scholar] [CrossRef]
  146. Chen, H.; Song, Y.; Guo, H.; Miao, L.; Chen, X.; Su, Z.; Zhang, H. Hybrid Porous Micro Structured Finger Skin Inspired Self-Powered Electronic Skin System for Pressure Sensing and Sliding Detection. Nano Energy 2018, 51, 496–503. [Google Scholar] [CrossRef]
  147. Zou, H.; Zhang, Y.; Guo, L.; Wang, P.; He, X.; Dai, G.; Zheng, H.; Chen, C.; Wang, A.C.; Xu, C.; et al. Quantifying the Triboelectric Series. Nat. Commun. 2019, 10, 1427. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  148. Wang, H.; Wu, H.; Hasan, D.; He, T.; Shi, Q.; Lee, C. Self-Powered Dual-Mode Amenity Sensor Based on the Water–Air Triboelectric Nanogenerator. ACS Nano 2017, 11, 10337–10346. [Google Scholar] [CrossRef]
  149. Jin, L.; Tao, J.; Bao, R.; Sun, L.; Pan, C. Self-Powered Real-Time Movement Monitoring Sensor Using Triboelectric Nanogenerator Technology. Sci. Rep. 2017, 7, 10521. [Google Scholar] [CrossRef] [Green Version]
  150. Liu, L.; Guo, X.; Lee, C. Promoting Smart Cities into the 5G Era with Multi-Field Internet of Things (IoT) Applications Powered with Advanced Mechanical Energy Harvesters. Nano Energy 2021, 88, 106304. [Google Scholar] [CrossRef]
  151. Zhu, M.; Yi, Z.; Yang, B.; Lee, C. Making Use of Nanoenergy from Human—Nanogenerator and Self-Powered Sensor Enabled Sustainable Wireless IoT Sensory Systems. Nano Today 2021, 36, 101016. [Google Scholar] [CrossRef]
  152. Zhang, X.-S.; Han, M.; Kim, B.; Bao, J.-F.; Brugger, J.; Zhang, H. All-in-One Self-Powered Flexible Microsystems Based on Triboelectric Nanogenerators. Nano Energy 2018, 47, 410–426. [Google Scholar] [CrossRef]
  153. Liu, Z.; Li, H.; Shi, B.; Fan, Y.; Wang, Z.L.; Li, Z. Wearable and Implantable Triboelectric Nanogenerators. Adv. Funct. Mater. 2019, 29, 1808820. [Google Scholar] [CrossRef]
  154. Li, Z.; Zheng, Q.; Wang, Z.L.; Li, Z. Nanogenerator-Based Self-Powered Sensors for Wearable and Implantable Electronics. Research 2020, 2020, 1–25. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  155. Ameri, S.K.; Kim, M.; Kuang, I.A.; Perera, W.K.; Alshiekh, M.; Jeong, H.; Topcu, U.; Akinwande, D.; Lu, N. Imperceptible Electrooculography Graphene Sensor System for Human–Robot Interface. npj 2D Mater. Appl. 2018, 2, 19. [Google Scholar] [CrossRef]
156. Wei, L.; Hu, H.; Zhang, Y. Fusing EMG and Visual Data for Hands-Free Control of an Intelligent Wheelchair. Int. J. Hum. Robot. 2011, 8, 707–724. [Google Scholar] [CrossRef]
  157. Choudhari, A.M.; Porwal, P.; Jonnalagedda, V.; Mériaudeau, F. An Electrooculography Based Human Machine Interface for Wheelchair Control. Biocybern. Biomed. Eng. 2019, 39, 673–685. [Google Scholar] [CrossRef]
  158. Pu, X.; Guo, H.; Chen, J.; Wang, X.; Xi, Y.; Hu, C.; Wang, Z.L. Eye Motion Triggered Self-Powered Mechnosensational Communication System Using Triboelectric Nanogenerator. Sci. Adv. 2017, 3, e1700694. [Google Scholar] [CrossRef] [Green Version]
  159. Vera Anaya, D.; He, T.; Lee, C.; Yuce, M.R. Self-Powered Eye Motion Sensor Based on Triboelectric Interaction and near-Field Electrostatic Induction for Wearable Assistive Technologies. Nano Energy 2020, 72, 104675. [Google Scholar] [CrossRef]
160. Fall, C.L.; Gagnon-Turcotte, G.; Dube, J.-F.; Gagne, J.S.; Delisle, Y.; Campeau-Lecours, A.; Gosselin, C.; Gosselin, B. Wireless sEMG-Based Body–Machine Interface for Assistive Technology Devices. IEEE J. Biomed. Health Inform. 2017, 21, 967–977. [Google Scholar] [CrossRef]
  161. Zhou, H.; Li, D.; He, X.; Hui, X.; Guo, H.; Hu, C.; Mu, X.; Wang, Z.L. Bionic Ultra-Sensitive Self-Powered Electromechanical Sensor for Muscle-Triggered Communication Application. Adv. Sci. 2021, 8, 2101020. [Google Scholar] [CrossRef]
  162. Li, W.; Torres, D.; Díaz, R.; Wang, Z.; Wu, C.; Wang, C.; Lin Wang, Z.; Sepúlveda, N. Nanogenerator-Based Dual-Functional and Self-Powered Thin Patch Loudspeaker or Microphone for Flexible Electronics. Nat. Commun. 2017, 8, 15310. [Google Scholar] [CrossRef]
  163. Han, J.H.; Bae, K.M.; Hong, S.K.; Park, H.; Kwak, J.-H.; Wang, H.S.; Joe, D.J.; Park, J.H.; Jung, Y.H.; Hur, S.; et al. Machine Learning-Based Self-Powered Acoustic Sensor for Speaker Recognition. Nano Energy 2018, 53, 658–665. [Google Scholar] [CrossRef]
164. Liu, H.; Dong, W.; Li, Y.; Li, F.; Geng, J.; Zhu, M.; Chen, T.; Zhang, H.; Sun, L.; Lee, C. An Epidermal sEMG Tattoo-like Patch as a New Human–Machine Interface for Patients with Loss of Voice. Microsyst. Nanoeng. 2020, 6, 16. [Google Scholar] [CrossRef] [Green Version]
  165. Chen, C.; Wen, Z.; Shi, J.; Jian, X.; Li, P.; Yeow, J.T.W.; Sun, X. Micro Triboelectric Ultrasonic Device for Acoustic Energy Transfer and Signal Communication. Nat. Commun. 2020, 11, 4143. [Google Scholar] [CrossRef]
  166. Arora, N.; Zhang, S.L.; Shahmiri, F.; Osorio, D.; Wang, Y.-C.; Gupta, M.; Wang, Z.; Starner, T.; Wang, Z.L.; Abowd, G.D. SATURN: A Thin and Flexible Self-Powered Microphone Leveraging Triboelectric Nanogenerator. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2018, 2, 1–28. [Google Scholar] [CrossRef]
  167. Yuan, M.; Li, C.; Liu, H.; Xu, Q.; Xie, Y. A 3D-Printed Acoustic Triboelectric Nanogenerator for Quarter-Wavelength Acoustic Energy Harvesting and Self-Powered Edge Sensing. Nano Energy 2021, 85, 105962. [Google Scholar] [CrossRef]
  168. Chen, F.; Wu, Y.; Ding, Z.; Xia, X.; Li, S.; Zheng, H.; Diao, C.; Yue, G.; Zi, Y. A Novel Triboelectric Nanogenerator Based on Electrospun Polyvinylidene Fluoride Nanofibers for Effective Acoustic Energy Harvesting and Self-Powered Multifunctional Sensing. Nano Energy 2019, 56, 241–251. [Google Scholar] [CrossRef]
  169. Kang, S.; Cho, S.; Shanker, R.; Lee, H.; Park, J.; Um, D.-S.; Lee, Y.; Ko, H. Transparent and Conductive Nanomembranes with Orthogonal Silver Nanowire Arrays for Skin-Attachable Loudspeakers and Microphones. Sci. Adv. 2018, 4, eaas8772. [Google Scholar] [CrossRef] [Green Version]
  170. Chen, T.; Shi, Q.; Zhu, M.; He, T.; Sun, L.; Yang, L.; Lee, C. Triboelectric Self-Powered Wearable Flexible Patch as 3D Motion Control Interface for Robotic Manipulator. ACS Nano 2018, 12, 11561–11571. [Google Scholar] [CrossRef]
  171. Shi, Q.; Lee, C. Self-Powered Bio-Inspired Spider-Net-Coding Interface Using Single-Electrode Triboelectric Nanogenerator. Adv. Sci. 2019, 6, 1900617. [Google Scholar] [CrossRef] [Green Version]
  172. Zhu, M.; Sun, Z.; Chen, T.; Lee, C. Low Cost Exoskeleton Manipulator Using Bidirectional Triboelectric Sensors Enhanced Multiple Degree of Freedom Sensory System. Nat. Commun. 2021, 12, 2692. [Google Scholar] [CrossRef]
  173. Li, C.; Liu, D.; Xu, C.; Wang, Z.; Shu, S.; Sun, Z.; Tang, W.; Wang, Z.L. Sensing of Joint and Spinal Bending or Stretching via a Retractable and Wearable Badge Reel. Nat. Commun. 2021, 12, 2950. [Google Scholar] [CrossRef]
  174. Park, J.; Lee, Y.; Hong, J.; Lee, Y.; Ha, M.; Jung, Y.; Lim, H.; Kim, S.Y.; Ko, H. Tactile-Direction-Sensitive and Stretchable Electronic Skins Based on Human-Skin-Inspired Interlocked Microstructures. ACS Nano 2014, 8, 12020–12029. [Google Scholar] [CrossRef]
  175. Harada, S.; Kanao, K.; Yamamoto, Y.; Arie, T.; Akita, S.; Takei, K. Fully Printed Flexible Fingerprint-like Three-Axis Tactile and Slip Force and Temperature Sensors for Artificial Skin. ACS Nano 2014, 8, 12851–12857. [Google Scholar] [CrossRef]
  176. Sarwar, M.S.; Dobashi, Y.; Preston, C.; Wyss, J.K.M.; Mirabbasi, S.; Madden, J.D.W. Bend, Stretch, and Touch: Locating a Finger on an Actively Deformed Transparent Sensor Array. Sci. Adv. 2017, 3, e1602200. [Google Scholar] [CrossRef] [Green Version]
  177. Lou, Z.; Li, L.; Wang, L.; Shen, G. Recent Progress of Self-Powered Sensing Systems for Wearable Electronics. Small 2017, 13, 1701791. [Google Scholar] [CrossRef]
  178. Wen, Z.; Yang, Y.; Sun, N.; Li, G.; Liu, Y.; Chen, C.; Shi, J.; Xie, L.; Jiang, H.; Bao, D.; et al. A Wrinkled PEDOT:PSS Film Based Stretchable and Transparent Triboelectric Nanogenerator for Wearable Energy Harvesters and Active Motion Sensors. Adv. Funct. Mater. 2018, 28, 1803684. [Google Scholar] [CrossRef]
  179. Bai, Z.; Xu, Y.; Lee, C.; Guo, J. Autonomously Adhesive, Stretchable, and Transparent Solid-State Polyionic Triboelectric Patch for Wearable Power Source and Tactile Sensor. Adv. Funct. Mater. 2021, 31, 2104365. [Google Scholar] [CrossRef]
  180. Yuan, F.; Liu, S.; Zhou, J.; Wang, S.; Wang, Y.; Xuan, S.; Gong, X. Smart Touchless Triboelectric Nanogenerator towards Safeguard and 3D Morphological Awareness. Nano Energy 2021, 86, 106071. [Google Scholar] [CrossRef]
  181. Song, Y.; Wang, N.; Hu, C.; Wang, Z.L.; Yang, Y. Soft Triboelectric Nanogenerators for Mechanical Energy Scavenging and Self-Powered Sensors. Nano Energy 2021, 84, 105919. [Google Scholar] [CrossRef]
  182. Park, S.; Park, J.; Kim, Y.; Bae, S.; Kim, T.-W.; Park, K.-I.; Hong, B.H.; Jeong, C.K.; Lee, S.-K. Laser-Directed Synthesis of Strain-Induced Crumpled MoS2 Structure for Enhanced Triboelectrification toward Haptic Sensors. Nano Energy 2020, 78, 105266. [Google Scholar] [CrossRef]
  183. He, J.; Xie, Z.; Yao, K.; Li, D.; Liu, Y.; Gao, Z.; Lu, W.; Chang, L.; Yu, X. Trampoline Inspired Stretchable Triboelectric Nanogenerators as Tactile Sensors for Epidermal Electronics. Nano Energy 2021, 81, 105590. [Google Scholar] [CrossRef]
  184. Wang, X.; Zhang, Y.; Zhang, X.; Huo, Z.; Li, X.; Que, M.; Peng, Z.; Wang, H.; Pan, C. A Highly Stretchable Transparent Self-Powered Triboelectric Tactile Sensor with Metallized Nanofibers for Wearable Electronics. Adv. Mater. 2018, 30, 1706738. [Google Scholar] [CrossRef] [PubMed]
  185. An, T.; Anaya, D.V.; Gong, S.; Yap, L.W.; Lin, F.; Wang, R.; Yuce, M.R.; Cheng, W. Self-Powered Gold Nanowire Tattoo Triboelectric Sensors for Soft Wearable Human-Machine Interface. Nano Energy 2020, 77, 105295. [Google Scholar] [CrossRef]
  186. Lee, Y.; Kim, J.; Jang, B.; Kim, S.; Sharma, B.K.; Kim, J.-H.; Ahn, J.-H. Graphene-Based Stretchable/Wearable Self-Powered Touch Sensor. Nano Energy 2019, 62, 259–267. [Google Scholar] [CrossRef]
  187. Chen, X.; Wu, Y.; Shao, J.; Jiang, T.; Yu, A.; Xu, L.; Wang, Z.L. On-Skin Triboelectric Nanogenerator and Self-Powered Sensor with Ultrathin Thickness and High Stretchability. Small 2017, 13, 1702929. [Google Scholar] [CrossRef] [PubMed]
  188. Qiu, C.; Wu, F.; Shi, Q.; Lee, C.; Yuce, M.R. Sensors and Control Interface Methods Based on Triboelectric Nanogenerator in IoT Applications. IEEE Access 2019, 7, 92745–92757. [Google Scholar] [CrossRef]
  189. Shi, Q.; Zhang, Z.; Chen, T.; Lee, C. Minimalist and Multi-Functional Human Machine Interface (HMI) Using a Flexible Wearable Triboelectric Patch. Nano Energy 2019, 62, 355–366. [Google Scholar] [CrossRef]
  190. Shi, Q.; Qiu, C.; He, T.; Wu, F.; Zhu, M.; Dziuban, J.A.; Walczak, R.; Yuce, M.R.; Lee, C. Triboelectric Single-Electrode-Output Control Interface Using Patterned Grid Electrode. Nano Energy 2019, 60, 545–556. [Google Scholar] [CrossRef]
  191. Tang, G.; Shi, Q.; Zhang, Z.; He, T.; Sun, Z.; Lee, C. Hybridized Wearable Patch as a Multi-Parameter and Multi-Functional Human-Machine Interface. Nano Energy 2021, 81, 105582. [Google Scholar] [CrossRef]
  192. Cao, R.; Pu, X.; Du, X.; Yang, W.; Wang, J.; Guo, H.; Zhao, S.; Yuan, Z.; Zhang, C.; Li, C.; et al. Screen-Printed Washable Electronic Textiles as Self-Powered Touch/Gesture Tribo-Sensors for Intelligent Human–Machine Interaction. ACS Nano 2018, 12, 5190–5196. [Google Scholar] [CrossRef] [PubMed]
  193. Lin, X.; Mao, Y.; Li, P.; Bai, Y.; Chen, T.; Wu, K.; Chen, D.; Yang, H.; Yang, L. Ultra-Conformable Ionic Skin with Multi-Modal Sensing, Broad-Spectrum Antimicrobial and Regenerative Capabilities for Smart and Expedited Wound Care. Adv. Sci. 2021, 8, 2004627. [Google Scholar] [CrossRef]
194. Bouteraa, Y.; Ben Abdallah, I. A Gesture-Based Telemanipulation Control for a Robotic Arm with Biofeedback-Based Grasp. Ind. Robot Int. J. 2017, 44, 575–587. [Google Scholar] [CrossRef]
  195. Fall, C.L.; Turgeon, P.; Campeau-Lecours, A.; Maheu, V.; Boukadoum, M.; Roy, S.; Massicotte, D.; Gosselin, C.; Gosselin, B. Intuitive Wireless Control of a Robotic Arm for People Living with an Upper Body Disability. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 4399–4402. [Google Scholar]
196. Miehlbradt, J.; Cherpillod, A.; Mintchev, S.; Coscia, M.; Artoni, F.; Floreano, D.; Micera, S. Correction for Miehlbradt et al., Data-Driven Body–Machine Interface for the Accurate Control of Drones. Proc. Natl. Acad. Sci. USA 2019, 116, 19209. [Google Scholar]
  197. Gao, S.; He, T.; Zhang, Z.; Ao, H.; Jiang, H.; Lee, C. A Motion Capturing and Energy Harvesting Hybridized Lower-Limb System for Rehabilitation and Sports Applications. Adv. Sci. 2021. [Google Scholar] [CrossRef]
  198. Yang, G.-Z.; Bellingham, J.; Dupont, P.E.; Fischer, P.; Floridi, L.; Full, R.; Jacobstein, N.; Kumar, V.; McNutt, M.; Merrifield, R.; et al. The Grand Challenges of Science Robotics. Sci. Robot. 2018, 3, eaar7650. [Google Scholar] [CrossRef]
  199. Gu, W.; Cao, J.; Dai, S.; Hu, H.; Zhong, Y.; Cheng, G.; Zhang, Z.; Ding, J. Self-Powered Slide Tactile Sensor with Wheel-Belt Structures Based on Triboelectric Effect and Electrostatic Induction. Sensors Actuators A Phys. 2021, 331, 113022. [Google Scholar] [CrossRef]
  200. Rong, X.; Zhao, J.; Guo, H.; Zhen, G.; Yu, J.; Zhang, C.; Dong, G. Material Recognition Sensor Array by Electrostatic Induction and Triboelectric Effects. Adv. Mater. Technol. 2020, 5, 2000641. [Google Scholar] [CrossRef]
  201. Liu, H.; Ji, Z.; Xu, H.; Sun, M.; Chen, T.; Sun, L.; Chen, G.; Wang, Z. Large-Scale and Flexible Self-Powered Triboelectric Tactile Sensing Array for Sensitive Robot Skin. Polymers 2017, 9, 586. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  202. Zhang, C.; Liu, S.; Huang, X.; Guo, W.; Li, Y.; Wu, H. A Stretchable Dual-Mode Sensor Array for Multifunctional Robotic Electronic Skin. Nano Energy 2019, 62, 164–170. [Google Scholar] [CrossRef]
  203. Shi, M.; Zhang, J.; Chen, H.; Han, M.; Shankaregowda, S.A.; Su, Z.; Meng, B.; Cheng, X.; Zhang, H. Self-Powered Analogue Smart Skin. ACS Nano 2016, 10, 4083–4091. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  204. Wang, Z.; Zhang, F.; Yao, T.; Li, N.; Li, X.; Shang, J. Self-Powered Non-Contact Triboelectric Rotation Sensor with Interdigitated Film. Sensors 2020, 20, 4947. [Google Scholar] [CrossRef]
  205. Zhao, X.; Kang, Z.; Liao, Q.; Zhang, Z.; Ma, M.; Zhang, Q.; Zhang, Y. Ultralight, Self-Powered and Self-Adaptive Motion Sensor Based on Triboelectric Nanogenerator for Perceptual Layer Application in Internet of Things. Nano Energy 2018, 48, 312–319. [Google Scholar] [CrossRef]
  206. Xu, Y.; Wang, Z.; Hao, W.; Zhao, W.; Lin, W.; Jin, B.; Ding, N. A Flexible Multimodal Sole Sensor for Legged Robot Sensing Complex Ground Information during Locomotion. Sensors 2021, 21, 5359. [Google Scholar] [CrossRef]
  207. Yao, G.; Xu, L.; Cheng, X.; Li, Y.; Huang, X.; Guo, W.; Liu, S.; Wang, Z.L.; Wu, H. Bioinspired Triboelectric Nanogenerators as Self-Powered Electronic Skin for Robotic Tactile Sensing. Adv. Funct. Mater. 2020, 30, 1907312. [Google Scholar] [CrossRef]
208. Kanda, T.; Shiomi, M.; Miyashita, Z.; Ishiguro, H.; Hagita, N. An Affective Guide Robot in a Shopping Mall. In Proceedings of the 4th ACM/IEEE International Conference on Human–Robot Interaction (HRI ’09), La Jolla, CA, USA, 9–13 March 2009; ACM Press: New York, NY, USA, 2009; p. 173. [Google Scholar]
  209. Norberto Pires, J. Robot-by-voice: Experiments on Commanding an Industrial Robot Using the Human Voice. Ind. Robot Int. J. 2005, 32, 505–511. [Google Scholar] [CrossRef] [Green Version]
  210. Guo, H.; Pu, X.; Chen, J.; Meng, Y.; Yeh, M.-H.; Liu, G.; Tang, Q.; Chen, B.; Liu, D.; Qi, S.; et al. A Highly Sensitive, Self-Powered Triboelectric Auditory Sensor for Social Robotics and Hearing Aids. Sci. Robot. 2018, 3, eaat2516. [Google Scholar] [CrossRef] [Green Version]
  211. Lange, F.; Bertleff, W.; Suppa, M. Force and Trajectory Control of Industrial Robots in Stiff Contact. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 2927–2934. [Google Scholar]
  212. Cheng, P.; Oelmann, B. Joint-Angle Measurement Using Accelerometers and Gyroscopes—A Survey. IEEE Trans. Instrum. Meas. 2010, 59, 404–414. [Google Scholar] [CrossRef]
  213. Wang, Z.; An, J.; Nie, J.; Luo, J.; Shao, J.; Jiang, T.; Chen, B.; Tang, W.; Wang, Z.L. A Self-Powered Angle Sensor at Nanoradian-Resolution for Robotic Arms and Personalized Medicare. Adv. Mater. 2020, 32, 2001466. [Google Scholar] [CrossRef]
  214. Shintake, J.; Cacucciolo, V.; Floreano, D.; Shea, H. Soft Robotic Grippers. Adv. Mater. 2018, 30, 1707035. [Google Scholar] [CrossRef] [Green Version]
  215. Laschi, C.; Mazzolai, B.; Cianchetti, M. Soft Robotics: Technologies and Systems Pushing the Boundaries of Robot Abilities. Sci. Robot. 2016, 1, eaah3690. [Google Scholar] [CrossRef] [Green Version]
  216. Wang, J.; Gao, D.; Lee, P.S. Recent Progress in Artificial Muscles for Interactive Soft Robotics. Adv. Mater. 2021, 33, 2003088. [Google Scholar] [CrossRef]
  217. Hines, L.; Petersen, K.; Lum, G.Z.; Sitti, M. Soft Actuators for Small-Scale Robotics. Adv. Mater. 2017, 29, 1603483. [Google Scholar] [CrossRef]
  218. Ilami, M.; Bagheri, H.; Ahmed, R.; Skowronek, E.O.; Marvi, H. Materials, Actuators, and Sensors for Soft Bioinspired Robots. Adv. Mater. 2021, 33, 2003139. [Google Scholar] [CrossRef] [PubMed]
  219. Rus, D.; Tolley, M.T. Design, Fabrication and Control of Soft Robots. Nature 2015, 521, 467–475. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  220. Chen, J.; Chen, B.; Han, K.; Tang, W.; Wang, Z.L. A Triboelectric Nanogenerator as a Self-Powered Sensor for a Soft–Rigid Hybrid Actuator. Adv. Mater. Technol. 2019, 4, 1900337. [Google Scholar] [CrossRef]
  221. Chen, S.; Pang, Y.; Yuan, H.; Tan, X.; Cao, C. Smart Soft Actuators and Grippers Enabled by Self-Powered Tribo-Skins. Adv. Mater. Technol. 2020, 5, 1901075. [Google Scholar] [CrossRef]
  222. Zhu, M.; Xie, M.; Lu, X.; Okada, S.; Kawamura, S. A Soft Robotic Finger with Self-Powered Triboelectric Curvature Sensor Based on Multi-Material 3D Printing. Nano Energy 2020, 73, 104772. [Google Scholar] [CrossRef]
  223. Chen, J.; Han, K.; Luo, J.; Xu, L.; Tang, W.; Wang, Z.L. Soft Robots with Self-Powered Configurational Sensing. Nano Energy 2020, 77, 105171. [Google Scholar] [CrossRef]
  224. Lai, Y.-C.; Deng, J.; Liu, R.; Hsiao, Y.-C.; Zhang, S.L.; Peng, W.; Wu, H.-M.; Wang, X.; Wang, Z.L. Actively Perceiving and Responsive Soft Robots Enabled by Self-Powered, Highly Extensible, and Highly Sensitive Triboelectric Proximity- and Pressure-Sensing Skins. Adv. Mater. 2018, 30, 1801114. [Google Scholar] [CrossRef] [PubMed]
  225. Jin, T.; Sun, Z.; Li, L.; Zhang, Q.; Zhu, M.; Zhang, Z.; Yuan, G.; Chen, T.; Tian, Y.; Hou, X.; et al. Triboelectric Nanogenerator Sensors for Soft Robotics Aiming at Digital Twin Applications. Nat. Commun. 2020, 11, 5381. [Google Scholar] [CrossRef] [PubMed]
  226. Bu, T.; Xiao, T.; Yang, Z.; Liu, G.; Fu, X.; Nie, J.; Guo, T.; Pang, Y.; Zhao, J.; Xi, F.; et al. Stretchable Triboelectric-Photonic Smart Skin for Tactile and Gesture Sensing. Adv. Mater. 2018, 30, 1800066. [Google Scholar] [CrossRef]
  227. Wu, X.; Zhu, J.; Evans, J.W.; Arias, A.C. A Single-Mode, Self-Adapting, and Self-Powered Mechanoreceptor Based on a Potentiometric–Triboelectric Hybridized Sensing Mechanism for Resolving Complex Stimuli. Adv. Mater. 2020, 32, 2005970. [Google Scholar] [CrossRef]
  228. An, J.; Chen, P.; Wang, Z.; Berbille, A.; Pang, H.; Jiang, Y.; Jiang, T.; Wang, Z.L. Biomimetic Hairy Whiskers for Robotic Skin Tactility. Adv. Mater. 2021, 33, 2101891. [Google Scholar] [CrossRef] [PubMed]
229. Liu, Z.; Wang, Y.; Ren, Y.; Jin, G.; Zhang, C.; Chen, W.; Yan, F. Poly(Ionic Liquid) Hydrogel-Based Anti-Freezing Ionic Skin for a Soft Robotic Gripper. Mater. Horiz. 2020, 7, 919–927. [Google Scholar] [CrossRef]
  230. Jin, G.; Sun, Y.; Geng, J.; Yuan, X.; Chen, T.; Liu, H.; Wang, F.; Sun, L. Bioinspired Soft Caterpillar Robot with Ultra-Stretchable Bionic Sensors Based on Functional Liquid Metal. Nano Energy 2021, 84, 105896. [Google Scholar] [CrossRef]
  231. Garcia, E.; Jimenez, M.A.; De Santos, P.G.; Armada, M. The Evolution of Robotics Research. IEEE Robot. Autom. Mag. 2007, 14, 90–103. [Google Scholar] [CrossRef]
  232. Kim, J.-H.; Sharma, G.; Iyengar, S.S. FAMPER: A Fully Autonomous Mobile Robot for Pipeline Exploration. In Proceedings of the 2010 IEEE International Conference on Industrial Technology, Viña del Mar, Chile, 14–17 March 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 517–523. [Google Scholar]
  233. Wang, Y.; Jiang, Y.; Wu, H.; Yang, Y. Floating Robotic Insects to Obtain Electric Energy from Water Surface for Realizing Some Self-Powered Functions. Nano Energy 2019, 63, 103810. [Google Scholar] [CrossRef]
  234. Wang, Y.; Dai, M.; Wu, H.; Xu, L.; Zhang, T.; Chen, W.; Wang, Z.L.; Yang, Y. Moisture Induced Electricity for Self-Powered Microrobots. Nano Energy 2021, 106499. [Google Scholar] [CrossRef]
235. Asadnia, M.; Kottapalli, A.G.P.; Miao, J.; Warkiani, M.E.; Triantafyllou, M.S. Artificial Fish Skin of Self-Powered Micro-Electromechanical Systems Hair Cells for Sensing Hydrodynamic Flow Phenomena. J. R. Soc. Interface 2015, 12, 20150322. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  236. Shih, B.; Shah, D.; Li, J.; Thuruthel, T.G.; Park, Y.-L.; Iida, F.; Bao, Z.; Kramer-Bottiglio, R.; Tolley, M.T. Electronic Skins and Machine Learning for Intelligent Soft Robots. Sci. Robot. 2020, 5, eaaz9239. [Google Scholar] [CrossRef]
  237. Lu, H.; Hong, Y.; Yang, Y.; Yang, Z.; Shen, Y. Battery-Less Soft Millirobot That Can Move, Sense, and Communicate Remotely by Coupling the Magnetic and Piezoelectric Effects. Adv. Sci. 2020, 7, 2000069. [Google Scholar] [CrossRef] [PubMed]
  238. Goldoni, R.; Ozkan-Aydin, Y.; Kim, Y.-S.; Kim, J.; Zavanelli, N.; Mahmood, M.; Liu, B.; Hammond, F.L.; Goldman, D.I.; Yeo, W.-H. Stretchable Nanocomposite Sensors, Nanomembrane Interconnectors, and Wireless Electronics toward Feedback–Loop Control of a Soft Earthworm Robot. ACS Appl. Mater. Interfaces 2020, 12, 43388–43397. [Google Scholar] [CrossRef] [PubMed]
  239. Chan, M.; Campo, E.; Estève, D.; Fourniols, J.-Y. Smart Homes—Current Features and Future Perspectives. Maturitas 2009, 64, 90–97. [Google Scholar] [CrossRef]
  240. Demiris, G.; Hensel, B.K. Technologies for an Aging Society: A Systematic Review of “Smart Home” Applications. Yearb. Med. Inform. 2008, 17, 33–40. [Google Scholar]
  241. Ding, D.; Cooper, R.A.; Pasquina, P.F.; Fici-Pasquina, L. Sensor Technology for Smart Homes. Maturitas 2011, 69, 131–136. [Google Scholar] [CrossRef] [PubMed]
  242. Haroun, A.; Le, X.; Gao, S.; Dong, B.; He, T.; Zhang, Z.; Wen, F.; Xu, S.; Lee, C. Progress in Micro/Nano Sensors and Nanoenergy for Future AIoT-Based Smart Home Applications. Nano Express 2021, 2, 022005. [Google Scholar] [CrossRef]
  243. Anaya, D.V.; Zhan, K.; Tao, L.; Lee, C.; Yuce, M.R.; Alan, T. Contactless Tracking of Humans Using Non-Contact Triboelectric Sensing Technology: Enabling New Assistive Applications for The Elderly and The Visually Impaired. Nano Energy 2021, 106486. [Google Scholar] [CrossRef]
  244. Yan, Z.; Wang, L.; Xia, Y.; Qiu, R.; Liu, W.; Wu, M.; Zhu, Y.; Zhu, S.; Jia, C.; Zhu, M.; et al. Flexible High-Resolution Triboelectric Sensor Array Based on Patterned Laser-Induced Graphene for Self-Powered Real-Time Tactile Sensing. Adv. Funct. Mater. 2021, 31, 2100709. [Google Scholar] [CrossRef]
  245. Sala de Medeiros, M.; Chanci, D.; Martinez, R.V. Moisture-Insensitive, Self-Powered Paper-Based Flexible Electronics. Nano Energy 2020, 78, 105301. [Google Scholar] [CrossRef]
  246. Yuan, Z.; Zhou, T.; Yin, Y.; Cao, R.; Li, C.; Wang, Z.L. Transparent and Flexible Triboelectric Sensing Array for Touch Security Applications. ACS Nano 2017, 11, 8364–8369. [Google Scholar] [CrossRef]
  247. Liu, Y.; Yang, W.; Yan, Y.; Wu, X.; Wang, X.; Zhou, Y.; Hu, Y.; Chen, H.; Guo, T. Self-Powered High-Sensitivity Sensory Memory Actuated by Triboelectric Sensory Receptor for Real-Time Neuromorphic Computing. Nano Energy 2020, 75, 104930. [Google Scholar] [CrossRef]
  248. Pu, X.; Tang, Q.; Chen, W.; Huang, Z.; Liu, G.; Zeng, Q.; Chen, J.; Guo, H.; Xin, L.; Hu, C. Flexible Triboelectric 3D Touch Pad with Unit Subdivision Structure for Effective XY Positioning and Pressure Sensing. Nano Energy 2020, 76, 105047. [Google Scholar] [CrossRef]
  249. Qiu, C.; Wu, F.; Lee, C.; Yuce, M.R. Self-Powered Control Interface Based on Gray Code with Hybrid Triboelectric and Photovoltaics Energy Harvesting for IoT Smart Home and Access Control Applications. Nano Energy 2020, 70, 104456. [Google Scholar] [CrossRef]
  250. Yuan, Z.; Du, X.; Li, N.; Yin, Y.; Cao, R.; Zhang, X.; Zhao, S.; Niu, H.; Jiang, T.; Xu, W.; et al. Triboelectric-Based Transparent Secret Code. Adv. Sci. 2018, 5, 1700881. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  251. Chen, J.; Pu, X.; Guo, H.; Tang, Q.; Feng, L.; Wang, X.; Hu, C. A Self-Powered 2D Barcode Recognition System Based on Sliding Mode Triboelectric Nanogenerator for Personal Identification. Nano Energy 2018, 43, 253–258. [Google Scholar] [CrossRef]
  252. Lin, Z.; Yang, J.; Li, X.; Wu, Y.; Wei, W.; Liu, J.; Chen, J.; Yang, J. Large-Scale and Washable Smart Textiles Based on Triboelectric Nanogenerator Arrays for Self-Powered Sleeping Monitoring. Adv. Funct. Mater. 2018, 28, 1704112. [Google Scholar] [CrossRef]
  253. Dong, K.; Peng, X.; An, J.; Wang, A.C.; Luo, J.; Sun, B.; Wang, J.; Wang, Z.L. Shape Adaptable and Highly Resilient 3D Braided Triboelectric Nanogenerators as E-Textiles for Power and Sensing. Nat. Commun. 2020, 11, 2868. [Google Scholar] [CrossRef]
  254. Shi, Q.; Zhang, Z.; He, T.; Sun, Z.; Wang, B.; Feng, Y.; Shan, X.; Salam, B.; Lee, C. Deep Learning Enabled Smart Mats as a Scalable Floor Monitoring System. Nat. Commun. 2020, 11, 4609. [Google Scholar] [CrossRef]
  255. Kato, H.; Tan, K.T. Pervasive 2D Barcodes for Camera Phone Applications. IEEE Pervasive Comput. 2007, 6, 76–85. [Google Scholar] [CrossRef] [Green Version]
  256. Houni, K.; Sawaya, W.; Delignon, Y. One-Dimensional Barcode Reading: An Information Theoretic Approach. Appl. Opt. 2008, 47, 1025. [Google Scholar] [CrossRef] [PubMed]
  257. Sixel-Döring, F.; Zimmermann, J.; Wegener, A.; Mollenhauer, B.; Trenkwalder, C. The Evolution of REM Sleep Behavior Disorder in Early Parkinson Disease. Sleep 2016, 39, 1737–1742. [Google Scholar] [CrossRef] [Green Version]
  258. Jones, M.H.; Goubran, R.; Knoefel, F. Reliable Respiratory Rate Estimation from a Bed Pressure Array. In Proceedings of the 2006 International Conference of the IEEE Engineering in Medicine and Biology Society, New York, NY, USA, 31 August–3 September 2006; IEEE: Piscataway, NJ, USA, 2006; pp. 6410–6413. [Google Scholar]
  259. Samy, L.; Huang, M.-C.; Liu, J.J.; Xu, W.; Sarrafzadeh, M. Unobtrusive Sleep Stage Identification Using a Pressure-Sensitive Bed Sheet. IEEE Sens. J. 2014, 14, 2092–2101. [Google Scholar] [CrossRef]
  260. Kwak, S.S.; Yoon, H.-J.; Kim, S.-W. Textile-Based Triboelectric Nanogenerators for Self-Powered Wearable Electronics. Adv. Funct. Mater. 2019, 29, 1804533. [Google Scholar] [CrossRef]
  261. Dong, K.; Peng, X.; Wang, Z.L. Fiber/Fabric-Based Piezoelectric and Triboelectric Nanogenerators for Flexible/Stretchable and Wearable Electronics and Artificial Intelligence. Adv. Mater. 2020, 32, 1902549. [Google Scholar] [CrossRef]
  262. Paosangthong, W.; Torah, R.; Beeby, S. Recent Progress on Textile-Based Triboelectric Nanogenerators. Nano Energy 2019, 55, 401–423. [Google Scholar] [CrossRef] [Green Version]
  263. Kim, K.-B.; Cho, J.Y.; Jabbar, H.; Ahn, J.H.; Hong, S.D.; Woo, S.B.; Sung, T.H. Optimized Composite Piezoelectric Energy Harvesting Floor Tile for Smart Home Energy Management. Energy Convers. Manag. 2018, 171, 31–37. [Google Scholar] [CrossRef]
  264. Middleton, L.; Buss, A.A.; Bazin, A.; Nixon, M.S. A Floor Sensor System for Gait Recognition. In Proceedings of the Fourth IEEE Workshop on Automatic Identification Advanced Technologies (AutoID’05), Buffalo, NY, USA, 17–18 October 2005; IEEE: Piscataway, NJ, USA, 2005; pp. 171–176. [Google Scholar]
  265. Li, Y.; Gao, Z.; He, Z.; Zhang, P.; Chen, R.; El-Sheimy, N. Multi-Sensor Multi-Floor 3D Localization With Robust Floor Detection. IEEE Access 2018, 6, 76689–76699. [Google Scholar] [CrossRef]
  266. Jeon, S.-B.; Nho, Y.-H.; Park, S.-J.; Kim, W.-G.; Tcho, I.-W.; Kim, D.; Kwon, D.-S.; Choi, Y.-K. Self-Powered Fall Detection System Using Pressure Sensing Triboelectric Nanogenerators. Nano Energy 2017, 41, 139–147. [Google Scholar] [CrossRef]
  267. Ma, J.; Jie, Y.; Bian, J.; Li, T.; Cao, X.; Wang, N. From Triboelectric Nanogenerator to Self-Powered Smart Floor: A Minimalist Design. Nano Energy 2017, 39, 192–199. [Google Scholar] [CrossRef]
  268. Cheng, X.; Song, Y.; Han, M.; Meng, B.; Su, Z.; Miao, L.; Zhang, H. A Flexible Large-Area Triboelectric Generator by Low-Cost Roll-to-Roll Process for Location-Based Monitoring. Sensors Actuators A Phys. 2016, 247, 206–214. [Google Scholar] [CrossRef]
  269. Ha, N.; Xu, K.; Ren, G.; Mitchell, A.; Ou, J.Z. Machine Learning-Enabled Smart Sensor Systems. Adv. Intell. Syst. 2020, 2, 2000063. [Google Scholar] [CrossRef]
  270. Krittanawong, C.; Rogers, A.J.; Johnson, K.W.; Wang, Z.; Turakhia, M.P.; Halperin, J.L.; Narayan, S.M. Integration of Novel Monitoring Devices with Machine Learning Technology for Scalable Cardiovascular Management. Nat. Rev. Cardiol. 2021, 18, 75–91. [Google Scholar] [CrossRef]
  271. Wu, C.; Ding, W.; Liu, R.; Wang, J.; Wang, A.C.; Wang, J.; Li, S.; Zi, Y.; Wang, Z.L. Keystroke Dynamics Enabled Authentication and Identification Using Triboelectric Nanogenerator Array. Mater. Today 2018, 21, 216–222. [Google Scholar] [CrossRef]
  272. Zhang, Z.; He, T.; Zhu, M.; Sun, Z.; Shi, Q.; Zhu, J.; Dong, B.; Yuce, M.R.; Lee, C. Deep Learning-Enabled Triboelectric Smart Socks for IoT-Based Gait Analysis and VR Applications. npj Flex. Electron. 2020, 4, 29. [Google Scholar] [CrossRef]
  273. Wen, F.; Zhang, Z.; He, T.; Lee, C. AI Enabled Sign Language Recognition and VR Space Bidirectional Communication Using Triboelectric Smart Glove. Nat. Commun. 2021, 12, 5378. [Google Scholar] [CrossRef]
  274. Sun, Z.; Zhu, M.; Zhang, Z.; Chen, Z.; Shi, Q.; Shan, X.; Yeow, R.C.H.; Lee, C. Artificial Intelligence of Things (AIoT) Enabled Virtual Shop Applications Using Self-Powered Sensor Enhanced Soft Robotic Manipulator. Adv. Sci. 2021, 8, 2100230. [Google Scholar] [CrossRef] [PubMed]
  275. Wang, M.; Yan, Z.; Wang, T.; Cai, P.; Gao, S.; Zeng, Y.; Wan, C.; Wang, H.; Pan, L.; Yu, J.; et al. Gesture Recognition Using a Bioinspired Learning Architecture That Integrates Visual Data with Somatosensory Data from Stretchable Sensors. Nat. Electron. 2020, 3, 563–570. [Google Scholar] [CrossRef]
  276. Yun, J.; Jayababu, N.; Kim, D. Self-Powered Transparent and Flexible Touchpad Based on Triboelectricity towards Artificial Intelligence. Nano Energy 2020, 78, 105325. [Google Scholar] [CrossRef]
  277. Maharjan, P.; Shrestha, K.; Bhatta, T.; Cho, H.; Park, C.; Salauddin, M.; Rahman, M.T.; Rana, S.S.; Lee, S.; Park, J.Y. Keystroke Dynamics Based Hybrid Nanogenerators for Biometric Authentication and Identification Using Artificial Intelligence. Adv. Sci. 2021, 8, 2100711. [Google Scholar] [CrossRef] [PubMed]
  278. Zhao, X.; Zhang, Z.; Xu, L.; Gao, F.; Zhao, B.; Ouyang, T.; Kang, Z.; Liao, Q.; Zhang, Y. Fingerprint-Inspired Electronic Skin Based on Triboelectric Nanogenerator for Fine Texture Recognition. Nano Energy 2021, 85, 106001. [Google Scholar] [CrossRef]
  279. Moin, A.; Zhou, A.; Rahimi, A.; Menon, A.; Benatti, S.; Alexandrov, G.; Tamakloe, S.; Ting, J.; Yamamoto, N.; Khan, Y.; et al. A Wearable Biosensing System with In-Sensor Adaptive Machine Learning for Hand Gesture Recognition. Nat. Electron. 2021, 4, 54–63. [Google Scholar] [CrossRef]
  280. Luo, Y.; Li, Y.; Sharma, P.; Shou, W.; Wu, K.; Foshey, M.; Li, B.; Palacios, T.; Torralba, A.; Matusik, W. Learning Human–Environment Interactions Using Conformal Tactile Textiles. Nat. Electron. 2021, 4, 193–201. [Google Scholar] [CrossRef]
  281. Sundaram, S.; Kellnhofer, P.; Li, Y.; Zhu, J.-Y.; Torralba, A.; Matusik, W. Learning the Signatures of the Human Grasp Using a Scalable Tactile Glove. Nature 2019, 569, 698–702. [Google Scholar] [CrossRef]
282. Monrose, F.; Rubin, A.D. Keystroke Dynamics as a Biometric for Authentication. Future Gener. Comput. Syst. 2000, 16, 351–359. [Google Scholar] [CrossRef] [Green Version]
  283. Zhao, G.; Yang, J.; Chen, J.; Zhu, G.; Jiang, Z.; Liu, X.; Niu, G.; Wang, Z.L.; Zhang, B. Keystroke Dynamics Identification Based on Triboelectric Nanogenerator for Intelligent Keyboard Using Deep Learning Method. Adv. Mater. Technol. 2019, 4, 1800167. [Google Scholar] [CrossRef] [Green Version]
  284. Chen, J.; Zhu, G.; Yang, J.; Jing, Q.; Bai, P.; Yang, W.; Qi, X.; Su, Y.; Wang, Z.L. Personalized Keystroke Dynamics for Self-Powered Human–Machine Interfacing. ACS Nano 2015, 9, 105–116. [Google Scholar] [CrossRef]
  285. Caldas, R.; Mundt, M.; Potthast, W.; Buarque de Lima Neto, F.; Markert, B. A Systematic Review of Gait Analysis Methods Based on Inertial Sensors and Adaptive Algorithms. Gait Posture 2017, 57, 204–210. [Google Scholar] [CrossRef] [PubMed]
  286. Fang, B.; Sun, F.; Liu, H.; Liu, C. 3D Human Gesture Capturing and Recognition by the IMMU-Based Data Glove. Neurocomputing 2018, 277, 198–207. [Google Scholar] [CrossRef]
  287. Neto, P.; Pereira, D.; Pires, J.N.; Moreira, A.P. Real-Time and Continuous Hand Gesture Spotting: An Approach Based on Artificial Neural Networks. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 178–183. [Google Scholar]
  288. Chiu, C.-M.; Chen, S.-W.; Pao, Y.-P.; Huang, M.-Z.; Chan, S.-W.; Lin, Z.-H. A Smart Glove with Integrated Triboelectric Nanogenerator for Self-Powered Gesture Recognition and Language Expression. Sci. Technol. Adv. Mater. 2019, 20, 964–971. [Google Scholar] [CrossRef] [Green Version]
  289. Debie, E.; Fernandez Rojas, R.; Fidock, J.; Barlow, M.; Kasmarik, K.; Anavatti, S.; Garratt, M.; Abbass, H.A. Multimodal Fusion for Objective Assessment of Cognitive Workload: A Review. IEEE Trans. Cybern. 2021, 51, 1542–1555. [Google Scholar] [CrossRef] [PubMed]
  290. Imran, J.; Raman, B. Evaluating Fusion of RGB-D and Inertial Sensors for Multimodal Human Action Recognition. J. Ambient Intell. Humaniz. Comput. 2020, 11, 189–208. [Google Scholar] [CrossRef]
  291. Xue, T.; Wang, W.; Ma, J.; Liu, W.; Pan, Z.; Han, M. Progress and Prospects of Multimodal Fusion Methods in Physical Human–Robot Interaction: A Review. IEEE Sens. J. 2020, 20, 10355–10370. [Google Scholar] [CrossRef]
  292. Biswas, S.; Visell, Y. Emerging Material Technologies for Haptics. Adv. Mater. Technol. 2019, 4, 1900042. [Google Scholar] [CrossRef] [Green Version]
  293. Rothemund, P.; Kellaris, N.; Mitchell, S.K.; Acome, E.; Keplinger, C. HASEL Artificial Muscles for a New Generation of Lifelike Robots—Recent Progress and Future Opportunities. Adv. Mater. 2021, 33, 2003375. [Google Scholar] [CrossRef]
  294. Yin, J.; Hinchet, R.; Shea, H.; Majidi, C. Wearable Soft Technologies for Haptic Sensing and Feedback. Adv. Funct. Mater. 2020, 2007428. [Google Scholar] [CrossRef]
  295. Xiong, J.; Chen, J.; Lee, P.S. Functional Fibers and Fabrics for Soft Robotics, Wearables, and Human–Robot Interface. Adv. Mater. 2021, 33, 2002640. [Google Scholar] [CrossRef]
  296. Nakamura, T.; Yamamoto, A. Multi-Finger Surface Visuo-Haptic Rendering Using Electrostatic Stimulation with Force-Direction Sensing Gloves. In Proceedings of the 2014 IEEE Haptics Symposium (HAPTICS), Houston, TX, USA, 23–26 February 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 489–491. [Google Scholar]
  297. Zhao, H.; Hussain, A.M.; Israr, A.; Vogt, D.M.; Duduta, M.; Clarke, D.R.; Wood, R.J. A Wearable Soft Haptic Communicator Based on Dielectric Elastomer Actuators. Soft Robot. 2020, 7, 451–461. [Google Scholar] [CrossRef]
  298. Yang, Y.; Wu, Y.; Li, C.; Yang, X.; Chen, W. Flexible Actuators for Soft Robotics. Adv. Intell. Syst. 2020, 2, 1900077. [Google Scholar] [CrossRef] [Green Version]
  299. Kang, B.B.; Choi, H.; Lee, H.; Cho, K.-J. Exo-Glove Poly II: A Polymer-Based Soft Wearable Robot for the Hand with a Tendon-Driven Actuation System. Soft Robot. 2019, 6, 214–227. [Google Scholar] [CrossRef]
  300. Yu, X.; Xie, Z.; Yu, Y.; Lee, J.; Vazquez-Guardado, A.; Luan, H.; Ruban, J.; Ning, X.; Akhtar, A.; Li, D.; et al. Skin-Integrated Wireless Haptic Interfaces for Virtual and Augmented Reality. Nature 2019, 575, 473–479. [Google Scholar] [CrossRef]
  301. Wang, X.; Mitchell, S.K.; Rumley, E.H.; Rothemund, P.; Keplinger, C. High-Strain Peano-HASEL Actuators. Adv. Funct. Mater. 2020, 30, 1908821. [Google Scholar] [CrossRef]
  302. Ji, X.; Liu, X.; Cacucciolo, V.; Civet, Y.; El Haitami, A.; Cantin, S.; Perriard, Y.; Shea, H. Untethered Feel-Through Haptics Using 18-µm Thick Dielectric Elastomer Actuators. Adv. Funct. Mater. 2020, 2006639. [Google Scholar] [CrossRef]
  303. Pyo, D.; Ryu, S.; Kyung, K.-U.; Yun, S.; Kwon, D.-S. High-Pressure Endurable Flexible Tactile Actuator Based on Microstructured Dielectric Elastomer. Appl. Phys. Lett. 2018, 112, 061902. [Google Scholar] [CrossRef] [Green Version]
  304. Hwang, I.; Kim, H.J.; Mun, S.; Yun, S.; Kang, T.J. A Light-Driven Vibrotactile Actuator with a Polymer Bimorph Film for Localized Haptic Rendering. ACS Appl. Mater. Interfaces 2021, 13, 6597–6605. [Google Scholar] [CrossRef] [PubMed]
  305. Hinchet, R.; Shea, H. High Force Density Textile Electrostatic Clutch. Adv. Mater. Technol. 2020, 5, 1900895. [Google Scholar] [CrossRef]
  306. Leroy, E.; Hinchet, R.; Shea, H. Multimode Hydraulically Amplified Electrostatic Actuators for Wearable Haptics. Adv. Mater. 2020, 32, 2002564. [Google Scholar] [CrossRef]
  307. Besse, N.; Rosset, S.; Zarate, J.J.; Shea, H. Flexible Active Skin: Large Reconfigurable Arrays of Individually Addressed Shape Memory Polymer Actuators. Adv. Mater. Technol. 2017, 2, 1700102. [Google Scholar] [CrossRef]
  308. Qu, X.; Ma, X.; Shi, B.; Li, H.; Zheng, L.; Wang, C.; Liu, Z.; Fan, Y.; Chen, X.; Li, Z.; et al. Refreshable Braille Display System Based on Triboelectric Nanogenerator and Dielectric Elastomer. Adv. Funct. Mater. 2021, 31, 2006612. [Google Scholar] [CrossRef]
  309. Shi, Y.; Wang, F.; Tian, J.; Li, S.; Fu, E.; Nie, J.; Lei, R.; Ding, Y.; Chen, X.; Wang, Z.L. Self-Powered Electro-Tactile System for Virtual Tactile Experiences. Sci. Adv. 2021, 7, eabe2943. [Google Scholar] [CrossRef]
  310. Oh, J.; Kim, S.; Lee, S.; Jeong, S.; Ko, S.H.; Bae, J. A Liquid Metal Based Multimodal Sensor and Haptic Feedback Device for Thermal and Tactile Sensation Generation in Virtual Reality. Adv. Funct. Mater. 2020, 2007772. [Google Scholar] [CrossRef]
  311. Lee, J.; Sul, H.; Lee, W.; Pyun, K.R.; Ha, I.; Kim, D.; Park, H.; Eom, H.; Yoon, Y.; Jung, J.; et al. Stretchable Skin-Like Cooling/Heating Device for Reconstruction of Artificial Thermal Sensation in Virtual Reality. Adv. Funct. Mater. 2020, 30, 1909171. [Google Scholar] [CrossRef]
  312. Kellaris, N.; Gopaluni Venkata, V.; Smith, G.M.; Mitchell, S.K.; Keplinger, C. Peano-HASEL Actuators: Muscle-Mimetic, Electrohydraulic Transducers That Linearly Contract on Activation. Sci. Robot. 2018, 3, eaar3276. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  313. Kim, S.; Kim, T.; Kim, C.S.; Choi, H.; Kim, Y.J.; Lee, G.S.; Oh, O.; Cho, B.J. Two-Dimensional Thermal Haptic Module Based on a Flexible Thermoelectric Device. Soft Robot. 2020, 7, 736–742. [Google Scholar] [CrossRef] [PubMed]
  314. Kim, S.-W.; Kim, S.H.; Kim, C.S.; Yi, K.; Kim, J.-S.; Cho, B.J.; Cha, Y. Thermal Display Glove for Interacting with Virtual Reality. Sci. Rep. 2020, 10, 11403. [Google Scholar] [CrossRef]
  315. Souri, H.; Banerjee, H.; Jusufi, A.; Radacsi, N.; Stokes, A.A.; Park, I.; Sitti, M.; Amjadi, M. Wearable and Stretchable Strain Sensors: Materials, Sensing Mechanisms, and Applications. Adv. Intell. Syst. 2020, 2, 2000039. [Google Scholar] [CrossRef]
  316. Wang, Z.L. Triboelectric Nanogenerators as New Energy Technology and Self-Powered Sensors—Principles, Problems and Perspectives. Faraday Discuss. 2014, 176, 447–458. [Google Scholar] [CrossRef]
  317. Shi, Q.; Sun, Z.; Zhang, Z.; Lee, C. Triboelectric Nanogenerators and Hybridized Systems for Enabling Next-Generation IoT Applications. Research 2021, 2021, 1–30. [Google Scholar]
  318. Zhang, K.; Wang, Y.; Yang, Y. Structure Design and Performance of Hybridized Nanogenerators. Adv. Funct. Mater. 2019, 29, 1806435. [Google Scholar] [CrossRef]
  319. Guo, X.; Liu, L.; Zhang, Z.; Gao, S.; He, T.; Shi, Q.; Lee, C. Technology Evolution from Micro-Scale Energy Harvesters to Nanogenerators. J. Micromech. Microeng. 2021, 31, 093002. [Google Scholar] [CrossRef]
  320. Guo, X.; He, T.; Zhang, Z.; Luo, A.; Wang, F.; Ng, E.J.; Zhu, Y.; Liu, H.; Lee, C. Artificial Intelligence-Enabled Caregiving Walking Stick Powered by Ultra-Low-Frequency Human Motion. ACS Nano 2021. [Google Scholar] [CrossRef] [PubMed]
  321. Mallineni, S.S.K.; Dong, Y.; Behlow, H.; Rao, A.M.; Podila, R. A Wireless Triboelectric Nanogenerator. Adv. Energy Mater. 2018, 8, 1702736. [Google Scholar] [CrossRef] [Green Version]
  322. Wen, F.; Wang, H.; He, T.; Shi, Q.; Sun, Z.; Zhu, M.; Zhang, Z.; Cao, Z.; Dai, Y.; Zhang, T.; et al. Battery-Free Short-Range Self-Powered Wireless Sensor Network (SS-WSN) Using TENG Based Direct Sensory Transmission (TDST) Mechanism. Nano Energy 2020, 67, 104266. [Google Scholar] [CrossRef]
  323. Tan, X.; Zhou, Z.; Zhang, L.; Wang, X.; Lin, Z.; Yang, R.; Yang, J. A Passive Wireless Triboelectric Sensor via a Surface Acoustic Wave Resonator (SAWR). Nano Energy 2020, 78, 105307. [Google Scholar] [CrossRef]
  324. Lin, Z.; Chen, J.; Li, X.; Zhou, Z.; Meng, K.; Wei, W.; Yang, J.; Wang, Z.L. Triboelectric Nanogenerator Enabled Body Sensor Network for Self-Powered Human Heart-Rate Monitoring. ACS Nano 2017, 11, 8830–8837. [Google Scholar] [CrossRef]
  325. He, T.; Wang, H.; Wang, J.; Tian, X.; Wen, F.; Shi, Q.; Ho, J.S.; Lee, C. Self-Sustainable Wearable Textile Nano-Energy Nano-System (NENS) for Next-Generation Healthcare Applications. Adv. Sci. 2019, 6, 1901437. [Google Scholar] [CrossRef] [Green Version]
  326. Wang, L.; He, T.; Zhang, Z.; Zhao, L.; Lee, C.; Luo, G.; Mao, Q.; Yang, P.; Lin, Q.; Li, X.; et al. Self-Sustained Autonomous Wireless Sensing Based on a Hybridized TENG and PEG Vibration Mechanism. Nano Energy 2021, 80, 105555. [Google Scholar] [CrossRef]
  327. Zhang, C.; Chen, J.; Xuan, W.; Huang, S.; You, B.; Li, W.; Sun, L.; Jin, H.; Wang, X.; Dong, S.; et al. Conjunction of Triboelectric Nanogenerator with Induction Coils as Wireless Power Sources and Self-Powered Wireless Sensors. Nat. Commun. 2020, 11, 58. [Google Scholar] [CrossRef] [Green Version]
  328. Chen, J.; Xuan, W.; Zhao, P.; Farooq, U.; Ding, P.; Yin, W.; Jin, H.; Wang, X.; Fu, Y.; Dong, S.; et al. Triboelectric Effect Based Instantaneous Self-Powered Wireless Sensing with Self-Determined Identity. Nano Energy 2018, 51, 1–9. [Google Scholar] [CrossRef]
  329. Bandodkar, A.J.; Gutruf, P.; Choi, J.; Lee, K.; Sekine, Y.; Reeder, J.T.; Jeang, W.J.; Aranyosi, A.J.; Lee, S.P.; Model, J.B.; et al. Battery-Free, Skin-Interfaced Microfluidic/Electronic Systems for Simultaneous Electrochemical, Colorimetric, and Volumetric Analysis of Sweat. Sci. Adv. 2019, 5, eaav3294. [Google Scholar] [CrossRef] [Green Version]
  330. Kim, J.; Banks, A.; Cheng, H.; Xie, Z.; Xu, S.; Jang, K.-I.; Lee, J.W.; Liu, Z.; Gutruf, P.; Huang, X.; et al. Epidermal Electronics with Advanced Capabilities in Near-Field Communication. Small 2015, 11, 906–912. [Google Scholar] [CrossRef] [PubMed]
  331. Fiddes, L.K.; Yan, N. RFID Tags for Wireless Electrochemical Detection of Volatile Chemicals. Sensors Actuators B Chem. 2013, 186, 817–823. [Google Scholar] [CrossRef]
  332. Zhang, J.; Tian, G.; Marindra, A.; Sunny, A.; Zhao, A. A Review of Passive RFID Tag Antenna-Based Sensors and Systems for Structural Health Monitoring Applications. Sensors 2017, 17, 265. [Google Scholar] [CrossRef]
  333. Jin, H.; Tao, X.; Dong, S.; Qin, Y.; Yu, L.; Luo, J.; Deen, M.J. Flexible Surface Acoustic Wave Respiration Sensor for Monitoring Obstructive Sleep Apnea Syndrome. J. Micromech. Microeng. 2017, 27, 115006. [Google Scholar] [CrossRef]
334. Xu, H.; Dong, S.; Xuan, W.; Farooq, U.; Huang, S.; Li, M.; Wu, T.; Jin, H.; Wang, X.; Luo, J. Flexible Surface Acoustic Wave Strain Sensor Based on Single Crystalline LiNbO3 Thin Film. Appl. Phys. Lett. 2018, 112, 093502. [Google Scholar] [CrossRef]
  335. Ren, Z.; Xu, J.; Le, X.; Lee, C. Heterogeneous Wafer Bonding Technology and Thin-Film Transfer Technology-Enabling Platform for the Next Generation Applications beyond 5G. Micromachines 2021, 12, 946. [Google Scholar] [CrossRef]
  336. Dong, B.; Shi, Q.; He, T.; Zhu, S.; Zhang, Z.; Sun, Z.; Ma, Y.; Kwong, D.; Lee, C. Wearable Triboelectric/Aluminum Nitride Nano-Energy-Nano-System with Self-Sustainable Photonic Modulation and Continuous Force Sensing. Adv. Sci. 2020, 7, 1903636. [Google Scholar] [CrossRef] [PubMed]
  337. Zhang, Z.; Shi, Q.; He, T.; Guo, X.; Dong, B.; Lee, J.; Lee, C. Artificial Intelligence of Toilet (AI-Toilet) for an Integrated Health Monitoring System (IHMS) using Smart Triboelectric Pressure Sensors and Image Sensor. Nano Energy 2021, 90A, 106517. [Google Scholar] [CrossRef]
Figure 1. Schematic illustration for the development progress of triboelectric human–machine interfaces and their applications in the 5G/IoT era. Reprinted with permission from Reference [32], Copyright 2021, Wiley. Reprinted with permission from Reference [33], Copyright 2020, Wiley. Reprinted with permission from Reference [34], Copyright 2019, Springer Nature. Reprinted with permission from Reference [35], Copyright 2019, Elsevier. Reprinted with permission from Reference [36], Copyright 2019, Elsevier. Reprinted with permission from Reference [37], Copyright 2013, American Chemical Society. Reprinted with permission from Reference [38], Copyright 2019, Elsevier. Reprinted with permission from Reference [39], Copyright 2015, American Chemical Society. Reprinted with permission from Reference [40], Copyright 2018, Elsevier. Reprinted with permission from Reference [41], Copyright 2019, Wiley. Reprinted with permission from Reference [42], Copyright 2019, Elsevier. Reprinted with permission from Reference [43], Copyright 2017, AAAS. Reprinted with permission from Reference [44], Copyright 2018, American Chemical Society. Reprinted with permission from Reference [45], Copyright 2021, Wiley. Reprinted with permission from Reference [46], Copyright 2018, Elsevier. Reprinted with permission from Reference [47], Copyright 2018, Springer Nature.
Figure 2. Glove-based HMIs. (a) A smart glove using a TENG textile sensor for cursor control and web surfing applications. Reprinted with permission from Reference [125], Copyright 2019, Elsevier. (b) A superhydrophobic textile glove enabled by a carbon nanotube/thermoplastic elastomer (CNTs/TPE) coating for VR/AR applications. Reprinted with permission from Reference [128], Copyright 2020, Wiley. (c) A yarn-structured TENG strain sensor enabled glove for sign language recognition. Reprinted with permission from Reference [131], Copyright 2020, Springer Nature. (d) A multifunctional glove capable of bending sensing, sliding-event detection, and haptic stimulation for augmented AR/VR experiences. Reprinted with permission from Reference [132], Copyright 2020, AAAS. (e) A joint-motion TENG quantization sensor enabled glove for robotic collaborative operation. Reprinted with permission from Reference [133], Copyright 2018, Elsevier. (f) An electronic skin integrating triboelectric and piezoresistive sensing mechanisms for grasping tactile perception. Reprinted with permission from Reference [94], Copyright 2017, Elsevier. (g) A multifunctional fingertip tactile sensor capable of pressure sensing, temperature perception, and material identification. Reprinted with permission from Reference [134], Copyright 2020, AAAS. (h) A nanophotonic modulator enabled readout strategy for TENG-based continuous pressure sensing. Reprinted with permission from Reference [135], Copyright 2020, American Chemical Society.
Figure 3. Other wearable HMIs. (a) A TENG-based micromotion sensor for eye blink motion monitoring. Reprinted with permission from Reference [158], Copyright 2017, AAAS. (b) A non-attached electrode–dielectric TENG sensor for eye-blink sensing. Reprinted with permission from Reference [159], Copyright 2020, Elsevier. (c) A bionic TENG-based sensor for masseter muscle motion monitoring. Reprinted with permission from Reference [161], Copyright 2021, Wiley. (d) A skin-attachable TENG microphone with high transparency and adhesion. Reprinted with permission from Reference [169], Copyright 2018, AAAS. (e) A wearable two-dimensional TENG touchpad for robotic arm manipulation. Reprinted with permission from Reference [170], Copyright 2018, American Chemical Society. (f) A bioinspired spider-net-coding (BISNC) TENG patch for multidirectional drone control. Reprinted with permission from Reference [171], Copyright 2019, Wiley. (g) A minimalistic exoskeleton enabled by triboelectric bidirectional sensors for upper-limb joint motion monitoring. Reprinted with permission from Reference [172], Copyright 2021, Springer Nature. (h) A badge-reel-like TENG stretch sensor for spinal information collection. Reprinted with permission from Reference [173], Copyright 2021, Springer Nature.
Figure 4. Robot-related HMIs. (a) A TENG tactile sensor with interlocking microstructures for touch pressure perception. Reprinted with permission from Reference [207], Copyright 2019, Wiley. (b) A TENG-based auditory system for social robotics. Reprinted with permission from Reference [210], Copyright 2018, AAAS. (c) A TENG angle sensor for high-resolution angular monitoring of the robotic arm. Reprinted with permission from Reference [213], Copyright 2020, Wiley. (d) A flexible/stretchable TENG skin for soft robot tactile perception. Reprinted with permission from Reference [224], Copyright 2018, Wiley. (e) An intelligent soft gripper enabled by TENG-based tactile and bending sensors for grasped object recognition. Reprinted with permission from Reference [225], Copyright 2020, Springer Nature. (f) A triboelectric–photonic hybridized smart skin for robot tactile and gesture sensing. Reprinted with permission from Reference [226], Copyright 2018, Wiley. (g) A self-powered potentiometric–triboelectric hybridized mechanoreceptor for soft robot tactile sensing. Reprinted with permission from Reference [227], Copyright 2020, Wiley. (h) A quadruped robot equipped with TENG-enabled biomimetic whisker mechanoreceptors for exploration applications. Reprinted with permission from Reference [228], Copyright 2021, Wiley.
Figure 5. HMIs for smart home applications. (a) An authorization system enabled by a subdivision-structured 3D touchpad. Reprinted with permission from Reference [248], Copyright 2020, Elsevier. (b) A sliding-mode TENG-based control disk interface. Reprinted with permission from Reference [249], Copyright 2020, Elsevier. (c) A triboelectric-based transparent secret code. Reprinted with permission from Reference [250], Copyright 2018, Wiley. (d) A double-sided information card with a reference barcode component for improved reliability. Reprinted with permission from Reference [251], Copyright 2017, Elsevier. (e) A large-scale, washable TENG-textile-enabled bedsheet for sleep behavior monitoring. Reprinted with permission from Reference [252], Copyright 2017, Wiley. (f) A self-powered identity recognition carpet system using TENG-based e-textiles for safeguarding entrances. Reprinted with permission from Reference [253], Copyright 2020, Springer Nature. (g) TENG-enabled smart mats as a scalable floor monitoring system. Reprinted with permission from Reference [254], Copyright 2020, Springer Nature.
Figure 6. ML-enabled advanced HMIs. (a) A TENG-based smart keyboard for keystroke dynamics monitoring. Reprinted with permission from Reference [271], Copyright 2018, Elsevier. (b) Deep-learning-enabled TENG socks for gait analysis. Reprinted with permission from Reference [272], Copyright 2020, Springer Nature. (c) A deep-learning-enabled TENG glove for sign language translation. Reprinted with permission from Reference [273], Copyright 2021, Springer Nature. (d) A TENG-enhanced smart soft robotic manipulator for AIoT virtual shop applications. Reprinted with permission from Reference [274], Copyright 2021, Wiley. (e) A bioinspired deep-learning-based data fusion architecture integrating vision and somatosensory data for high-accuracy gesture recognition. Reprinted with permission from Reference [275], Copyright 2020, Springer Nature.
Figure 7. Haptic-feedback-enhanced HMIs. (a) A soft wearable exo-glove using tendon-driven feedback for assisting people with disabilities. Reprinted with permission from Reference [299], Copyright 2019, Mary Ann Liebert. (b) A skin-integrated, electromagnetic-vibrator-based wireless haptic interface for VR and remote interactions. Reprinted with permission from Reference [300], Copyright 2019, Springer Nature. (c) Mimicking muscle motions via a hydraulically amplified self-healing electrostatic actuator. Reprinted with permission from Reference [301], Copyright 2019, Wiley. (d) A low-voltage dielectric elastomer actuator (DEA)-based fingertip haptic device providing feel-through stimulation. Reprinted with permission from Reference [302], Copyright 2020, Wiley. (e) A pyramid-microstructured DEA for vibrational stimulus under AC voltage. Reprinted with permission from Reference [303], Copyright 2018, AIP Publishing. (f) Near-infrared (NIR) light-induced thermoelastic deformation for a programmable vibrotactile feedback system. Reprinted with permission from Reference [304], Copyright 2021, American Chemical Society. (g) A glove with a high-force-density electrostatic clutch for VR feedback. Reprinted with permission from Reference [305], Copyright 2019, Wiley. (h) A multimode electrostatic actuator with hydraulically amplified haptic feedback for creating tactile stimuli. Reprinted with permission from Reference [306], Copyright 2020, Wiley. (i) A large reconfigurable pneumatic haptic array made of shape memory polymer activated by heating. Reprinted with permission from Reference [307], Copyright 2017, Wiley. (j) A refreshable braille display system based on pneumatic actuation and a TENG-based DEA. Reprinted with permission from Reference [308], Copyright 2020, Wiley. (k) An electrical-discharge-based feedback system using a TENG array with a ball electrode. Reprinted with permission from Reference [309], Copyright 2020, AAAS. (l) A multi-modal sensing and feedback glove with a liquid-metal-based resistive strain sensor, vibrator, and thermal feedback units. Reprinted with permission from Reference [310], Copyright 2020, Wiley. (m) A skin-like thermo-haptic device with thermoelectric units using the Peltier effect. Reprinted with permission from Reference [311], Copyright 2020, Wiley.
Figure 8. Self-sustainable/zero-power/passive HMIs. (a) A sustainable intelligent walking stick for real-time monitoring of the user’s location and well-being status in outdoor environments. Reprinted with permission from Reference [320], Copyright 2021, American Chemical Society. (b) A wireless tactile patch enabled by the TENG’s direct coupling. Reprinted with permission from Reference [321], Copyright 2017, Wiley. (c) A zero-power TENG wireless network via a mechanical-switch-enabled frequency-boosting strategy. Reprinted with permission from Reference [322], Copyright 2019, Elsevier. (d) A passive wireless TENG sensor using a surface acoustic wave resonator (SAWR). Reprinted with permission from Reference [323], Copyright 2020, Elsevier.