Review

A Review on Measuring Affect with Practical Sensors to Monitor Driver Behavior

1 Electrical and Computer Engineering Department, University of Louisville, Louisville, KY 40292, USA
2 Department of Psychology, George Mason University, Fairfax, VA 22030, USA
* Authors to whom correspondence should be addressed.
Submission received: 29 August 2019 / Revised: 21 September 2019 / Accepted: 16 October 2019 / Published: 24 October 2019
(This article belongs to the Special Issue Traffic Safety and Driver Behaviour)

Abstract

Using sensors to monitor signals produced by drivers is a way to help better understand how emotions contribute to unsafe driving habits. The need for intuitive machines that can interpret intentional and unintentional signals is imperative for our modern world. However, in complex human–machine work environments, many sensors will not work due to compatibility issues, noise, or practical constraints. This review focuses on practical sensors that have the potential to provide reliable monitoring and meaningful feedback to vehicle operators—such as drivers, train operators, pilots, and astronauts—while being feasible to implement and integrate with existing work infrastructure. Such an affect-sensitive intelligent vehicle might sound an alarm if signals indicate the driver has become angry or stressed, take control of the vehicle if needed, and collaborate with other vehicles to build a stress map that improves roadway safety. Toward such vehicles, this paper provides a review of emerging sensor technologies for driver monitoring, examining the sensors used in affect detection. This insight is especially helpful for populations challenged with accurately expressing or interpreting affective information, such as individuals with autism. This paper also includes material on sensors and feedback for drivers from populations that may have special needs.

1. Introduction

Emotions and affective expressions play a critical role in decision-making, learning, and other cognitive functions, but current technology is, for the most part, incapable of taking our emotions into account. Affective computing, supported by practical sensors, offers a possible solution to this problem. Taking driving as a context, by monitoring and reacting to drivers' emotions or the underlying signals, affective computing enhances interactions between humans and technology, with the ultimate goal of improving safety. The vehicle can be equipped with an intelligent support system that monitors the driver's state and behaviors, provides feedback to the driver, and even takes control of the vehicle if necessary.
For long-term use and adoption, sensors should be practical: they should not require the user to spend significant time activating the device, experience discomfort when using the device, or spend significant time maintaining the device. These non-intrusive sensors can take the form and function of skin-contact wearables that measure unintentional signals as well as surface-borne sensors that collect intentional signals from the driver. An example of a practical sensor might be a group of pressure sensors embedded into a vehicle's surfaces, powered and monitored by the vehicle to detect a user's interactions with seats, safety accessories, armrests, and the steering wheel. In-vehicle sensors already face challenges from acoustic noise, electromagnetic noise, and compatibility issues related to integrating with a central processor; for instance, a scalp-mounted electroencephalograph (EEG) sensor would be challenging to apply in vehicles because of noise sensitivity and data-processing requirements. Beyond engineering considerations, it would be unrealistic to expect daily drivers to apply an EEG sensor before each trip, because doing so would be a new, time-consuming departure from the driver's routine in a world where it is already difficult to get people to wear seatbelts. Wearable sensing devices such as watches and eyeglasses that fit into a driver's established routine are more practical. Video cameras, reviewed in 2016 by Fernández et al. [1], are practical in the sense that users do not need to activate or touch them, but cameras and microphones also introduce privacy concerns and produce high-bandwidth data that require processing. New soft and textile-embedded sensor formats are promising because they can be fitted to vehicle interiors, and because they can detect safety-relevant activity using body-contact data that is not as personally identifiable as video and audio streams.
Figure 1 illustrates a pressure sensor for tracking a driver's grip pressure in (a) a body-worn format, and (b) a vehicle surface format. Wearable sensors such as the glove in (a) are more practical for daily use than, for example, blood sampling to measure glucose [2] or cortisol levels, or neural implants to detect brain activity in animal studies. Such invasive sampling can validate conclusions drawn from proxy signals available at the body surface, but a sensor glove is better for daily wear from the user's viewpoint. However, for this grip-tracking application, the driver would have to modify their behavior to put gloves on, would need to keep the gloves charged, and would need to initiate wireless communication between the gloves and the vehicle. For those reasons, the steering wheel format in Figure 1b is more practical than gloves for grip tracking. Measurements of other signals that vary with driver stress levels, such as pulse rate, skin surface temperature, and skin conductance, often rely on skin-to-sensor contact that cannot be guaranteed on vehicle surfaces—even a steering wheel, if the driver is wearing mittens. In this physiological-sensing realm, wearables are unparalleled. Previous reviews have carefully considered wearable sensors for driving safety [3], wearable sensors for emotion recognition [4,5], and combinations of wearable and in-car sensors for detecting driver drowsiness [6,7] and distraction [7,8].
The current review covers the recent (since 2000) state of the literature on sensors that monitor driving behaviors, including emotions experienced while driving, and sensors designed for non-driving contexts that can detect emotional and physiological states applicable to transportation safety. We distinguish wearable sensors from vehicle surface-borne sensors, and consider where each of these sensor types may find the most practical application for monitoring driver behavior, identifying a general trend of wearable sensors for physiological measurements and surface-borne sensors for driver–vehicle interactions. We also provide a summary of emerging sensor technologies to study affective states, discuss concerns for their practicality in a driving situation, and their potential to contribute to future research on driver safety.

1.1. Affective States and Affect Detection

We provide a brief description of the literature's use of affective states and emotions in the context of affect detection, as well as recommendations for further reading in related research areas. The foundation of affective computing is informed by the theory of emotion. This field of research seeks to develop "computational systems that recognize and respond to the affective states (e.g., moods and emotions) of the user," as described by Calvo and D'Mello in their 2010 article [9], which gives a comprehensive review of the overlap between emotion research and affect detection. There is a rich history behind the definitions of emotion, emotional expression, and emotional experience [10,11]. Picard's work [12] instituting affective computing steers clear of defining emotion directly, instead defining emotional experience and moving on to affect detection. Picard (1995) uses "sentic state, emotional state, and affective state interchangeably. These refer to your dynamic state when you experience an emotion. All you consciously perceive in such a state is referred to as your emotional experience" [12]. Affect detection is possible by way of a person revealing their emotional expression through the motor system, or "sentic modulation" [12]. James individually [13], and later with Lange [14], provided a theory of emotion that links physiological changes in the sympathetic nervous system (SNS), a part of the autonomic nervous system (ANS), to emotional expressions [13,14,15]. Physiological-based affect detection leverages sensors to detect changes in a person's SNS and ANS. The James–Lange theory of emotion has been used in several studies of affective states: Calvo and D'Mello [9]; Ekman, Levenson, and Friesen [16]; Critchley et al. [17]; AlZoubi, D'Mello, and Calvo [18]; and Baker et al. [19]. For further reading, see also work by Smith and Lazarus [20], Darwin [21], and Dalgleish [22].
The specific definitions of emotions, and the methods and assessment tools used, vary from study to study. When given, we summarize the definition of an affective state, or the methods, used in a specific study. For Lazarus (1993), emotions include anger, anxiety, fright, sadness, and happiness, among others; this research also describes an overlap of stress and emotion [10]. Russell (2003) defines core affect to include a pleasure scale (happy, sad) as well as an arousal scale (fatigue, drowsiness, tense, alertness) [11]. The affective states of frustration, confusion, engaged concentration, delight, surprise, boredom, and neutral were examined by Baker et al. [19], and defined as follows:
Frustration was defined (for participants) as dissatisfaction or annoyance. Confusion was defined as a noticeable lack of understanding, whereas engaged concentration was a state of interest that results from involvement in an activity. Delight was defined as a high degree of satisfaction. Surprise was defined as wonder or amazement, especially from the unexpected. Boredom was defined as being weary or restless due to lack of interest. Participants were given the option of making a neutral judgment to indicate a lack of distinguishable affect. Neutral was defined as no apparent emotion or feeling.
In relation to driving, an exhaustive account of which emotions have the greatest influence is not yet available, as discussed in Section 6. To frame that exploration, Figure 2 takes Russell's Affective Circumplex [23] and marks areas that likely represent concerns for driver safety but have not yet been fully studied; we list some current studies linking affect and safe/unsafe driving behaviors in Table 1. Figure 2 should be considered a broad initial guideline for researchers considering the impact of emotions on driver behavior, not a definitive conclusion on which emotions are involved in driving and how they influence behavior. With that caveat, the circumplex can be used to guide which sensors are appropriate to consider when monitoring drivers, based on each sensor's history of studies relating it to certain affective information.
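As a concrete, minimal illustration of this use of the circumplex (our own sketch, not a method from the cited studies), the function below maps a sensor-derived valence–arousal estimate onto a circumplex quadrant; the quadrant labels and sign-based thresholds are illustrative assumptions.

    def circumplex_quadrant(valence, arousal):
        """Map a (valence, arousal) estimate, each in [-1, 1], onto a
        quadrant of Russell's circumplex [23]. Labels are illustrative."""
        if arousal >= 0:
            return "alert/excited" if valence >= 0 else "tense/angry"
        return "calm/content" if valence >= 0 else "bored/fatigued"

    # High arousal with negative valence falls in the region most often
    # linked to unsafe driving behaviors such as anger (Table 1).
    print(circumplex_quadrant(valence=-0.6, arousal=0.8))  # tense/angry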

1.2. Methods

This review used Google Scholar to identify recent (since 2000) literature on driving-relevant affect detection using sensors that are compatible with vehicle environments. Keywords searched include driver behavior, driving safety, sensors, soft sensors, wearable sensors, and affect detection. The authors excluded results that were not relevant to this review's focus or that were redundant with topics covered by the cited works. During the editorial process, reviewers suggested additional references. Ultimately, more than 110 references are cited for further reading.

2. Previous Work Relating Affect to Driver Behaviors/Physiological Signals

Endowing intelligent systems with an ability to understand implicit interaction cues, such as a person's intention, attitude, and likes and dislikes, creates more meaningful and natural interactions between the human operator and the intelligent system [24]. However, current technology cannot seamlessly interpret the emotions and affective states that convey implicit communication. Responding to emotions is integral to typical social interaction and expands the ways humans and technology communicate. Additionally, human–machine interaction (HMI) that relies solely on explicit commands forgoes the potential gain of implicit communication, which can be significant, as evidenced by experimental psychology [25]. Affective computing provides a possible solution to this problem. To establish affect-sensitive HMI, the role and potential of implicit communication is important [26]. By monitoring and reacting to the emotions or underlying signals of users, affective computing enhances interactions between humans and technology. Picard's book [27] established a springboard for affective computing. Recent advancements in this research area have moved toward wearables and other practical sensors, leveraged machine-learning analysis techniques, and expanded the range of application areas.
Aside from trait, personality, and other personal factors [28,29,30,31], traffic and environmental situations that contain certain appraisal factors (e.g., whether another driver was accountable) can lead to a driver's development and experience of emotions [32]. Several representative examples, although not exhaustive, are outlined in Table 1. Emotions and the accompanying attributions of traffic situations create a motivational tendency to show certain behaviors [33,34]. Such behaviors, if dangerous, may lead to negative consequences and compromise one's own safety and the safety of other road users [29,35,36]. For example, angry drivers tend to drive faster, commit more traffic violations, display hostile gestures, honk more frequently, and underestimate risky situations, as evidenced in questionnaire, simulator, and naturalistic driving studies [35,37,38,39]. These behaviors are considered aggressive and unsafe to other vehicles. Furthermore, individuals who scored higher on the Driving Anger Expression Index are 2.5 times more likely to damage their vehicles in anger and twice as likely to crash as individuals who scored lower on the index [40]. Other work [41] revealed that drowsiness had the largest impact on increased crash rates, more than other inattention scenarios, while stress has been linked to minor crashes [42].
The following briefly summarizes the methods or definitions used to study the affective states listed in Table 1. Roidl et al. [32] used the Driving Anger Scale to measure anger, the State Trait Anxiety Inventory to measure anxiety, and a modified Geneva Emotion Wheel to measure contempt and fright. Westerman and Haigney [43] examined the Driver Behaviour Inventory and the Driver Behaviour Questionnaire to study stress. Steinhauser et al. [44] studied happiness, calmness, and anger during driving through a combination of (1) asking participants to self-select and re-live a previously-experienced life event related to each emotion and (2) playing music related to each emotion, as validated by Jefferies et al. [45]. No further definitions of the affective states were given to participants. Philip et al. [46] used Grandjean's definition of fatigue [47] as "a gradual and cumulative process associated with a disinclination towards effort, eventually resulting in reduced performance efficiency." Lee et al. [48] collected physiological measures of drowsiness and measures on the Johns Drowsiness Scale.

2.1. Measuring Affect Based on Physiological Signals

Previous research has shown that physiological signals could classify affective states induced by on-road driving with 97% accuracy [49], with heart rate and skin conductivity having the highest correlations with driver stress. However, physiological signals are not appropriate indicators of emotion for every application. Respiration, for example, has been shown to be indicative of emotional states [50], but it is a slowly-changing signal that does not provide information in time to prevent a driving-related accident [51]; it may nevertheless provide insight into the relationship between driver emotional response and behavior. Tracking multiple physiological signals was judged a favorable approach in previous research [52,53,54], and should be examined in work that seeks to predict and respond to physiology-based changes in emotion.
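To make this concrete, the sketch below (a minimal illustration under stated assumptions, not the actual pipeline of Healey and Picard [49]) computes simple per-window features from synchronized heart-rate and skin-conductance streams and combines them into a heuristic stress score. The sampling rate, window length, feature set, and weights are illustrative assumptions rather than values from the cited work.

    import numpy as np

    def window_features(heart_rate, gsr, fs=16.0, win_s=60.0):
        """Cut synchronized heart-rate (bpm) and skin-conductance
        (microsiemens) streams into fixed windows and summarize each
        window with simple statistics. Inputs are 1-D numpy arrays
        sampled at fs Hz; fs and win_s are illustrative choices."""
        n = int(fs * win_s)
        feats = []
        for i in range(0, min(len(heart_rate), len(gsr)) - n + 1, n):
            hr, sc = heart_rate[i:i + n], gsr[i:i + n]
            feats.append([hr.mean(), hr.std(),                        # cardiac level and variability
                          sc.mean(), np.diff(sc).clip(min=0).sum()])  # tonic level and phasic rises
        return np.array(feats)

    def stress_score(feats):
        """Heuristic per-window score that weights the heart-rate and
        skin-conductance features most correlated with driver stress
        in [49]; the weights are assumptions, not fitted parameters."""
        z = (feats - feats.mean(axis=0)) / (feats.std(axis=0) + 1e-9)
        return z @ np.array([0.4, 0.1, 0.4, 0.1])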

2.2. Measuring Affect to Improve Driver–Vehicle Interactions

Aside from physiological signals, researchers have been examining the degree to which affect-sensitive driver interfaces can be used to infer and support a driver's affective state, safety, and comfort [55]. The causal association between emotion and performance has long been documented. Drivers who are stressed or angry are more likely to exhibit unsafe and dangerous behaviors and violations [29,56,57,58]. Since the driving task heavily involves integrating visual information and coordinating motor responses, researchers have been exploring the use of other senses for monitoring the driver's affective state. For example, a speech-based emotion recognition system with an adaptive noise cancellation technique that filters out ambient driving noise has shown promise in classifying positive, neutral, and negative emotions [59]. Nass and colleagues examined whether characteristics of a vehicle voice can influence a driver's affective state and driving performance [60]; the results showed that when the driver's emotion matched the vehicle voice emotion, drivers had fewer accidents, attended more to the road, and spoke more to the vehicle. A recent article emphasized the importance of using natural driver–car communication to understand a driver's affective state and needs as well as to provide a human-like assistance system [61]. This approach has the advantage of being adaptive to various driving situations, drivers' propensities and coping strategies, and the uncertainty of traffic behaviors [62]. Recent findings suggest that, in addition to matching the vehicle voice to the driver's affect, showing empathy via a voice assistant led to the largest improvement in negative emotions and was also positively perceived by angry and sad drivers [63].
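One way to picture such an affect-sensitive voice interface (a hypothetical sketch combining the emotion-matching effect of Nass et al. [60] with the empathetic strategy of Braun et al. [63]; the table entries are our assumptions, not designs from either study) is a policy table keyed by the detected driver emotion:

    # Illustrative policy table pairing the vehicle voice style with the
    # detected driver emotion; the entries are hypothetical examples.
    VOICE_STRATEGY = {
        "happy": "energetic voice, matched valence",
        "calm": "subdued voice, matched valence",
        "angry": "empathetic acknowledgement, calm pacing",
        "sad": "empathetic acknowledgement, warm tone",
    }

    def choose_voice_strategy(driver_emotion):
        """Fall back to a neutral voice when the detected emotion has
        no tailored strategy."""
        return VOICE_STRATEGY.get(driver_emotion, "neutral informative voice")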

3. Soft and Wearable Sensor Technologies Applicable to Monitoring Driver Behavior

This section reviews wearable sensors and soft surface-borne sensors that can measure some aspect of a driver’s affective state and provide data that could be used in the future to study possible improvements in driver safety. As discussed in the introduction, embedded and wearable sensors are practical formats for in-vehicle sensing. We divided the review into two branches: sensors that monitor largely-involuntary physiological signals, and sensors that monitor driver–vehicle interaction.

3.1. Sensor Technologies for Affect Detection Based on Physiological Signals

Physiological signals generated from the human body include brain electrical activity (electroencephalography, EEG), skin temperature, heart rate and other aspects of the heart’s electrical activity (electrocardiogram, ECG/EKG), eye blink rate, blood flow and oxygenation (SpO2), muscle current (electromyography, EMG), skin conductance changes due to sweating (galvanic skin response, GSR, or electrodermal activity, EDA), and respiration rate and volume. Such signals are usually involuntary, except in the sense that muscle signals and respiration events can sometimes originate from intentional body motions or speech. Physiological signals have previously been investigated for emotion recognition [4,64]. In the latter study [64], GSR, skin temperature, and heart rate were collected with an armband wearable sensor that the authors suggested could work with drivers.
Driving-specific studies that use wearable physiological sensors to investigate a safety-relevant emotional response include an early study that used wired GSR and heart rate sensors to measure stress in a street driving environment [65]. Even though vehicles introduce electronic and acoustic noise, the seated driver produced fewer motion artifacts in the GSR and heart rate data than in related studies on ambulatory subjects.
More recently, GSR, SpO2, respiration and ECG data were collected from drivers using wearable sensors with the goal of recognizing task difficulty-induced stress [66]. EMG sensors applied to subjects’ facial muscles detected facial expressions originating from anger in simulated driving tests [67]. Heart rate and skin conductance electrodes provided insight into stress in subjects taking a simulated driving test in a later study, with visible feedback on drivers’ stress levels provided by real-time data processing [68]. Like the Healey studies, these groups used stick-on ECG and EMG electrodes and other physiological sensors that attached directly to the body; data collection was wireless in newer reports. Such biomedical electrodes are useful for proof-of-concept studies and high-quality data collection for a fundamental understanding of the relationship between physiological signals and emotions experienced while driving.
However, armband [69,70] and eyeglass-based [71] sensors are more practical than ECG electrodes for widespread use, because they are fast to apply and may already be part of a driver's everyday routine. Researchers recently studied driver drowsiness using the infrared proximity sensor built into Google Glass eyeglasses, measuring blink rate with a thresholding algorithm [72], and determined that they were able to detect operator drowsiness.
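A thresholding blink detector of this kind can be sketched in a few lines (a minimal illustration in the spirit of [72]; the sampling rate and threshold are assumed values, not those reported in the study):

    def blink_events(proximity, fs=100.0, threshold=0.5):
        """Count blinks in an eyewear infrared proximity signal by
        thresholding: samples above the threshold are treated as
        eyelid closures. Returns (blink_count, mean_closure_s).
        fs and threshold are illustrative, not values from [72]."""
        blinks, durations, run = 0, [], 0
        for sample in proximity:
            if sample > threshold:
                run += 1                       # eyelid currently closed
            elif run:
                blinks += 1                    # closure just ended
                durations.append(run / fs)
                run = 0
        if run:                                # closure at end of trace
            blinks += 1
            durations.append(run / fs)
        mean_closure = sum(durations) / len(durations) if durations else 0.0
        return blinks, mean_closure

    # Falling blink rates and lengthening closure durations are common
    # drowsiness indicators that such a detector could expose.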
Softer, stretchier electronic and optical materials have emerged over the past 10 years, making it possible to collect physiological data from textile-like surfaces and even from skin-contacting conformal sensors. A recent overview of this fast-moving research area [73] described applications in healthcare, consumer electronics, and robotics. Although no driving-specific sensors were mentioned, physiological signals commonly used to detect stress (for example, pulse rate) are measurable with soft materials, and new emotion-relevant applications like wearable sweat quantification and analysis [74] are now possible thanks to skin-conforming materials.
These sensors are lighter, more breathable, and more comfortable than ECG electrodes, but most drivers are not yet accustomed to applying stickers or tape to their skin, and powering the devices is still an early-stage technology that uses radiofrequency (RF) power transmission or thin-film batteries. Therefore, researchers are also moving physiological sensors to vehicle interior surfaces. In-vehicle sensors have successfully monitored drivers' heart rates for detecting drowsiness [75], using electrically conductive fabric wrapped around the steering wheel. ECG electrodes on the steering wheel have also been studied for driver identification from biometric signatures [76,77]. Heart rate and respiratory rate sensors based on piezoresistive textiles have been embedded in vehicle seats [78]. Soft, surface-embedded sensors measured physiological data in a vehicle seat in road tests [79] where, beyond detecting driver stress, the authors suggested the passive seat and steering wheel ECG could improve safety by detecting underlying heart conditions. A problem they addressed that is not present in skin-adhesive sensors was drivers' failure to consistently grip the steering wheel; they proposed to fill data gaps with lower-resolution heart-rate data measured from redundant sensors in the vehicle seat. The missing hand itself could also indicate inattention, for example from texting or holding a cellphone. A recent study investigated the use of non-contact, capacitively coupled ECG (cECG) embedded in the back support of a driver's seat in a simulator to estimate driver fatigue [80]. Results indicated good correlation between conventional ECG and cECG signals, and cECG signal quality improved over time. Although this study included only male participants and one type of clothing, it demonstrated the feasibility of monitoring heart rate variability dynamics using non-contact, more practical ECG methods. Table 2 compares the above-listed physiological studies.
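The gap-filling strategy proposed in [79] can be illustrated with a simple per-sample fusion rule (a sketch under our own assumptions about how the two channels are sampled; the function name and data layout are hypothetical):

    def fused_heart_rate(wheel_hr, seat_hr, grip_ok):
        """Per-sample fusion of steering-wheel ECG heart rate with
        lower-resolution seat-sensor heart rate, after the gap-filling
        idea in [79]: prefer the wheel reading while grip contact is
        valid, otherwise fall back to the seat reading. Also returns
        the indices where grip was lost, since a missing hand may
        itself indicate inattention (e.g., holding a cellphone)."""
        fused, grip_loss = [], []
        for i, ok in enumerate(grip_ok):
            if ok and wheel_hr[i] is not None:
                fused.append(wheel_hr[i])
            else:
                fused.append(seat_hr[i])
                grip_loss.append(i)
        return fused, grip_loss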

3.2. Measuring User Activity Based on Driver–Vehicle Interaction

Besides involuntary physiological signals like those reviewed in Section 3.1, drivers interact with vehicle surfaces by gripping, tapping, leaning, and other hand or whole-body motions that give insight into their attention level and affective state. These body motions may be intentional, as in steering wheel motions made by a driver following a route, or unintentional, such as fidgeting. Safety is also improved if body position information helps plan airbag deployment during a collision.
Intentional motions for steering, braking, and acceleration are already collected by vehicle instrumentation, but body position is not. The following sensor technologies are able to capture body position and other driver–vehicle interactions based on proximity, pressure, and acceleration. Table 3 covers such emerging wearable and vehicle surface-borne sensor technologies for measuring driver–vehicle interactions. Previous studies in this category often focus on activity recognition rather than emotion recognition and, likely because these wearable and soft sensor materials are an emerging field, many of the papers summarized below and listed in Table 3 emphasize the new sensor technology itself rather than applications such as monitoring transportation activities. However, some recent papers do apply wearable and surface-borne user interaction sensors to driving. Researchers used accelerometer-equipped smart watches to track hand motion [83], making the connection to driver monitoring by correlating acceleration and gyroscope readings with non-steering secondary-task motions during road-driving tests. Another group developed soft piezoresistive fabric steering wheel sensors, not for heart rate measurement as described above, but for detecting grip pressure, location, and swiping gestures [84].
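As a rough sketch of the wrist-motion idea in [83] (not the authors' classifier; the window length and variance limit are illustrative assumptions), windows of wrist acceleration whose variance departs from smooth steering motion can be flagged as candidate secondary-task activity:

    import statistics

    def secondary_task_windows(accel_mag, fs=50.0, win_s=2.0, var_limit=4.0):
        """Flag windows of wrist acceleration magnitude (m/s^2) whose
        variance exceeds a limit, as a crude proxy for non-steering
        secondary-task motion; window length and limit are illustrative."""
        n = int(fs * win_s)
        flagged = []
        for i in range(0, len(accel_mag) - n + 1, n):
            if statistics.variance(accel_mag[i:i + n]) > var_limit:
                flagged.append((i / fs, (i + n) / fs))
        return flagged  # list of (start_s, end_s) intervals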
In contrast to resistive pressure sensors, which require direct contact, capacitive sensing can detect changes to electric fields extending above and around electrode surfaces. This feature makes capacitive sensing a good match for driver–vehicle interactions like head or torso position, where the driver is not contacting the surface at all times. Capacitive proximity sensing has been applied to vehicle seats for detecting driver posture and possible sudden braking [85], and researchers investigated its feasibility for measuring driver head position [86], which is an indicator of drowsiness and a critical input for active restraint systems during a crash.
Emerging soft technologies are already monitoring subtle body motions in non-driving contexts using skin-like wearable sensors. For example, a wearable capacitive sensor was demonstrated to detect restless leg motion [87]. Soft, deformable optical materials made it possible to measure shape changes in a leg-worn athletic tape caused by weight bearing [88], and hand motions in a glove [89] equipped with all-polymer strain sensors. Body motions like fidgeting and slouching are also visible in images; Fernández et al. [1] give a comprehensive overview of camera-based sensors for detecting motions relevant to driver fatigue and inattention. Soft optical sensors might complement or replace some types of image sensing, while their optical readouts might simplify wiring in vehicle applications where signals must be transmitted across a gap between moving parts. A recent study proposed a methodology to standardize the processing of camera-based sensing data while taking into account individual differences, randomness in driver behaviors, and driver head motion tracking [90]. Another recent wearable sensor used resistive textile sensors embedded in trousers to classify body postures [91]; such sensors can capture shifts in weight that are difficult to pick up on camera, and can collect information related to a wearer's focus, participation, and engagement without video recording. Surface-borne sensors have been contrasted with obtrusive video cameras for sensing human activity in smart environments [92].

In recent office- and home-environment studies, researchers integrated thin resistive and capacitive sensors into soft surfaces for human activity recognition. An electronic textile couch was equipped with capacitive presence sensors, with a focus on sleep apnea intervention [93]. In another smart furniture experiment, observers recorded engagement, laughter, speaking, and listening behaviors in seated subjects and correlated them with data from chairs fitted with resistive pressure-sensing pads [94]. These sensor formats are compatible with upholstered interior vehicle surfaces. The experiments generally determine pressure location by sampling a large array of sensor electrodes. Electrical impedance tomography, in contrast, can collect touch-location information with only a few (typically 8) electrodes using a scanning approach: touchpads were created by painting surfaces with conductive paint, on which consecutive resistance measurements were collected at pairs of electrodes along the edge of the conductive surface [95]. Electrical impedance tomography in a wrist-wearable format has also been used to classify hand gestures [96].
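The scanning approach behind such electrical impedance tomography touch sensing can be made concrete with the standard adjacent-drive, adjacent-measure pattern (a generic textbook pattern, not necessarily the exact protocol of [95,96]):

    def eit_scan_pattern(n_electrodes=8):
        """Enumerate the adjacent-drive, adjacent-measure pattern used in
        basic electrical impedance tomography: inject current across one
        neighboring electrode pair, record voltages across every other
        neighboring pair, then rotate the drive pair around the boundary."""
        frames = []
        for d in range(n_electrodes):
            drive = (d, (d + 1) % n_electrodes)
            for m in range(n_electrodes):
                meas = (m, (m + 1) % n_electrodes)
                if set(meas) & set(drive):
                    continue  # skip pairs that share a drive electrode
                frames.append((drive, meas))
        return frames

    # Eight electrodes yield 8 drive pairs x 5 valid measurement pairs
    # = 40 readings per frame; touch location is reconstructed from how
    # these readings deviate from a no-touch baseline.
    print(len(eit_scan_pattern()))  # 40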
The key advantage of vehicle surface-borne sensors is that they are transparent to the user. In contrast to wearable textile sensors or wearable electronics, where the user is acutely aware of the sensors, surface-borne sensors may collect body position or hand grip data without disturbing the user. Only recently has it become practical to measure body motions with these unobtrusive sensors instead of video or human-observation methods. The link between this kind of sensor data and drivers' affective states needs further clarification from the human–computer interaction community and comparison with questionnaires and physiological data.

4. Relationship between Driver Behavior and Roadway Safety

Making practical use of affect-sensitive sensor data to improve safety is a layered problem, with solutions at the technology layer, the behavior modification layer, and the policy layer. In a study that merged physiological data (ECG heart monitoring) and vehicle data (speed, acceleration, fuel consumption, and pedal position), researchers went beyond characterizing individual driver behavior, suggesting that locations where multiple drivers experienced stress could help "map potentially dangerous road segments and intersections" [97]. Such information on human factors can complement and help interpret speed and braking patterns already captured by road sensors and external surveillance cameras [98], as well as geometric road characteristics, such as curvature and elevation, collected through mobile phone sensor data analysis with an eye toward adjusting speed limits on rural roads with horizontal curves [99]. These examples suggest that real-time crash prevention is not the only goal for in-vehicle and wearable driving sensors. For instance, patterns of human stress reactions to specific traffic conditions could effectively distill years of human driving experience into safer algorithms for self-driving cars. The benefits of these advanced sensors can also extend to pedestrians and to passengers of vehicles without sensors, in the form of road repairs, warning signs, and traffic re-routing.
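A minimal sketch of the stress-mapping idea in [97] (our own illustration; the cell size and input format are assumptions) aggregates geotagged stress detections from many vehicles into grid cells so that frequently flagged road segments stand out:

    from collections import Counter

    def stress_map(events, cell_deg=0.001):
        """Aggregate geotagged stress detections from many drivers into
        grid cells; cells flagged by many events suggest road segments
        worth engineering attention. `events` is an iterable of
        (latitude, longitude) pairs; the ~100 m cell size is illustrative."""
        cells = Counter((round(lat / cell_deg), round(lon / cell_deg))
                        for lat, lon in events)
        return cells.most_common(10)  # the ten most stress-associated cells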
A well-integrated monitoring and assistance system is likely to maximize the intended safety benefits while minimizing barriers to adoption. From a user's perspective, having an intelligent system is only part of the solution; the other requirement is the user's acceptance, adoption, and cooperation. In their conceptual framework, Lee and See described how operators move from receiving information on a display to calibrating trust in automation and developing reliance on it, and how this process is influenced by individual, organizational, cultural, and environmental contexts [100]. For affect-sensitive driver interfaces to efficiently monitor and support drivers, the sensors, technologies, interfaces, users, and operating environment (the vehicle itself and supportive infrastructure) should be designed jointly, as one whole system [101].

5. At-Risk Example Population: Drivers with Autism Spectrum Disorders

The literature shows that sensors used for affect detection do provide information relevant to monitoring driver behavior. However, most studies do not include drivers with autism. We cover this population as an example of an at-risk group of drivers with special needs. Specifically, individuals with autism have deficits in accurately expressing explicit cues of affect, making forms of affect detection that rely on generalized facial expressions or neurotypical vocal tones less reliable. Individuals with autism spectrum disorders (ASD) are not devoid of affective expressions [102,103,104], but their own understanding of those expressions, and interpretation by others, is limited [105,106]. For example, a person with ASD might smile when actually in pain. Furthermore, facial feature interpretation can be challenging because of their difficulties in displaying the expected range of facial expressions [107,108], or because expressions mismatch their vocal tone. Therefore, monitoring another communication signal can be very informative for detecting changes in emotions in this population. An affect-sensitive system that can interpret the changing emotions of a driver and react with useful and appropriate feedback could be transformative. This at-risk population may require more training hours than neurotypical individuals before driving skills are acquired at a safe level. As discussed in the future directions, driving simulators are a natural platform for developing this intervention infrastructure.
Autism rates are growing, and the challenges autism presents to daily life abound. Research suggests prevalence rates of autism have increased in the last four decades from 1 in 10,000 to an estimated 1 in 68 children and 1 in 42 boys, based on the latest CDC report [109]. Individuals with autism are characterized by having difficulties with social interaction and communication, and a tendency to fixate on limited interests and repetitive behaviors [110]. The symptoms can range in degree from mild to severe, which is why autism is a spectrum disorder and generally described as autism spectrum disorders, or ASD. Even though there is increasing research in technology-assisted autism intervention, there is a paucity of published studies that specifically address how to automatically detect and respond to affective states of individuals with ASD. Such ability could be critical given the importance of human affective information in human–technology interaction [27,111] and the significant impacts of the affective factors of children with ASD on the intervention practice [112,113,114].
People with autism do have changing physiological signals that indicate reactions to their experiences [102,103,104]. Detecting subtle markers of changes in emotions is important in autism therapies. Trained therapists make their best interpretations but could be further assisted by advancements in affective computing. Previous work demonstrated that affect-sensitive closed-loop human–robot interaction improved performance and enhanced enjoyment for a small group of children with ASD [115]. Advancements in sensors and in the interpretation of signals between drivers with ASD and technology are needed. An intelligent driving simulator that can detect the affective states of a person with ASD and interact with them based on that perception could have a wide range of potential impacts. A clinician could use the history of the person's affective information to analyze the effects of the intervention approach. With a record of the activities and the consequent emotional changes in the person with ASD, a driver training system could learn individual preferences and affective characteristics over time and thus alter the manner in which it responds to the needs of different drivers with ASD.

6. Conclusions and Future Directions

For a full understanding of driver behavior and its relationship to safety, sensors must capture both unintentional physiological signals correlated with fatigue/stress/affective states and voluntary interaction signals (for example, steering, braking, gripping) arising from the driver's response to those states. The general pattern that emerged from our literature review in Section 3 is that wearable, skin-contacting sensors are a practical means of capturing unintentional physiological signals. Surface-borne sensors are more practical than wearables, as discussed in the glove-vs.-wheel example of Figure 1, but are more difficult to use for physiological sensing than for user activity recognition, because unreliable skin contact adds noise to most physiological signals. Perhaps for this reason, wearables dominated the physiological sensors reviewed in Table 2, which was limited to driving applications only. Meanwhile, driver–vehicle interaction sensing can be successful with either a wearable or a surface-borne approach: Table 3, our review of sensors that capture user-interaction signals, had a relatively even split between wearable and surface-borne sensors for detecting driver–vehicle interaction. Since those studies were so recent, we did not narrow the applications to a driving context. As these emerging user-interaction sensor technologies mature, the practical advantages of in-vehicle surface sensors may give them an edge over wearables.
Figure 3 puts our review of affect detection (Table 1), physiological sensors (Table 2), and user-interaction sensors (Table 3) in context with the larger picture of driver behavior. A third category of sensors, vehicle data sensors, refers to braking, steering, acceleration and other mechanical signals available from vehicle computers. This sensor layer in the second row of the diagram is the link between driver behavior and possible safety interventions.
This review suggests utilizing implicit communication by analyzing affective information gathered from a person's physiological data during affect-sensitive interactions with an intelligent system. The intelligent system, such as an advanced driver training system, would take in processed physiological signals and apply an affective model that maps the signals to an affective state. The system would then make decisions about altering the interaction to respond appropriately to that state. In effect, the intelligent system tries to emulate the human ability to detect, interpret, and influence affective states. Although such systems will not be able to precisely determine a user's internal motivations, the information can be used as feedback to improve HMI and skill learning. Closed-loop interaction becomes achievable after an open-loop analysis stage that processes the signals into feature samples and builds affective models relating those samples to affective states. The current climate of high acceptance of wearable electronics in daily life, data-driven solutions, and demand for more communication between humans and machines is ripe for advancements in affective computing. Teenagers may be willing adopters of such technology and could be the first generation to witness the future fruits of affective computing experiments, implemented on common computing devices during closed-loop interactions in everyday life.
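The closed-loop interaction described above can be summarized by a simple control skeleton (a conceptual sketch; all four callables are hypothetical placeholders for components produced by the open-loop analysis stage):

    import time

    def closed_loop(read_sensors, extract_features, affect_model, respond,
                    period_s=1.0, steps=10):
        """Minimal closed-loop skeleton: sample sensors, map features to
        an affective state with a pretrained model, and let the system
        adapt its interaction. All four callables are hypothetical
        placeholders for components built during open-loop analysis."""
        for _ in range(steps):
            signals = read_sensors()
            state = affect_model(extract_features(signals))
            respond(state)   # e.g., adjust feedback, alarms, or task difficulty
            time.sleep(period_s)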
Conducting a comprehensive study of physiological signal analysis during driving situations, with an open-ended, broad list of emotions, would be a useful next step. Such a study could be modeled after AlZoubi, D'Mello, and Calvo's exploration of computer-based learning situations [18]. This previous work collected data on which emotions are likely to occur in a learning activity. Participants were 27 adults who completed a learning module on a computer, then watched their 45-min session again and labeled every 20 s of the experience. The affective states they could choose from included boredom, confusion, curiosity, delight, flow/engagement, frustration, surprise, neutral (no affect), and an "other" category. This research provided important information on which emotions are most likely to be experienced in a learning situation. A similar study centered on a driving task would be of great benefit for quantifying which emotions are more prevalent while driving. These insights could then guide the deployment of sensors and the integration of unintentional and intentional signals to support driver monitoring, assistance, and intervention.
Additionally, future research should systematically compare the feasibility and efficacy of emerging surface-borne and other practical sensors in a driving context and investigate the potential for monitoring a driver’s affective state and implications for training and interventions. This work needs to be conducted in simulators as well as naturalistically with a focus on improving safety and well-being of the drivers. Practical applications should be envisioned beyond real-time intervention in individual vehicles. Large-scale statistics on drivers’ affective states in response to common driving situations could offer valuable training insights for driverless vehicle algorithms. Mapping stress and distraction could suggest better design rules for future roadways. Affective mapping might also pinpoint where to spend transportation funds on roadway modifications that improve safety not only for individual sensor-equipped vehicles, but also for cyclists, pedestrians, and drivers with autism who do not outwardly express affective states in the same manner as the majority of drivers. As we pointed out earlier, clinically-disadvantaged populations, such as individuals with autism spectrum disorders [52,104,108], can especially benefit from tailored in-vehicle intelligent systems that monitor vehicle control behaviors and the underlying physiological states. Practical wearable and surface-borne sensors coupled with already available vehicle data sensors provide the means to connect affect detection to driver, vehicle, and transportation system interventions.

Author Contributions

Writing—original draft, K.C.W., C.H., and Y.-C.L.

Funding

This research was partially supported by the National Science Foundation award #1653624. Publication of this article was funded in part by the George Mason University Libraries Open Access Publishing Fund.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Fernández, A.; Usamentiaga, R.; Carús, J.L.; Casado, R. Driver Distraction Using Visual-Based Sensors and Algorithms. Sensors 2016, 16, 1805.
  2. Merickel, J.; High, R.; Smith, L.; Wichman, C.; Frankel, E.; Smits, K.; Drincic, A.; Desouza, C.; Gunaratne, P.; Ebe, K.; et al. Driving Safety and Real-Time Glucose Monitoring in Insulin-Dependent Diabetes. Int. J. Automot. Eng. 2019, 10, 34–40.
  3. Liu, L.; Karatas, C.; Li, H.; Tan, S.; Gruteser, M.; Yang, J.; Chen, Y.; Martin, R.P. Toward Detection of Unsafe Driving with Wearables. In Proceedings of the 2015 Workshop on Wearable Systems and Applications, Florence, Italy, 19–22 May 2015; ACM: New York, NY, USA, 2015; pp. 27–32.
  4. Lisetti, C.L.; Nasoz, F. Using Noninvasive Wearable Computers to Recognize Human Emotions from Physiological Signals. Eurasip J. Appl. Signal Process. 2004, 1672–1687.
  5. Shu, L.; Xie, J.; Yang, M.; Li, Z.; Li, Z.; Liao, D.; Xu, X.; Yang, X. A Review of Emotion Recognition Using Physiological Signals. Sensors 2018, 18, 2074.
  6. Sahayadhas, A.; Sundaraj, K.; Murugappan, M. Detecting driver drowsiness based on sensors: A review. Sensors 2012, 12, 16937–16953.
  7. Kang, H.-B. Various approaches for driver and driving behavior monitoring: A review. In Proceedings of the 2013 IEEE International Conference on Computer Vision Workshops, Sydney, Australia, 1–8 December 2013; pp. 616–623.
  8. Lechner, G.; Fellmann, M.; Festl, A.; Kaiser, C.; Kalayci, T.E.; Spitzer, M.; Stocker, A. A Lightweight Framework for Multi-device Integration and Multi-sensor Fusion to Explore Driver Distraction. In Proceedings of the Advanced Information Systems Engineering, Rome, Italy, 3–7 June 2019; pp. 80–95.
  9. Calvo, R.A.; D’Mello, S. Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications. IEEE Trans. Affect. Comput. 2010, 1, 18–37.
  10. Lazarus, R.S. From psychological stress to the emotions: A history of changing outlooks. Annu. Rev. Psychol. 1993, 44, 1–21.
  11. Russell, J.A. Core affect and the psychological construction of emotion. Psychol. Rev. 2003, 110, 145–172.
  12. Picard, R.W. Affective Computing; MIT Press: Cambridge, MA, USA, 1995; M.I.T Media Laboratory Perceptual Computing Section Technical Report No. 321. Available online: https://affect.media.mit.edu/pdfs/95.picard.pdf (accessed on 19 September 2019).
  13. James, W. What is an Emotion? Mind 1884, 9, 188–205.
  14. Lange, C.G.; James, W. The Emotions; Williams & Wilkins: Philadelphia, PA, USA, 1922.
  15. Lang, P.J. The varieties of emotional experience: A meditation on James–Lange theory. Psychol. Rev. 1994, 101, 211–221.
  16. Ekman, P.; Levenson, R.W.; Friesen, W.V. Autonomic nervous system activity distinguishes among emotions. Science 1983, 221, 1208–1210.
  17. Critchley, H.D.; Rotshtein, P.; Nagai, Y.; O’Doherty, J.; Mathias, C.J.; Dolan, R.J. Activity in the human brain predicting differential heart rate responses to emotional facial expressions. Neuroimage 2005, 24, 751–762.
  18. AlZoubi, O.; D’Mello, S.K.; Calvo, R.A. Detecting Naturalistic Expressions of Nonbasic Affect Using Physiological Signals. IEEE Trans. Affect. Comput. 2012, 3, 298–310.
  19. Baker, R.S.; D’Mello, S.K.; Rodrigo, M.M.T.; Graesser, A.C. Better to be frustrated than bored: The incidence, persistence, and impact of learners’ cognitive–affective states during interactions with three different computer-based learning environments. Int. J. Hum. Comput. Stud. 2010, 68, 223–241.
  20. Smith, C.A.; Lazarus, R.S. Emotion and adaptation. Handb. Personal. Theory Res. 1990, 609–637.
  21. Darwin, C. The Expression of the Emotions in Man and Animals; Oxford University Press: Oxford, UK, 1872.
  22. Dalgleish, T. The emotional brain. Nat. Rev. Neurosci. 2004, 5, 583–589.
  23. Russell, J.A. A circumplex model of affect. J. Pers. Soc. Psychol. 1980, 39, 1161–1178.
  24. Reeves, B.; Nass, C.I. The Media Equation: How People Treat Computers, Television, and New Media Like Real People and Places; Cambridge University Press: Cambridge, UK, 1996.
  25. Bradley, M.M.; Lang, P.J. Measuring emotion: Behavior, feeling, and physiology. Cogn. Neurosci. Emot. 2000, 431, 242–276.
  26. Cowie, R. Emotion-Oriented Computing: State of the Art and Key Challenges. Humaine Network of Excellence 2005. Available online: https://pdfs.semanticscholar.org/a393/9cfe4757380cde173aba0e6b558ddb60884f.pdf (accessed on 27 August 2019).
  27. Picard, R.W. Affective Computing; MIT Press: Cambridge, MA, USA, 2000; ISBN 9780262661157.
  28. Bogdan, S.R.; Măirean, C.; Havârneanu, C.-E. A meta-analysis of the association between anger and aggressive driving. Transp. Res. Part F Traffic Psychol. Behav. 2016, 42, 350–364.
  29. Deffenbacher, J.L.; Lynch, R.S.; Filetti, L.B.; Dahlen, E.R.; Oetting, E.R. Anger, aggression, risky behavior, and crash-related outcomes in three groups of drivers. Behav. Res. Ther. 2003, 41, 333–349.
  30. Lajunen, T.; Parker, D. Are aggressive people aggressive drivers? A study of the relationship between self-reported general aggressiveness, driver anger and aggressive driving. Accid. Anal. Prev. 2001, 33, 243–255.
  31. Underwood, G. Traffic and Transport Psychology: Theory and Application; Elsevier: Amsterdam, The Netherlands, 2005; ISBN 9780080550794.
  32. Roidl, E.; Frehse, B.; Höger, R. Emotional states of drivers and the impact on speed, acceleration and traffic violations—A simulator study. Accid. Anal. Prev. 2014, 70, 282–292.
  33. Kuppens, P.; Van Mechelen, I.; Smits, D.J.M.; De Boeck, P.; Ceulemans, E. Individual differences in patterns of appraisal and anger experience. Cogn. Emot. 2007, 21, 689–713.
  34. Scherer, K.R.; Schorr, A.; Johnstone, T. Appraisal Processes in Emotion: Theory, Methods, Research; Oxford University Press: Oxford, UK, 2001; ISBN 9780195351545.
  35. Mesken, J.; Hagenzieker, M.P.; Rothengatter, T.; de Waard, D. Frequency, determinants, and consequences of different drivers’ emotions: An on-the-road study using self-reports, (observed) behaviour, and physiology. Transp. Res. Part F Traffic Psychol. Behav. 2007, 10, 458–475.
  36. Shinar, D.; Compton, R. Aggressive driving: An observational study of driver, vehicle, and situational variables. Accid. Anal. Prev. 2004, 36, 429–437.
  37. Li, S.; Zhang, T.; Sawyer, B.D.; Zhang, W.; Hancock, P.A. Angry Drivers Take Risky Decisions: Evidence from Neurophysiological Assessment. Int. J. Environ. Res. Public Health 2019, 16, 1701.
  38. Precht, L.; Keinath, A.; Krems, J.F. Effects of driving anger on driver behavior–Results from naturalistic driving data. Transp. Res. Part F Traffic Psychol. Behav. 2017, 45, 75–92.
  39. Zhang, T.; Chan, A.H.S. The association between driving anger and driving outcomes: A meta-analysis of evidence from the past twenty years. Accid. Anal. Prev. 2016, 90, 50–62.
  40. Deffenbacher, J.L.; Stephens, A.N.; Sullman, M.J.M. Driving anger as a psychological construct: Twenty years of research using the Driving Anger Scale. Transp. Res. Part F Traffic Psychol. Behav. 2016, 42, 236–247.
  41. Klauer, S.G.; Dingus, T.A.; Neale, V.L.; Sudweeks, J.D.; Ramsey, D.J. The Impact of Driver Inattention on Near-Crash/Crash Risk: An Analysis Using the 100-Car Naturalistic Driving Study Data; Report No. DOT HS 810 594; National Highway Traffic Safety Administration: Washington, DC, USA, 2006.
  42. Matthews, G.; Dorn, L.; Ian Glendon, A. Personality correlates of driver stress. Pers. Individ. Dif. 1991, 12, 535–549.
  43. Westerman, S.J.; Haigney, D. Individual differences in driver stress, error and violation. Pers. Individ. Dif. 2000, 29, 981–998.
  44. Steinhauser, K.; Leist, F.; Maier, K.; Michel, V.; Pärsch, N.; Rigley, P.; Wurm, F.; Steinhauser, M. Effects of emotions on driving behavior. Transp. Res. Part F Traffic Psychol. Behav. 2018, 59, 150–163.
  45. Jefferies, L.N.; Smilek, D.; Eich, E.; Enns, J.T. Emotional valence and arousal interact in attentional control. Psychol. Sci. 2008, 19, 290–295.
  46. Philip, P.; Sagaspe, P.; Moore, N.; Taillard, J.; Charles, A.; Guilleminault, C.; Bioulac, B. Fatigue, sleep restriction and driving performance. Accid. Anal. Prev. 2005, 37, 473–478.
  47. Grandjean, E. Fatigue in industry. Br. J. Ind. Med. 1979, 36, 175–186.
  48. Lee, M.L.; Howard, M.E.; Horrey, W.J.; Liang, Y.; Anderson, C.; Shreeve, M.S.; O’Brien, C.S.; Czeisler, C.A. High risk of near-crash driving events following night-shift work. Proc. Natl. Acad. Sci. USA 2016, 113, 176–181.
  49. Healey, J.; Picard, R.W. Detecting stress during real-world driving tasks using physiological sensors. IEEE Trans. Intell. Transp. Syst. 2005, 6, 156–166.
  50. Picard, R.W.; Vyzas, E.; Healey, J. Toward machine emotional intelligence: Analysis of affective physiological state. IEEE Trans. Pattern Anal. Mach. Intell. 2001, 23, 1175–1191.
  51. Itoh, K.; Miwa, H.; Nukariya, Y.; Zecca, M.; Takanobu, H.; Roccella, S.; Carrozza, M.C.; Dario, P.; Takanishi, A. Development of a Bioinstrumentation System in the Interaction between a Human and a Robot. In Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, 9–15 October 2006; pp. 2620–2625.
  52. Liu, C.; Conn, K.; Sarkar, N.; Stone, W. Physiology-based affect recognition for computer-assisted intervention of children with Autism Spectrum Disorder. Int. J. Hum. Comput. Stud. 2008, 66, 662–677.
  53. Bethel, C.L.; Salomon, K.; Murphy, R.R.; Burke, J.L. Survey of Psychophysiology Measurements Applied to Human-Robot Interaction. In Proceedings of the RO-MAN 2007—The 16th IEEE International Symposium on Robot and Human Interactive Communication, Jeju, Korea, 26–29 August 2007; pp. 732–737.
  54. Kreibig, S.D. Autonomic nervous system activity in emotion: A review. Biol. Psychol. 2010, 84, 394–421.
  55. Ho, C.; Spence, C. Affective multisensory driver interface design. Int. J. Veh. Noise Vib. 2013, 9, 61–74.
  56. Cunningham, M.L.; Regan, M.A. The impact of emotion, life stress and mental health issues on driving performance and safety. Road Transp. Res. A J. Aust. New Zealand Res. Pract. 2016, 25, 40.
  57. Emo, A.K.; Matthews, G.; Funke, G.J. The slow and the furious: Anger, stress and risky passing in simulated traffic congestion. Transp. Res. Part F Traffic Psychol. Behav. 2016, 42, 1–14.
  58. Maxwell, J.P.; Grant, S.; Lipkin, S. Further validation of the propensity for angry driving scale in British drivers. Pers. Individ. Dif. 2005, 38, 213–224.
  59. Tawari, A.; Trivedi, M. Speech based emotion classification framework for driver assistance system. In Proceedings of the 2010 IEEE Intelligent Vehicles Symposium, La Jolla, CA, USA, 21–24 June 2010; pp. 174–178.
  60. Nass, C.; Jonsson, I.-M.; Harris, H.; Reaves, B.; Endo, J.; Brave, S.; Takayama, L. Improving Automotive Safety by Pairing Driver Emotion and Car Voice Emotion. In Proceedings of the CHI ’05 Extended Abstracts on Human Factors in Computing Systems, Portland, OR, USA, 2–7 April 2005; ACM: New York, NY, USA, 2005; pp. 1973–1976.
  61. Eyben, F.; Wöllmer, M.; Poitschke, T.; Schuller, B.; Blaschke, C.; Färber, B.; Nguyen-Thien, N. Emotion on the Road—Necessity, Acceptance, and Feasibility of Affective Computing in the Car. Adv. Hum.-Comput. Interact. 2010, 2010.
  62. Braun, M.; Pfleging, B.; Alt, F. A Survey to Understand Emotional Situations on the Road and What They Mean for Affective Automotive UIs. Multimodal Technol. Interact. 2018, 2, 75.
  63. Braun, M.; Schubert, J.; Pfleging, B.; Alt, F. Improving Driver Emotions with Affective Strategies. Multimodal Technol. Interact. 2019, 3, 21.
  64. Wioleta, S. Using physiological signals for emotion recognition. In Proceedings of the 2013 6th International Conference on Human System Interactions (HSI), Sopot, Poland, 6–8 June 2013; pp. 556–561.
  65. Healey, J.A. Wearable and Automotive Systems for Affect Recognition from Physiology. Ph.D. Thesis, Massachusetts Institute of Technology, Cambridge, MA, USA, 2000.
  66. Ranjan Singh, R.; Banerjee, R. Multi-parametric analysis of sensory data collected from automotive drivers for building a safety-critical wearable computing system. In Proceedings of the 2010 2nd International Conference on Computer Engineering and Technology, Chengdu, China, 16–18 April 2010; Volume 1, pp. V1-355–V1-360.
  67. Saikalis, C.T.; Lee, Y.-C. An investigation of measuring driver anger with electromyography. In Proceedings of the 10th International Driving Symposium on Human Factors in Driver Assessment, Training and Vehicle Design, Santa Fe, NM, USA, 24–27 June 2019.
  68. MacLean, D.; Roseway, A.; Czerwinski, M. MoodWings: A Wearable Biofeedback Device for Real-time Stress Intervention. In Proceedings of the 6th International Conference on PErvasive Technologies Related to Assistive Environments, Rhodes Island, Greece, 29–31 May 2013; ACM: New York, NY, USA, 2013; pp. 66:1–66:8.
  69. Nasoz, F.; Lisetti, C.L.; Vasilakos, A.V. Affectively intelligent and adaptive car interfaces. Inf. Sci. 2010, 180, 3817–3836.
  70. Welch, K.C.; Kulkarni, A.S.; Jimenez, A.M. Wearable sensing devices for human-machine interaction systems. In Proceedings of the United States National Committee of URSI National Radio Science Meeting (USNC-URSI NRSM), Boulder, CO, USA, 4–7 January 2018; pp. 1–2.
  71. Saadatzi, M.N.; Tafazzoli, F.; Welch, K.C.; Graham, J.H. EmotiGO: Bluetooth-enabled eyewear for unobtrusive physiology-based emotion recognition. In Proceedings of the 2016 IEEE International Conference on Automation Science and Engineering (CASE), Fort Worth, TX, USA, 21–24 August 2016; pp. 903–909.
  72. He, J.; Choi, W.; Yang, Y.; Lu, J.; Wu, X.; Peng, K. Detection of driver drowsiness using wearable devices: A feasibility study of the proximity sensor. Appl. Ergon. 2017, 65, 473–480.
  73. Yeo, J.C.; Lim, C.T. Emerging flexible and wearable physical sensing platforms for healthcare and biomedical applications. Microsyst. Nanoeng. 2016, 2, 16043.
  74. Huang, X.; Liu, Y.; Chen, K.; Shin, W.-J.; Lu, C.-J.; Kong, G.-W.; Patnaik, D.; Lee, S.-H.; Cortes, J.F.; Rogers, J.A. Stretchable, wireless sensors and functional substrates for epidermal characterization of sweat. Small 2014, 10, 3083–3090.
75. Yu, X. Real-Time Nonintrusive Detection of Driver Drowsiness; Technical Report CTS 09-15; University of Minnesota Center for Transportation Studies: Minneapolis, MN, USA, 2009. [Google Scholar]
  76. Lourenço, A.; Alves, A.P.; Carreiras, C.; Duarte, R.P.; Fred, A. CardioWheel: ECG Biometrics on the Steering Wheel. In Proceedings of the Joint European Conference on Machine Learning and Knowledge Discovery in Databases, Porto, Portugal, 7–11 September 2015; pp. 267–270. [Google Scholar]
  77. Pinto, J.R.; Cardoso, J.S.; Lourenço, A.; Carreiras, C. Towards a Continuous Biometric System Based on ECG Signals Acquired on the Steering Wheel. Sensors 2017, 17, 2228. [Google Scholar] [CrossRef] [PubMed]
  78. Granado, M.R.; Anta, L.M.G. Textile Piezoresistive Sensor and Heartbeat and/or Respiratory Rate Detection System. U.S. Patent 15/022,306, 29 September 2016. [Google Scholar]
  79. Baek, H.J.; Lee, H.B.; Kim, J.S.; Choi, J.M.; Kim, K.K.; Park, K.S. Nonintrusive biological signal monitoring in a car to evaluate a driver’s stress and health state. Telemed. J. e-Health 2009, 15, 182–189. [Google Scholar] [CrossRef]
  80. Bhardwaj, R.; Balasubramanian, V. Viability of Cardiac Parameters Measured Unobtrusively Using Capacitive Coupled Electrocardiography (cECG) to Estimate Driver Performance. IEEE Sens. J. 2019, 19, 4321–4330. [Google Scholar] [CrossRef]
  81. Oliveira, L.; Cardoso, J.S.; Lourenço, A.; Ahlström, C. Driver drowsiness detection: A comparison between intrusive and non-intrusive signal acquisition methods. In Proceedings of the 2018 7th European Workshop on Visual Information Processing (EUVIP), Tampere, Finland, 26–28 November 2018; pp. 1–6. [Google Scholar]
82. Hassib, M.; Braun, M.; Pfleging, B.; Alt, F. Detecting and Influencing Driver Emotions using Psycho-physiological Sensors and Ambient Light. In Proceedings of the 17th IFIP TC13 International Conference on Human-Computer Interaction (INTERACT 2019), Paphos, Cyprus, 2–6 September 2019. [Google Scholar]
  83. Bi, C.; Huang, J.; Xing, G.; Jiang, L.; Liu, X.; Chen, M. SafeWatch: A Wearable Hand Motion Tracking System for Improving Driving Safety. In Proceedings of the Second International Conference on Internet-of-Things Design and Implementation, Pittsburgh, PA, USA, 18–21 April 2017; ACM: New York, NY, USA, 2017; pp. 223–232. [Google Scholar]
  84. McMillen, K.A.; Lacy, C.; Allen, B.; Lobedan, K.; Wille, G. Sensor Systems Integrated with Steering Wheels. U.S. Patent 9,827,996, 28 November 2017. [Google Scholar]
  85. Durgam, K.K.; Sundaram, G.A. Behavior of 3-axis conformal proximity sensor arrays for restraint-free, in-vehicle, deployable safety assistance. J. Intell. Fuzzy Syst. 2019, 36, 2085–2094. [Google Scholar] [CrossRef]
  86. Ziraknejad, N.; Lawrence, P.D.; Romilly, D.P. Vehicle Occupant Head Position Quantification Using an Array of Capacitive Proximity Sensors. IEEE Trans. Veh. Technol. 2015, 64, 2274–2287. [Google Scholar] [CrossRef]
  87. Pouryazdan, A.; Prance, R.J.; Prance, H.; Roggen, D. Wearable Electric Potential Sensing: A New Modality Sensing Hair Touch and Restless Leg Movement. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct, Heidelberg, Germany, 12–16 September 2016; ACM: New York, NY, USA, 2016; pp. 846–850. [Google Scholar]
  88. Harnett, C.K.; Zhao, H.; Shepherd, R.F. Stretchable Optical Fibers: Threads for Strain-Sensitive Textiles. Adv. Mater. Technol. 2017, 2, 1700087. [Google Scholar] [CrossRef]
  89. Leber, A.; Cholst, B.; Sandt, J.; Vogel, N.; Kolle, M. Stretchable Thermoplastic Elastomer Optical Fibers for Sensing of Extreme Deformations. Adv. Funct. Mater. 2018, 29, 1802629. [Google Scholar] [CrossRef]
  90. Tian, R.; Ruan, K.; Li, L.; Le, J.; Greenberg, J.; Barbat, S. Standardized evaluation of camera-based driver state monitoring systems. IEEE/CAA J. Autom. Sin. 2019, 6, 716–732. [Google Scholar] [CrossRef]
  91. Skach, S.; Stewart, R.; Healey, P.G.T. Smart Arse: Posture Classification with Textile Sensors in Trousers. In Proceedings of the 20th ACM International Conference on Multimodal Interaction, Boulder, CO, USA, 16–20 October 2018; ACM: New York, NY, USA, 2018; pp. 116–124. [Google Scholar]
  92. Braun, A.; Rus, S.; Majewski, M. Invisible Human Sensing in Smart Living Environments Using Capacitive Sensors. In Ambient Assisted Living: 9. AAL-Kongress, Frankfurt/M, Germany, 20–21 April 2016; Wichert, R., Mand, B., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 43–53. ISBN 9783319523224. [Google Scholar]
  93. Rus, S.; Braun, A.; Kuijper, A. E-Textile Couch: Towards Smart Garments Integrated Furniture. In Proceedings of the Ambient Intelligence, Porto, Portugal, 21–23 June 2017; pp. 214–224. [Google Scholar]
94. Skach, S.; Healey, P.G.T.; Stewart, R. Talking Through Your Arse: Sensing Conversation with Seat Covers. In Proceedings of the 39th Annual Meeting of the Cognitive Science Society, CogSci 2017, London, UK, 26–29 July 2017. [Google Scholar]
  95. Zhang, Y.; Laput, G.; Harrison, C. Electrick: Low-Cost Touch Sensing Using Electric Field Tomography. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; ACM: New York, NY, USA, 2017; pp. 1–14. [Google Scholar]
  96. Zhang, Y.; Xiao, R.; Harrison, C. Advancing Hand Gesture Recognition with High Resolution Electrical Impedance Tomography. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology, Tokyo, Japan, 16–19 October 2016; ACM: New York, NY, USA, 2016; pp. 843–850. [Google Scholar]
  97. Rodrigues, J.G.P.; Vieira, F.; Vinhoza, T.T.V.; Barros, J.; Cunha, J.P.S. A non-intrusive multi-sensor system for characterizing driver behavior. In Proceedings of the 13th International IEEE Conference on Intelligent Transportation Systems, Funchal, Portugal, 19–22 September 2010; pp. 1620–1624. [Google Scholar]
  98. Mimbela, L.E.Y.; Klein, L.A. Summary of Vehicle Detection and Surveillance Technologies Used in Intelligent Transportation Systems; Technical Report; Joint Program Office for Intelligent Transportation Systems: Washington, DC, USA, 2007. Available online: https://www.fhwa.dot.gov/policyinformation/pubs/vdstits2007/vdstits2007.pdf (accessed on 27 August 2019).
99. Wood, J.S.; Zhang, S. Identification and Calculation of Horizontal Curves for Low-Volume Roadways Using Smartphone Sensors. Transp. Res. Rec. 2018, 2672, 1–10. [Google Scholar] [CrossRef]
  100. Lee, J.D.; See, K.A. Trust in automation: Designing for appropriate reliance. Hum. Factors 2004, 46, 50–80. [Google Scholar] [CrossRef] [PubMed]
  101. Ghazizadeh, M.; Lee, J.D.; Boyle, L.N. Extending the Technology Acceptance Model to assess automation. Cogn. Technol. Work 2012, 14, 39–49. [Google Scholar] [CrossRef]
  102. Ben Shalom, D.; Mostofsky, S.H.; Hazlett, R.L.; Goldberg, M.C.; Landa, R.J.; Faran, Y.; McLeod, D.R.; Hoehn-Saric, R. Normal Physiological Emotions but Differences in Expression of Conscious Feelings in Children with High-Functioning Autism. J. Autism Dev. Disord. 2006, 36, 395–400. [Google Scholar] [CrossRef] [PubMed]
  103. Groden, J.; Goodwin, M.S.; Baron, M.G.; Groden, G.; Velicer, W.F.; Lipsitt, L.P.; Hofmann, S.G.; Plummer, B. Assessing Cardiovascular Responses to Stressors in Individuals With Autism Spectrum Disorders. Focus Autism Other Dev. Disabl. 2005, 20, 244–252. [Google Scholar] [CrossRef]
  104. Welch, K.C. Physiological signals of autistic children can be useful. IEEE Instrum. Meas. Mag. 2012, 15, 28–32. [Google Scholar] [CrossRef]
  105. Green, D.; Baird, G.; Barnett, A.L.; Henderson, L.; Huber, J.; Henderson, S.E. The severity and nature of motor impairment in Asperger’s syndrome: A comparison with specific developmental disorder of motor function. J. Child Psychol. Psychiatry 2002, 43, 655–668. [Google Scholar] [CrossRef]
  106. Schultz, R.T. Developmental deficits in social perception in autism: The role of the amygdala and fusiform face area. Int. J. Dev. Neurosci. 2005, 23, 125–141. [Google Scholar] [CrossRef]
  107. Langdell, T. Face Perception: An Approach to the Study of Autism. Ph.D. Thesis, University of London, London, UK, 1981. [Google Scholar]
  108. Schopler, E.; Mesibov, G.B. Social Behavior in Autism; Springer Science & Business Media, Plenum Press: New York, NY, USA, 1986; ISBN 9780306421631. [Google Scholar]
109. Baio, J. Prevalence of autism spectrum disorder among children aged 8 years—Autism and developmental disabilities monitoring network, 11 sites, United States, 2010. MMWR Surveill. Summ. 2014, 63, 1–21. [Google Scholar]
  110. American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders (DSM-5®); American Psychiatric Pub.: Washington, DC, USA, 2013; ISBN 9780890425572. [Google Scholar]
  111. Prendinger, H.; Mori, J.; Ishizuka, M. Using human physiology to evaluate subtle expressivity of a virtual quizmaster in a mathematical game. Int. J. Hum. Comput. Stud. 2005, 62, 231–245. [Google Scholar] [CrossRef]
112. Ernsperger, L. Keys to Success for Teaching Students with Autism; Future Horizons: Arlington, TX, USA, 2002; ISBN 9781885477927. [Google Scholar]
  113. Seip, J.-A. Teaching the Autistic and Developmentally Delayed: A Guide for Staff Training and Development; Gateway Press: Delta, BC, USA, 1992. [Google Scholar]
  114. Wieder, S.; Greenspan, S. Can children with autism master the core deficits and become empathetic, creative, and reflective? J. Dev. Learn. Disord. 2005, 9, 39–61. [Google Scholar]
  115. Liu, C.; Conn, K.; Sarkar, N.; Stone, W. Online Affect Detection and Robot Behavior Adaptation for Intervention of Children with Autism. IEEE Trans. Rob. 2008, 24, 883–896. [Google Scholar]
Figure 1. Example of a body-worn pressure sensor (a) vs. a vehicle surface pressure sensor (b).
Figure 2. The affective circumplex, adapted from [23], with an overlay of four colored ovals indicating which quadrants likely relate to safer or riskier driving behaviors. Representative studies supporting these effects on driver behavior are covered in Section 2 and Table 1.
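In implementation terms, the circumplex in Figure 2 amounts to mapping an estimated valence and arousal value onto one of four quadrants. The sketch below is a minimal Python illustration under our own assumptions (a [-1, 1] scale, zero thresholds, and example emotion labels; none of these values come from [23]):

```python
def circumplex_quadrant(valence, arousal):
    """Map a (valence, arousal) estimate, each scaled to [-1, 1],
    onto a quadrant of the affective circumplex."""
    if valence >= 0:
        return ("high-arousal positive (e.g., excited)" if arousal >= 0
                else "low-arousal positive (e.g., calm)")
    return ("high-arousal negative (e.g., angry)" if arousal >= 0
            else "low-arousal negative (e.g., fatigued)")

# An angry driver would register negative valence and high arousal:
print(circumplex_quadrant(-0.6, 0.7))  # high-arousal negative (e.g., angry)
```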
Figure 3. Relationship between driver behavior (expressed through intentional and unintentional signals), driver affect detection, sensor types, and safety interventions. Solid borders indicate the topics emphasized in this review.
Table 1. Representative studies on the relationship between affect detection and driver behavior.

| Affective State | Reference(s) | Effect on Driver Behavior |
|---|---|---|
| Anger and anxiety (vs. contempt and fright) | Roidl et al. (2014) [32] | Higher driving speed, stronger acceleration, speed limit violations sustained for longer |
| Stress | Westerman and Haigney (2000) [43] | More (self-reported) lapses, errors, and violations |
| Happiness and calmness (vs. anger) | Steinhauser et al. (2018) [44] | Lower driving speed and speed variability, longer distance to the lead car |
| Fatigue | Philip et al. (2005) [46] | More inappropriate line crossings |
| Drowsiness | Lee et al. (2016) [48] | More near-crash events and lane excursions |
Table 2. Wearable/in-vehicle physiological sensors and the connection between the sensed signal and affective state.

| Physiological Sensor | Reference | Affective State(s) Sensed | Scope and Context (Driving Only) |
|---|---|---|---|
| Heart rate and galvanic skin response (GSR) wearable biomedical sensors | Healey and Picard (2005) [49] | Stress | On-road driving test: 24 subjects on a route of at least 50 min |
| GSR, SpO2, respiration, and electrocardiogram (ECG) wearable biomedical sensors | Ranjan Singh and Banerjee (2010) [66] | Fatigue, stress | On-road driving test: 14 subjects, including taxi drivers |
| Heart rate and GSR wearable biomedical sensors, plus wearable biofeedback via a visible indicator | MacLean et al. (2013) [68] | Stress, emotional regulation | Simulated driving test: 11 subjects with driving experience and no history of epilepsy or autism |
| Heart rate, GSR, and temperature from an armband; Polar heart-monitor chest strap | Nasoz et al. (2010) [69] | Fear, frustration, boredom | Simulated driving test: 41 subjects |
| Eye blink rate from smart glasses, correlated with braking response time and lane deviation | He et al. (2017) [72] | Drowsiness | Simulated driving test: 23 subjects |
| Heart rate variability from ECG electrodes made of conductive fabric on the steering wheel | Yu (2009) [75] | Fatigue, drowsiness | Simulated driving test: 2 subjects |
| Heart rate from body-worn ECG electrodes, plus eye movement | Oliveira et al. (2018) [81] | Drowsiness | On-road driving test: 20 subjects |
| GSR, SpO2, respiration, and ECG embedded in the vehicle seat | Baek et al. (2009) [79] | Task-induced stress | On-road driving test: 4 subjects with at least 5 years of driving experience |
| Facial electromyography (EMG) | Saikalis and Lee (2019) [67] | Anger | Simulated driving test: 11 subjects |
| Electroencephalography (EEG) and heart rate | Hassib et al. (2019) [82] | Negative emotions induced by music | Simulated driving test: 12 subjects |
| Capacitively coupled ECG embedded in the back support of the vehicle seat | Bhardwaj and Balasubramanian (2019) [80] | Fatigue | Simulated driving test: 20 male subjects |
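Several entries in Table 2 (e.g., [75,79,80]) derive a fatigue or stress indicator from heart rate variability (HRV) over the R-R intervals of an ECG trace. As a minimal sketch only, and not the pipeline of any cited study, the following Python snippet computes RMSSD, a common short-term HRV feature; the R-R interval values are synthetic:

```python
import numpy as np

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences (RMSSD), in ms;
    a widely used short-term heart rate variability feature."""
    rr = np.asarray(rr_intervals_ms, dtype=float)
    diffs = np.diff(rr)  # beat-to-beat changes in R-R interval
    return float(np.sqrt(np.mean(diffs ** 2)))

# Synthetic R-R intervals (ms) for a 10-beat ECG segment
rr = [812, 798, 845, 830, 790, 805, 850, 820, 795, 810]
print(f"RMSSD: {rmssd(rr):.1f} ms")  # reduced HRV often accompanies fatigue or stress
```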
Table 3. Emerging sensor technologies that can capture driving-relevant user interaction signals.

| Interaction Sensor Format | Reference | Interaction Category and Possible Affective State(s) | Context of Study (Driving/Other) |
|---|---|---|---|
| Wrist-worn accelerometry on both driving hands | Bi et al. (2017) [83] | Handling secondary tasks (texting, eating); distraction, drowsiness | Driving: on-road tests, 6 subjects, 75 different trips |
| Capacitive proximity sensors in vehicle seats | Durgam and Sundaram (2019) [85] | Driver posture, sudden braking, panic | Other: validating occupant position in video vs. sensor data |
| Capacitive proximity sensors in vehicle headrests | Ziraknejad et al. (2015) [86] | Head position, drowsiness | Other: validating head position detection in lab tests |
| Wearable capacitive pressure sensor | Pouryazdan et al. (2016) [87] | Fidgeting, inattention | Other: detecting restless leg motion |
| Stretchable optical strain sensors in athletic tape | Harnett et al. (2017) [88] | Muscle tension, stress | Other: detecting weight-bearing activity in lab tests; proof of concept |
| Stretchable optical strain sensors in gloves | Leber et al. (2018) [89] | Hand motion, distraction | Other: detecting hand configuration in lab tests; proof of concept |
| Resistive textile pressure sensors in trousers | Skach et al. (2018) [91] | Body posture, social behavior, engagement | Other: classification of 19 different postures and gestures, 36 subjects |
| Resistive foam pressure sensors in an office chair back and seat | Skach et al. (2017) [94] | Seated body position, social behavior, engagement | Other: correlation of body position and speaking role during conversation, 27 subjects |
| Electrical impedance tomography touch detection on 3D objects | Zhang et al. (2017) [95] | Hand motion/grip shape, coordination level related to alertness | Other: demonstration of a user interface for computers, games, and toys |
| Wrist-worn electrical impedance tomography sensor | Zhang et al. (2016) [96] | Hand positions, excitement/hostility | Other: classifying hand gestures |
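Many of the interaction formats in Table 3 (e.g., the pressure sensors of [87,91,94]) reduce to a time series whose short-term variability can flag restlessness or posture change. As a hedged sketch under assumed parameters (50 Hz sampling, 2 s windows, synthetic data; none of these come from the cited studies), a sliding-window variance can act as a crude fidget detector:

```python
import numpy as np

def fidget_scores(samples, fs=50, window_s=2.0):
    """Variance of a single pressure channel over consecutive windows;
    a sustained jump in variance can indicate restless movement."""
    win = int(fs * window_s)
    x = np.asarray(samples, dtype=float)
    return np.array([np.var(x[i:i + win])
                     for i in range(0, len(x) - win + 1, win)])

# Synthetic seat-pressure trace: 10 s of quiet sitting, then 10 s of fidgeting
rng = np.random.default_rng(0)
quiet = 100 + rng.normal(0, 0.5, 500)      # stable pressure (arbitrary units)
restless = 100 + rng.normal(0, 8.0, 500)   # larger fluctuations
print(fidget_scores(np.concatenate([quiet, restless])))  # variance rises later
```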
