Article

Associations between Cognitive Concepts of Self and Emotional Facial Expressions with an Emphasis on Emotion Awareness

Peter Walla and Aimee Mavratzakis

1 Faculty of Psychology, Sigmund Freud University, Vienna 1020, Austria
2 Faculty of Medicine, Sigmund Freud University, Vienna 1020, Austria
3 School of Psychology, Centre for Translational Neuroscience and Mental Health Research, University of Newcastle, Callaghan 2308, Australia
* Author to whom correspondence should be addressed.
Submission received: 5 March 2021 / Revised: 8 April 2021 / Accepted: 19 April 2021 / Published: 27 April 2021

Abstract

Recognising our own and others’ emotions is vital for healthy social development. The aim of the current study was to determine how emotions related to the self or to another influence behavioural expressions of emotion. Facial electromyography (EMG) was used to record spontaneous facial muscle activity in nineteen participants while they passively viewed negative, positive and neutral emotional pictures during three blocks of referential instructions. Each participant imagined themself, another person or no one experiencing the emotional scenario, with the priming words “You”, “Him” or “None” presented before each picture for the respective block of instructions. Emotion awareness (EA) was also recorded using the TAS-20 alexithymia questionnaire. Corrugator supercilii (cs) muscle activity increased significantly between 500 and 1000 ms post stimulus onset during negative and neutral picture presentations, regardless of ownership. Independent of emotion, cs activity was greatest during the “no one” task and lowest during the “self” task from less than 250 to 1000 ms. Interestingly, the degree of cs activation during referential tasks was further modulated by EA. Low EA corresponded to significantly stronger cs activity overall compared with high EA, and this effect was even more pronounced during the “no one” task. The findings suggest that cognitive processes related to the perception of emotion ownership can influence spontaneous facial muscle activity, but that a greater degree of integration between higher cognitive and lower affective levels of information may interrupt or suppress these behavioural expressions of emotion.

1. Introduction

The ability to recognise and attend to other people’s emotions is essential for healthy social integration and general well-being. Deficits in this ability are observed in virtually all forms of psychopathology, including autism and schizophrenia [1]. Recognising other people’s emotions also requires the more fundamental ability to discriminate the self from others, in order to distinguish one’s own emotional experiences from another’s. Interestingly, deficits in self and other discriminatory processing are also highly symptomatic of social-emotional disorders [1].
In line with these observations, research has shown that brain activity associated with self and other discrimination is modulated by emotional context, suggesting that the two forms of information processing in the brain are intertwined [2,3,4,5,6,7]. However, little research has been carried out to determine how self and other discriminatory neural processes relate to emotion recognition.
Research on the embodiment of emotion follows the idea that the recognition of emotion in another person occurs via the simulation of that person’s observed emotional cues, leading to the representation of that person’s emotional state in the observer through physical experience [8,9]. Facial mimicry is a robust aspect of emotion embodiment involving the simulation of an observed emotion through congruent facial expressions. Studies have repeatedly shown that when exposed to faces of conspecifics, individuals spontaneously and rapidly mimic their facial expression [10,11,12,13]. The effect has been demonstrated in the absence of conscious perception of the conspecific’s face [14], and there is evidence to suggest that these spontaneous facial reactions play a facilitative role in emotion recognition, whereby preventing mimicry in certain facial muscle groups leads to a substantial drop in the ability to recognise another’s emotions from their facial expression [15].
Embodiment theory holds that spontaneous facial reactions modulate an individual’s perception of another person’s emotional experience through motor and sensory neural areas dedicated to facial perception, which overlap to synchronously map others’ facial expressions onto one’s own [16,17]. This process consequently leads to a shared perception of others’ emotional experiences. In support of this theory, brain imaging studies have shown that recognising emotion in others and recognising one’s own experience of emotion activate common sets of neural structures [7,16,18,19]. These findings further show that neural structures involved in discriminating between self and others are also involved in the perception of our own and others’ emotional experiences [20].
However, studies using electroencephalography to determine how the emotional self is differentiated from others have shown that self- and other-referenced emotions can also be discriminated by the brain in the absence of an observed other, and even without explicit instructions to focus effortfully on another person’s emotional experience [2,4,6]. For example, in Herbert’s studies, merely presenting a personal pronoun or article combined with an emotive noun, such as “my fear”, “his fear” or “the fear”, evoked significantly different neural activity as a function of who owned the emotion. These more automatically driven discriminations were localised to neural activity within the amygdala, insula and anterior cingulate cortex [5], all of which have previously been implicated in effortful discriminations of self- and other-referenced emotions.
To this end, we were interested in whether spontaneous facial reactions would still differ in the expected direction between self- and other-referenced emotional experiences in the absence of an observed other person’s emotional expression (i.e., no face to mimic). Spontaneous facial reactions can indeed be elicited in the absence of an observed person altogether: studies have shown that, just as observed emotional faces activate congruent spontaneous facial reactions, so too do emotional pictures, sounds and words [10,11,12,13].
In the current study, we used an adapted version of the word paradigm used in Herbert et al.’s series of studies of self and other referential processing (e.g., [4,5]). Herbert presented emotional words paired with a pronoun (e.g., my fear, his happiness). Here we presented pictures of emotional scenes preceded by the word “You”, “Him” or “None” (None to mean “no one”) to denote who the emotional scenario was to be referenced to. The reason for using emotional pictures rather than emotional words was to better simulate evolutionarily relevant emotional context. Examples of the emotional scene stimuli included snakes, tornados and aimed guns (unpleasant); puppies, tropical beaches and appetising food (pleasant); and envelopes, power cords and furniture (neutral). The experiment was a block design with three conditions. In one condition the word “You” was presented before each picture, and participants imagined that the emotional scenario depicted in each picture was happening to them (self-reference condition). In a second condition, the word “Him” was shown before each picture, and participants were instructed to imagine that the emotional scenario was happening to someone else (a person described to the participant; other-reference condition). In a third condition, the word “None” was shown before each picture with no reference instructions (no-reference condition).
Previous research where participants have observed another experiencing an emotion [21] or observed another directing their emotional expression towards or away from the self [22] has shown that spontaneous facial reactions decrease as the self-relevance of the emotional event decreases. We therefore hypothesised that if spontaneous facial reactions are elicited in the absence of an observed other, then they should be weaker in the other-referenced condition than in the self-referenced condition. By the same logic, emotional scenarios in the no-reference condition would be least relevant to the self; hence, we expected the weakest spontaneous facial reactions for these pictures.
Closely related to the ability to comprehend owned and others’ emotions is the degree to which one is capable of understanding emotional experiences. Emotion awareness (EA) is a personality trait reflecting the conscious experience of emotion including the ability to cognitively and semantically process, identify and express emotions, the awareness of bodily sensations, and the ability to infer the emotions of others [23]. Although EA has been implicated in emotion recognition ability [24,25,26], it is not known whether EA modulates spontaneous facial reactions; therefore, we introduced EA as an exploratory independent variable in our analyses.

2. Materials and Methods

2.1. Participants

Participants were 22 undergraduate university students sourced from the University of Newcastle volunteer database, with those enrolled in first year psychology courses receiving course credit for participation. Ethics approval was obtained for the study from the University of Newcastle Human Research Ethics Committee. Data from three participants were excluded from the final analysis due to poor signal-to-noise ratio in physiological recordings. The remaining 19 participants (11 females) were right-handed, non-smoking native speakers of English aged between 17 and 29 years (M = 20.89, SD = 3.40). They had no known history of neuropathology or emotional disorder, and were not taking any central nervous system targeted medication such as antidepressants or stimulants at the time of recruitment.

2.2. Stimuli

The stimuli consisted of 90 unpleasant, 90 neutral and 90 pleasant pictures from the Geneva Affective Picture Database [27] and the International Affective Picture System [28]. Pictures were chosen for the experiment based on matching levels of pre-evaluated valence and arousal ratings collected in a pilot rating study involving an independent group of 42 participants (23 females) rating a larger pool of stimuli [13]. Table 1 lists the mean valence and arousal ratings for the experimental stimuli. Examples of unpleasant stimuli were disfigured bodies, snakes, spiders and violence. Pleasant stimuli were nature scenes, appetising food and erotic scenes depicting a male and female embrace. Neutral stimuli ranged from ordinary household objects to plain nature scenes and low arousing pictures of snakes.
An analysis of variance carried out for the factor “valence” showed that mean ratings for unpleasant, neutral and pleasant pictures were significantly different from one another (F (2, 267) = 2278.70, p < .001, η2 = .95). Unpleasant pictures were rated as most unpleasant, pleasant pictures were rated as most pleasant, and neutral picture ratings were situated midway. For the factor “arousal”, the analysis of variance showed no significant main effect of valence category (p = .116); hence, arousal levels did not differ across the three emotion stimulus categories according to self-report ratings. After randomly allocating stimuli to three reference groups, analyses of variance (ANOVAs) for the factors “valence” and “arousal” were carried out with the additional factor “reference condition”. No significant effect of reference condition was found (all p-values > .05); hence, valence and arousal ratings also did not differ across reference conditions.
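To make this validation step concrete, the category check can be reproduced with a simple one-way ANOVA. The sketch below simulates pilot ratings from the Table 1 means and SDs purely for illustration, since the study’s actual pilot data are not reproduced in this article.

```python
# Hedged sketch: one-way ANOVA across the three valence categories,
# mirroring the F(2, 267) test reported above. Ratings are simulated
# placeholders drawn from the Table 1 means/SDs, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
unpleasant = rng.normal(2.72, 0.49, 90)  # 90 unpleasant picture means
neutral = rng.normal(5.04, 0.36, 90)     # 90 neutral picture means
pleasant = rng.normal(7.29, 0.50, 90)    # 90 pleasant picture means

f_val, p_val = stats.f_oneway(unpleasant, neutral, pleasant)
print(f"valence: F(2, 267) = {f_val:.2f}, p = {p_val:.3g}")
```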

2.3. Procedure and Tasks

The three reference groups corresponded to three blocked and counterbalanced tasks. For the self-reference task, participants were instructed to imagine they were experiencing the scenario in each presented picture. For the other-reference task, they were to imagine that a person named Tom was experiencing the scenario. This fictional person was introduced to the participant at the beginning of the task, and was simply described as being a male university student named Tom. In the no-reference task, participants were instructed to simply view the picture. The emotional stimuli were equally split into three reference groups via random allocation. The stimuli within each reference group did not differ across participants; however, the order of stimulus presentations within groups always changed.
Given that personal pronouns have consistently been shown to evoke distinct referential effects [5,6,29,30,31], pronouns were used to assign ownership of emotions. Each trial consisted of a 5000 ms stimulus presentation preceded by the word “You”, “Him” or “None” for the corresponding task block, continually reinforcing the referential task instructions (Figure 1). After each stimulus presentation, a modified version of the Self-Assessment Manikin [32] was used to assess participants’ emotional valence and arousal reactions to the scenarios.
When each scale appeared on the monitor, participants used their right hand to press a number between 1 and 9 on a keyboard corresponding to their degree of felt emotion. For the valence scale, 1 = “very unhappy” and 9 = “very happy”. For the arousal scale, 1 = “very calm” and 9 = “very excited”. Participants were given explicit standardised instructions for the meaning and use of the rating scales, and were given six practice trials to become familiar with the method of rating and the trial sequence of events before commencing with the tasks. During the experiment, participants sat in a reclining chair under dim lighting in front of a display monitor positioned to allow 9.9° × 8.5° of visual angle for stimulus presentations, and were given a short break midway and at the end of each task block.
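Putting the elements of Figure 1 together, the per-trial logic can be sketched as follows. This is an illustrative outline only: `present`, `get_rating` and `blank` are hypothetical stand-ins for routines of the (unspecified) presentation software.

```python
# Illustrative outline of one trial (see Figure 1): prime word, 5000 ms
# picture, then SAM valence and arousal ratings, with a 200 ms black
# screen between events. The helper functions are hypothetical.
from dataclasses import dataclass

@dataclass
class Trial:
    prime: str    # "You", "Him" or "None", fixed within a task block
    picture: str  # file path of the emotional scene stimulus

def run_trial(trial, present, get_rating, blank):
    present(trial.prime)               # referential prime word
    blank(0.2)                         # 200 ms inter-stimulus black screen
    present(trial.picture, dur=5.0)    # 5000 ms picture presentation
    blank(0.2)
    valence = get_rating("valence")    # key 1 (very unhappy) ... 9 (very happy)
    blank(0.2)
    arousal = get_rating("arousal")    # key 1 (very calm) ... 9 (very excited)
    return valence, arousal
```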
Before starting the experiment, participants completed a standard demographics questionnaire, the Toronto Alexithymia Scale (TAS-20) [33,34] and the Big Five Inventory (BFI). The TAS-20 consists of 20 short-phrase self-report items with easily accessible vocabulary, and produces a total score between 20 (high EA) and 100 (low EA). TAS-20 scores ranged from 36 to 74 (M = 46.21, SD = 9.83). One outlier greater than 3 SDs from the mean was identified, but not excluded from the data set to preserve statistical power in physiological data analyses. Participants were assigned to the high or low EA group based on the median score of 43. Hence, a score less than 43 equated to high EA (n = 11), while a score of 43 or greater equated to low EA (n = 8).
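As a minimal illustration of this grouping step (the individual TAS-20 totals are not published, so the scores below are hypothetical placeholders):

```python
# Hedged sketch of the median-split used to form the EA groups.
# The score list is hypothetical; only the cutoff of 43 and the group
# sizes (11 high EA, 8 low EA) are taken from the text above.
import numpy as np

tas20 = np.array([36, 37, 38, 39, 39, 40, 40, 41, 41, 42, 42,
                  43, 44, 46, 50, 55, 60, 65, 74])  # hypothetical scores

cutoff = 43                    # reported median split point
high_ea = tas20 < cutoff       # high emotion awareness group
low_ea = tas20 >= cutoff       # low emotion awareness group
print(f"high EA: n = {high_ea.sum()}, low EA: n = {low_ea.sum()}")
```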

2.4. Facial EMG Measures

The corrugator supercilii (cs) muscles, which furrow the eyebrows, were recorded to index muscle potential changes corresponding to unpleasant stimuli, and the zygomaticus major (zm) muscles, which lift the cheeks and lips, were recorded to index muscle potential changes corresponding to pleasant stimuli. EMG of the corrugator and zygomaticus muscles was recorded using a Nexus 10 wireless recording device connected via Bluetooth to a PC laptop, and output measurements were recorded using Biotrace software (Mind-media.net). Further procedural details can be found in [13]. For the EMG analysis, a single 1000 ms epoch time-locked to the onset of each emotional picture stimulus was extracted from the continuous recordings and divided into four 250 ms time intervals by averaging across data points. Trial-by-trial variance was also examined using the procedure outlined in [13]. For each time interval, the grand mean of each of the nine conditions was subjected to further statistical analyses to assess for significant differences.
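The epoching step can be sketched as follows. The 1000 Hz sampling rate and all variable names are assumptions for illustration; they are not stated details of the recording setup.

```python
# Hedged sketch of the epoching step: cut a 1000 ms post-onset window
# from a continuous EMG trace and average it into four 250 ms bins.
# Sampling rate, trace and onset index are illustrative assumptions.
import numpy as np

fs = 1000                             # assumed sampling rate (Hz)
emg = np.random.randn(60 * fs)        # placeholder continuous EMG trace
onset_sample = 12_345                 # placeholder stimulus-onset index

epoch = emg[onset_sample:onset_sample + fs]     # 1000 ms epoch
bins = epoch.reshape(4, fs // 4).mean(axis=1)   # four 250 ms means
print(bins)   # activity in 0-250, 250-500, 500-750, 750-1000 ms windows
```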

2.5. Data Analysis

The experiment included two within-subjects factors, “valence” (3 levels: unpleasant, neutral, pleasant) and “reference” (3 levels: self, other, no one), and one between-subjects factor, “emotion awareness” (low, high), which were assessed using repeated-measures ANOVAs for each dependent measure of interest. For the corrugator and zygomatic data analyses, the additional within-subjects factor “time interval” (4 levels) was employed to assess the effect of time over the first second of stimulus viewing. The intervals corresponded to the first 250, 250–500, 500–750 and 750–1000 ms post stimulus onset. For the behavioural data, a single mean valence and arousal rating was calculated from all trials under each experimental condition. Means for each rating scale were submitted to separate 3 (Reference: self, other, no one) × 3 (Emotion: unpleasant, neutral, pleasant) × 2 (EA: high, low) repeated-measures ANOVAs. For all analyses, repeated (for time interval comparisons) and simple contrasts were used to determine the direction of significant main effects (p < .05), Greenhouse–Geisser corrections were used for sphericity violations, and Pearson’s correlation coefficient (r) was used to measure effect sizes. All paired samples and independent samples t-tests were conducted with an alpha criterion corrected for family-wise error by dividing the alpha level by the number of tests carried out.
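A minimal sketch of that family-wise correction (Bonferroni: the alpha criterion, not the p-values, is divided by the number of tests) is given below; the group arrays are illustrative placeholders, not the study’s data.

```python
# Hedged sketch of a Bonferroni-corrected independent samples t-test,
# as used for the three EA-group comparisons (one per reference
# condition). Group values are simulated placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
low_ea = rng.normal(1.2, 0.4, 8)    # hypothetical corrugator means, low EA
high_ea = rng.normal(0.8, 0.4, 11)  # hypothetical corrugator means, high EA

n_tests = 3                         # one test per reference condition
alpha = 0.05 / n_tests              # corrected criterion = .017, as reported

t, p = stats.ttest_ind(low_ea, high_ea)
print(f"t = {t:.2f}, p = {p:.3f}, significant: {p < alpha}")
```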

3. Results

3.1. Facial EMG Measures

For corrugator activity, reference condition and EA produced significant main effects and interacted significantly during all four time intervals. The corresponding statistical values are listed in Table 2.
Instead of interpreting contrasts for individual time intervals, the ANOVA was carried out again post hoc with the additional factor “time interval” (four levels) to determine whether corrugator activity changed significantly over time as a function of the other factors. Time interval did not interact significantly with reference condition, EA or their combined influence (all p-values > .05); therefore, mean corrugator activity was collapsed across time intervals to generate a single grand average for the 1000 ms viewing period post stimulus onset for each factor (Figure 2). Simple contrasts showed that, when pictures were assigned no reference, participants with low EA produced significantly greater corrugator activity than those with high EA; this group difference was significantly larger than in the self-referenced (F (1, 17) = 4.44, p = .05, η2 = .21) and other-referenced (F (1, 17) = 6.31, p = .022, η2 = .27) conditions. Three independent samples t-test comparisons of the EA groups, one per reference condition, further confirmed that the low EA group’s corrugator activity was significantly greater only when pictures were assigned no reference (t (17) = −2.88, p = .01; family-wise corrected alpha = .017).
As expected, the main effect of emotion category became strongly significant at the third and fourth time intervals (see Table 2 for statistical values). Contrasts for each time interval showed no difference in corrugator activity between unpleasant and neutral pictures (p-values > .05), but both unpleasant and neutral pictures produced significantly greater activity than pleasant pictures (for intervals 3 and 4, respectively: unpleasant, F (1, 17) = 7.70 and 8.82, p = .013 and .009, η2 = .31 and .34; neutral, F (1, 17) = 13.65 and 12.92, p = .002 and .002, η2 = .45 and .43). The post hoc ANOVA across time intervals further revealed a significant interaction between emotion category and time interval (F (1.42, 24.05) = 10.86, p = .001, η2 = .39, Greenhouse–Geisser corrected after a significant test of sphericity, χ2 = 137.86, p < .001). Figure 3 shows mean corrugator activity across time. Repeated contrasts across the second, third and fourth intervals showed that from 500 to 750 ms and from 750 to 1000 ms, the difference in activity between pleasant and unpleasant pictures increased significantly (p = .002 and .011, respectively), as did the difference between pleasant and neutral pictures (p = .001 and .03, respectively). However, the difference between unpleasant and neutral pictures did not change significantly over time (p > .05). No other significant effects were found for corrugator activity (all p-values > .05), and no significant effects were found for zygomatic activity (all p-values > .05).

3.2. Self-Reported Ratings

3.2.1. Valence

As expected, the ANOVA yielded a significant main effect of emotion category (F (2, 34) = 81.18, p < .001, η2 = .83) (Figure 4). Unpleasant pictures (M = 2.93, SE = .19) were rated as significantly more unpleasant than neutral pictures (M = 4.44, SE = .19; F (1, 17) = 31.82, p < .001, η2 = .65) and pleasant pictures (M = 6.78, SE = .19; F (1, 17) = 110.35, p < .001, η2 = .87). Meanwhile, pleasant pictures were rated as significantly more pleasant than neutral pictures (F (1, 17) = 75.82, p < .001, η2 = .82). No other significant effects were found (all p-values > .05).

3.2.2. Arousal

The ANOVA yielded a significant main effect of reference condition (F (2, 34) = 5.93, p = .006, η2 = .26) and an interaction between reference condition and emotion category (F (4, 68) = 5.12, p = .001, η2 = .23) (Figure 5). Neutral pictures were rated as less arousing relative to unpleasant pictures when assigned no reference compared with self-reference (F (1, 17) = 4.73, p = .044, η2 = .22) or other-reference (F (1, 17) = 22.43, p < .001, η2 = .57). Similarly, neutral pictures were rated as less arousing relative to pleasant pictures when assigned no reference compared with self-reference (F (1, 17) = 9.81, p = .006, η2 = .37) or other-reference (F (1, 17) = 22.14, p < .001, η2 = .57). No other significant effects were found (all p-values > .05).

4. Discussion

The aim of the current experiment was to determine whether cognitive concepts of self, other and no one lead to different patterns of spontaneous facial reactions. The study was inspired by research showing that spontaneous facial reactions decrease as self-relevance of an emotional event decreases, specifically when observing another person experiencing an emotion [21,22]. The study aimed to further this line of research by investigating whether the mere activation of cognitive concepts of self and other would evoke responses similar to when we physically observe another person’s emotional responses. Based on research showing that self-relevant emotional processing leads to greater mimicry responses [21,22], it was hypothesised that viewing self-referenced emotional scenarios would elicit stronger spontaneous facial reactions than when viewing emotional scenarios referenced to another or to no one.
These hypotheses were not supported by the data. Firstly, neither self-referenced nor other-referenced emotion processing led to enhanced spontaneous mimicry. In terms of corrugator activity, spontaneous mimicry was greatest in the no-reference condition, in which emotional stimuli were viewed with no explicit reference to the self or another. Contrary to our predictions, pleasant stimuli did not activate spontaneous zygomatic activity. Others [35] found that high- but not low-arousal pleasant stimuli modulated zm activity, whereas both high- and low-arousal unpleasant stimuli modulated cs activity. Given that the stimuli in the current experiment were low in arousal, our results match those of Fujimura et al. [35] in that low-arousal pleasant stimuli do not evoke spontaneous zygomatic activity. Collectively, the data suggest that the emotion recognition mechanisms thought to underlie spontaneous mimicry are not directly associated with higher-order cognitive concepts of the self or emotion ownership.
Most research to date has attempted to understand the role of facial mimicry in emotion perception by examining individuals’ facial reactions to emotionally expressive faces or gestures. However, these paradigms do not distinguish between mimicry as a function of owned emotional experiences and mimicry as a function of understanding another person’s emotional state. Most research investigating how spontaneous facial reactions relate to perceiving one’s own and others’ emotions has presented individuals with images or videos of another person, with instructions to focus effortfully either on the emotional experience of the observed person or on their own emotional experience (e.g., [21,36]). For example, one study investigated whether mimicry was induced when adopting the perspective of another person by showing participants video clips of patients undergoing painful sonar treatment via headphones, instructing them either to imagine how the patient felt or to imagine how they themselves would feel if they were the patient [21]. In that study, participants were specifically instructed to think about how the other person felt during the other-referenced condition and, in a second round of stimulus viewing, to rate the other person’s emotion on a self-report scale. In the current study, participants also used an emotion rating scale, but in all three conditions they rated their own emotional reaction to each scenario immediately after viewing it. Hence, our conditions differed according to who the participant imagined to be experiencing the scenario (the self, another person or no one), but not according to whose emotional reaction they were effortfully focusing on.
In another study, spontaneous facial reactions were recorded while participants watched video clips of people displaying angry bodily gestures either directed towards the camera (i.e., towards the self) or at a 45-degree angle relative to the camera (i.e., away from the self), the latter inducing the idea that the angry person was facing someone else [22]. The faces of the people in the clips were blurred to separate direct facial mimicry from mimicry related to the bodily gestures. The authors found that mimicry reactions were greater during self-directed than other-directed clips. However, it remains unclear whether facial mimicry is functionally involved in understanding another person’s emotional state, because in both conditions the participants’ facial reactions could have reflected processing of the gestures as a primary source of emotion (i.e., could this person be a threat to me?), with less threat elicited in the other-directed condition, rather than as a secondary source (i.e., what emotion is this person conveying?). These studies reported stronger mimicry during self-referenced than other-referenced emotional appraisals, leading to the theory that mimicry responses increase as emotional events become more self-relevant [21].
In terms of the function of spontaneous mimicry, the results of the current experiment support the view that the immediacy or salience of the emotion-inducing environment mediates whether spontaneous mimicry is activated. By contrast, re-activating cognitive concepts of emotion, such as in relation to the self or another person, does not seem to induce such an effect in the physical absence of a stimulus. This suggests a dissociation between physical motor behaviour and cognitive concepts of emotion, which does not support theories of embodied cognition. In fact, the current results showed the greatest mimicry responses when participants were instructed not to think about themselves or anyone else while viewing the emotional stimuli. This not only supports the interpretation that the immediacy of the emotional stimulus is an important factor in inducing spontaneous mimicry, but also suggests that introducing additional processing demands and thereby increasing cognitive load, such as instructing participants to imagine themselves or someone else in the emotional scenario, interrupts spontaneous mimicry.
Corrugator activity was enhanced not only for unpleasant stimuli but also for neutral stimuli. This effect has been reported in other studies as well [13]. Other authors [37] excluded neutral faces from their statistical analyses because participants reported negative feelings during the exposures. Recently, we found that neutral stimuli, particularly neutral facial expressions, evoked enhanced corrugator activity [13]. That neutral stimuli evoke negative rather than positive patterns of facial activity is thought to reflect neutral faces being perceived in a negative context rather than representing an emotionally void canvas. The frequent similarity between neutral and unpleasant responses could further mean that “neutral” equates to “boring”. Considering that young people often make up the sample populations of such experiments, this emerging pattern in emotion research may be indicative of the changing modern young brain: a low level of stimulation might indeed be perceived as unpleasant, particularly by young people who overuse modern information technology (see [38]).
At the low or pathological end of emotion awareness lies alexithymia (“no words for feelings”), a condition often associated with psychiatric and neurological disorders. A meta-analysis of 15 neuroimaging studies of alexithymic patients [39] showed distinctly less activity in the supplementary motor and premotor areas of the brain compared with healthy controls. These areas are thought to be involved in spontaneous facial reactions [40], but the precise mechanism by which motor cortical activity translates into facial mimicry is not known. Hence, we were interested in whether emotion awareness might predict mimicry responses. In the current study, corrugator recordings revealed that emotion awareness does interact with spontaneous mimicry: participants who scored low in emotion awareness produced greater spontaneous corrugator responses than those who scored high. These findings align with past research on the physiological correlates of emotion awareness, which has shown that low emotion awareness corresponds to greater externalised emotional responding, whereas high emotion awareness corresponds to greater internalised emotional activity, including more cognitive emotional processing and greater pre-frontal activity [41,42,43]. Along this line, if those with high emotion awareness are better at empathising, and empathy recruits higher-order cognitive processing, then given our finding that cognitive activity can suppress or interrupt mimicry, it is logical to expect that people with greater emotion awareness display relatively lower spontaneous activation levels.

5. Conclusions

Our findings support the idea that cognitive processes related to the perception of emotion ownership can influence spontaneous facial muscle activity, and that greater input of cognitive information may suppress behavioural expressions of emotion. Overall, the findings support current theories that brain processes involved in self/other discrimination are intertwined with emotion processing and behavioural expressions of emotion. Crucially, multiple neuropathological conditions are characterised by disordered concepts of self and deficits in social-emotional function, providing strong justification for exploring the nature of these behavioural and brain processes in clinical populations. A well-defined understanding of the interactions between self-referential processing and emotion should help to improve the diagnosis and treatment of disorders of the self, as occur in schizophrenia and other mental disorders, as well as the treatment of depression, where emotion plays a dominant role. In the long run, it will also be possible to investigate the different stages of self-referential processing in relation to different clinical symptoms, and their neurophysiological correlates might become markers of self-disorders (e.g., [44]).

Author Contributions

P.W. was involved in designing and implementing the experiment in addition to manuscript writing and overall mentoring. A.M. collected all data and was involved in design and implementation of the experiment in addition to writing the manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki, and approved by the Institutional Review Board (or Ethics Committee) of the University of Newcastle, Australia (H-2012-0229; approved on 25 March 2013).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Dimaggio, G.; Vanheule, S.; Lysaker, P.H.; Carcione, A.; Nicolò, G. Impaired self-reflection in psychiatric disorders among adults: A proposal for the existence of a network of semi-independent functions. Conscious. Cogn. 2009, 18, 653–664. [Google Scholar] [CrossRef] [PubMed]
  2. Fields, E.C.; Kuperberg, G.R. It’s all about you: An ERP study of emotion and self-relevance in discourse. Neuroimage 2012, 62, 562–574. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  3. Fossati, P.; Hevenor, S.J.; Graham, S.J.; Grady, C.; Keightley, M.L.; Craik, F.; Mayberg, H. In search of the emotional self: An fMRI study using positive and negative emotional words. Am. J. Psychiatry 2003, 160, 1938–1945. [Google Scholar] [CrossRef]
  4. Herbert, C.; Herbert, B.M.; Ethofer, T.; Pauli, P. His or mine? The time course of self–other discrimination in emotion processing. Soc. Neurosci. 2011, 6, 277–288. [Google Scholar] [CrossRef]
  5. Herbert, C.; Herbert, B.M.; Pauli, P. Emotional self-reference: Brain structures involved in the processing of words describing one’s own emotions. Neuropsychologia 2011, 49, 2947–2956. [Google Scholar] [CrossRef] [PubMed]
  6. Herbert, C.; Pauli, P.; Herbert, B. Self-reference modulates the processing of emotional stimuli in the absence of explicit self-referential appraisal instructions. Soc. Cogn. Affect. Neurosci. 2010, 6, 653–661. [Google Scholar] [CrossRef] [Green Version]
  7. Ochsner, K.N.; Knierim, K.; Ludlow, D.H.; Hanelin, J.; Ramachandran, T.; Glover, G.; Mackey, S.C. Reflecting upon feelings: An fMRI study of neural systems supporting the attribution of emotion to self and other. J. Cogn. Neurosci. 2004, 16, 1746–1772. [Google Scholar] [CrossRef]
  8. Adolphs, R. Neural systems for recognising emotion. Curr. Opin. Neurobiol. 2002, 12, 169–177. [Google Scholar] [CrossRef]
  9. Niedenthal, P.M. Embodying emotion. Science 2007, 316, 1002–1005. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  10. Dimberg, U. Facial reactions to facial expressions. Psychophysiology 1982, 19, 643–647. [Google Scholar] [CrossRef]
  11. Dimberg, U. Facial reactions: Rapidly evoked emotional responses. J. Psychophysiol. 1997, 11, 115–123. [Google Scholar]
  12. Lundqvist, L.-O.; Dimberg, U. Facial expressions are contagious. J. Psychophysiol. 1995, 9, 203–211. [Google Scholar]
  13. Mavratzakis, A.; Herbert, C.; Walla, P. Emotional facial expressions evoke faster orienting responses, but weaker emotional responses at neural and behavioural levels compared to scenes: A simultaneous EEG and facial EMG study. Neuroimage 2016, 124 Pt A, 931–946. [Google Scholar] [CrossRef]
  14. Dimberg, U.; Thunberg, M.; Elmehed, K. Unconscious facial reactions to emotional facial expressions. Psychol. Sci. 2000, 11, 86–89. [Google Scholar] [CrossRef]
  15. Oberman, L.M.; Winkielman, P.; Ramachandran, V.S. Face to face: Blocking facial mimicry can selectively impair recognition of emotional expressions. Soc. Neurosci. 2007, 2, 167–178. [Google Scholar] [CrossRef]
  16. Hennenlotter, A.; Schroeder, U.; Erhard, P.; Castrop, F.; Haslinger, B.; Stoecker, D.; Ceballos-Baumann, A.O. A common neural basis for receptive and expressive communication of pleasant facial affect. Neuroimage 2005, 26, 581–591. [Google Scholar] [CrossRef] [PubMed]
  17. Niedenthal, P.M.; Barsalou, L.W.; Winkielman, P.; Krauth-Gruber, S.; Ric, F. Embodiment in attitudes, social perception, and emotion. Personal. Soc. Psychol. Rev. 2005, 9, 184–211. [Google Scholar] [CrossRef] [PubMed]
  18. Spunt, R.P.; Lieberman, M.D. An integrative model of the neural systems supporting the comprehension of observed emotional behavior. Neuroimage 2012, 59, 3050–3059. [Google Scholar] [CrossRef]
  19. Wicker, B.; Keysers, C.; Plailly, J.; Royet, J.P.; Gallese, V.; Rizzolatti, G. Both of us disgusted in my insula: The common neural basis of seeing and feeling disgust. Neuron 2003, 40, 655–664. [Google Scholar] [CrossRef] [Green Version]
  20. Walla, P.; Panksepp, J. Neuroimaging helps to clarify brain affective processing without necessarily clarifying emotions. In Novel Frontiers of Advanced Neuroimaging; Fountas, K., Ed.; InTech: London, UK, 2013; pp. 93–118. [Google Scholar]
  21. Lamm, C.; Porges, E.; Cacioppo, J.; Decety, J. Perspective taking is associated with specific facial responses during empathy for pain. Brain Res. 2008, 1227, 153–161. [Google Scholar] [CrossRef]
  22. Grèzes, J.; Philip, L.; Chadwick, M.; Dezecache, G.; Soussignan, R.; Conty, L. Self-relevance appraisal influences facial reactions to emotional body expressions. PLoS ONE 2013, 8, e55885. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  23. Lane, R.D.; Schwartz, G.E. Levels of emotional awareness: A cognitive-developmental theory and its application to psychopathology. Am. J. Psychiatry 1987, 144, 133–143. [Google Scholar] [PubMed]
  24. Jessimer, M.; Markham, R. Alexithymia: A right hemisphere dysfunction specific to recognition of certain facial expressions? Brain Cogn. 1997, 34, 246–258. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  25. Lane, R.D.; Sechrest, L.; Reidel, R.; Weldon, V.; Kaszniak, A.; Schwartz, G.E. Impaired verbal and nonverbal emotion recognition in alexithymia. Psychosom. Med. 1996, 58, 203–210. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  26. Reker, M.; Ohrmann, P.; Rauch, A.V.; Kugel, H.; Bauer, J.; Dannlowski, U.; Suslow, T. Individual differences in alexithymia and brain response to masked emotion faces. Cortex 2010, 46, 658–667. [Google Scholar] [CrossRef]
  27. Dan-Glauser, E.S.; Scherer, K.R. The Geneva affective picture database (GAPED): A new 730-picture database focusing on valence and normative significance. Behav. Res. Methods 2011, 43, 468–477. [Google Scholar] [CrossRef] [PubMed]
  28. Lang, P.; Bradley, M.; Cuthbert, B. International Affective Picture System (IAPS): Instruction Manual and Affective Ratings; The Center for Research in Psychophysiology, University of Florida: Gainesville, FL, USA, 2005. [Google Scholar]
  29. Walla, P.; Duregger, C.; Greiner, K.; Thurner, S.; Ehrenberger, K. Multiple aspects related to self-awareness and the awareness of others: An electroencephalography study. J. Neural Transm. 2008, 115, 983–992. [Google Scholar] [CrossRef]
  30. Walla, P.; Greiner, K.; Duregger, C.; Deecke, L.; Thurner, S. Self-awareness and the subconscious effect of personal pronouns on word encoding: A magnetoencephalography (MEG) study. Neuropsychologia 2007, 45, 796–809. [Google Scholar] [CrossRef]
  31. Zhou, A.; Shi, Z.; Zhang, P.; Liu, P.; Han, W.; Wu, H.; Xia, R. An ERP study on the effect of self-relevant possessive pronoun. Neurosci. Lett. 2010, 480, 162–166. [Google Scholar] [CrossRef]
  32. Bradley, M.; Lang, P. Measuring emotion: The self-assessment manikin and the semantic differential. J. Behav. Ther. Exp. Psychiatry 1994, 25, 49–59. [Google Scholar] [CrossRef]
  33. Bagby, R.M.; Parker, J.D.A.; Taylor, G.J. The twenty-item Toronto Alexithymia Scale—I. Item selection and cross-validation of the factor structure. J. Psychosom. Res. 1994, 38, 23–32. [Google Scholar] [CrossRef]
  34. Bagby, R.M.; Taylor, G.J.; Parker, J.D.A. The twenty-item Toronto Alexithymia Scale—II. Convergent, discriminant, and concurrent validity. J. Psychosom. Res. 1994, 38, 33–40. [Google Scholar] [CrossRef]
  35. Fujimura, T.; Sato, W.; Suzuki, N. Facial expression arousal level modulates facial mimicry. Int. J. Psychophysiol. 2010, 76, 88–92. [Google Scholar] [CrossRef] [PubMed]
  36. Sato, W.; Fujimura, T.; Suzuki, N. Enhanced facial EMG activity in response to dynamic facial expressions. Int. J. Psychophysiol. 2008, 70, 70–74. [Google Scholar] [CrossRef]
  37. Sonnby-Borgström, M. Alexithymia as related to facial imitation, mentalization, empathy, and internal working models-of-self and -others. Neuropsychoanalysis 2014, 11, 111–128. [Google Scholar] [CrossRef]
  38. Montag, C.; Walla, P. Carpe diem instead of losing your social mind: Beyond digital addiction and why we all suffer from digital overuse. Cogent Psychol. 2016, 3, 1157281. [Google Scholar] [CrossRef]
  39. Van Der Velde, J.; Servaas, M.N.; Goerlich, K.S.; Bruggeman, R.; Horton, P.; Costafreda, S.G.; Aleman, A. Neural correlates of alexithymia: A meta-analysis of emotion processing studies. Neurosci. Biobehav. Rev. 2013, 37, 1774–1785. [Google Scholar] [CrossRef]
  40. Korb, S.; Grandjean, D.; Scherer, K. Investigating the production of emotional facial expressions: A combined electroencephalographic (EEG) and electromyographic (EMG) approach. In Proceedings of the 2008 8th IEEE International Conference on Automatic Face and Gesture Recognition, Amsterdam, The Netherlands, 17–19 September 2008. [Google Scholar]
  41. Meriau, K.; Wartenburger, I.; Kazzer, P.; Prehn, K.; Lammers, C.H.; van der Meer, E.; Heekeren, H.R. A neural network reflecting individual differences in cognitive processing of emotions during perceptual decision making. Neuroimage 2006, 33, 1016–1027. [Google Scholar] [CrossRef] [PubMed]
  42. Troisi, A.; Belsanti, S.; Bucci, A.R.; Mosco, C.; Sinti, F.; Verucci, M. Affect regulation in alexithymia: An ethological study of displacement behavior during psychiatric interviews. J. Nerv. Ment. Dis. 2000, 188, 13–18. [Google Scholar] [CrossRef]
  43. Wagner, H.; Lee, V. Alexithymia and individual differences in emotional expression. J. Res. Personal. 2008, 42, 83–95. [Google Scholar] [CrossRef]
  44. Walla, P.; Herbert, C. Hierarchy and dynamics of self-referential processing: The non-personal Me1 and the personal Me2 elicited via single words. Cogent Psychol. 2015, 2. [Google Scholar] [CrossRef]
Figure 1. Trial sequence of events. In addition, a 200 ms inter-stimulus interval consisting of a black screen was displayed between each trial event.
Figure 2. Mean corrugator activity for the first second of picture viewing. Means represent the average of all 250 ms time intervals for the respective conditions. Means were collapsed over emotion categories to show the cumulative effect of the referential conditions (smaller bar graph) and the effect of emotion awareness on referential conditions. Error bars represent 1 standard error. Asterisks mark variables with significant results.
Figure 3. Mean corrugator activity at four time intervals within the first second of picture viewing. The mean for each time interval represents the average of all data points within the 250 ms interval. Means were collapsed over referential conditions to show the cumulative effect of each emotional picture category. Asterisks mark variables with significant results.
Figure 4. Mean pleasantness ratings for the final collection of stimuli. Pleasantness was rated on a scale of 1 (very unpleasant) to 9 (very pleasant). Error bars represent one standard error of the mean. * = difference significant at the .05 alpha level.
Figure 5. Mean arousal ratings for the final collection of stimuli. Arousal was rated on a scale of 1 (very calm) to 9 (very arousing). Error bars represent one standard error of the mean. * = difference significant at the .05 alpha level.
Table 1. Pre-evaluated valence and arousal ratings for the stimulus collection. Values for valence are based on the degree of reported pleasantness on a scale of 1–9, where 9 represents very pleasant and 1 represents very unpleasant. Values for arousal are based on the degree of reported arousal on a scale of 1–9, where 9 represents very arousing and 1 represents very calm.

            Valence (SD)    Arousal (SD)
Unpleasant  2.72 (.49)      4.86 (.42)
Neutral     5.04 (.36)      4.66 (.92)
Pleasant    7.29 (.50)      4.82 (.59)
Table 2. Main effects for corrugator activity.

                                 F (df)               p      η2
Reference
  Interval 1                     4.39 (2, 34)         .020   .21
  Interval 2                     4.46 (2, 34)         .019   .21
  Interval 3                     3.92 (2, 34)         .030   .19
  Interval 4                     3.58 (2, 34)         .039   .17
Emotion Awareness
  Interval 1                     5.16 (1, 17)         .036   .23
  Interval 2                     5.30 (1, 17)         .034   .24
  Interval 3                     5.33 (1, 17)         .034   .24
  Interval 4                     5.30 (1, 17)         .034   .24
Reference × Emotion Awareness
  Interval 1                     4.02 (2, 34)         .027   .19
  Interval 2                     3.89 (2, 34)         .030   .19
  Interval 3                     4.18 (2, 34)         .024   .20
  Interval 4                     4.32 (2, 34)         .021   .20
Emotion
  Interval 3                     7.06 (1.47, 25.04)   .007   .29
  Interval 4                     8.42 (1.39, 23.58)   .004   .33
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
