Article

Understanding the Emotional Impact of GIFs on Instagram through Consumer Neuroscience

1 Faculty of Business and Communication, International University of La Rioja, 26006 Logroño, Spain
2 Department of Organization and Marketing, Faculty of Economics & Business, Somosaguas Campus, Universidad Complutense de Madrid, 28223 Madrid, Spain
* Author to whom correspondence should be addressed.
Academic Editors: Scott D. Lane and Ana Reyes-Menendez
Received: 31 May 2021 / Revised: 15 July 2021 / Accepted: 27 July 2021 / Published: 30 July 2021

Abstract

This study analyzes the ability of GIFs to generate emotionality in social media marketing strategies. The aim is to show how neuroscience research techniques can be integrated into the analysis of emotions, improving results and helping to guide actions on social networks. The research is structured in two phases: an experimental study using automated biometric analysis (facial coding, galvanic skin response (GSR) and eye tracking), and an analysis of the feelings declared in the comments of Instagram users, from which explicit valence, type of emotion, comment length and proportion of emojis are extracted. The results indicate that the explicit measure of emotional valence shows a higher and more positive emotional level than the implicit one, and that this difference is influenced in different ways by engagement and by the proportion of emojis in the comment. This work takes a further step in the measurement of user emotionality in social media campaigns, complementing content analysis with new insights from neuromarketing.
Keywords: social networks; digital consumer behavior; emotion; Instagram; GIF; consumer neuroscience; neuromarketing; skin conductance; facial coding; eye tracking; sentiment analysis

1. Introduction

Currently, social media marketing strategies seek to position brands within the hearts of their customers, where the main experience of value is emotion [1,2] (Smith and Bolton, 2002; Mauri et al., 2011). This is why the involvement of the senses is fundamental to influencing the emotional state of social media users [3] (Prescott, 2017). One of the most widely used resources to achieve this emotional impact on users is the GIF. Its effectiveness has been analyzed in several studies that have shown its ability to generate emotionality [4,5,6] (Bourlai and Herring, 2014; Bakhshi et al., 2016; Gygli and Soleymani, 2016).
In fact, nowadays, communication and marketing professionals are looking for tools that allow them to measure the effectiveness of their campaigns, in terms of emotionality. The most frequently used research techniques are based on content analyses of comments made by a brand’s followers on a social network [7,8,9,10] (Driscoll, 2015; Turnbull and Jenkins, 2016; Scheinbaum, 2017; Kim and Kim, 2018). Less common is the use of neuroscience techniques to measure emotionality based on users’ unconscious responses to a given stimulus.
Consequently, the present study proposes a combination of emotion measures (emotional valence, basic emotions and engagement) to assess the effectiveness of GIFs as generators of emotional experiences on social networks. In addition, neuroscience techniques were used to observe physiological and cognitive responses (implicit measures), as well as to perform sentiment analyses (explicit measures) of Instagram comments.

2. Literature Background

2.1. Conceptualization of Emotion Assessment

There are many definitions of emotion in the existing literature, as well as of emotional states, the ways to measure them, and their neurophysiological representations. In this investigation, emotion is understood as the cognitive process of evaluating and interpreting feelings, with the aim of regulating social and/or relational responses in social networks. To address emotion, there are two traditional measurement perspectives in the field of psychology. The first is the dimensional measurement of emotion, which states that an emotion is composed of valence and arousal. Emotional valence is the positive or negative evaluation of the emotional state, while arousal (a hypothetical construct describing the processes that control alertness, wakefulness and activation (Anderson, 1990) [11]), or physiological arousal, refers to the activation of the sympathetic nervous system (e.g., increased skin sweating or heart rate) [12,13,14] (Harmon-Jones et al., 2017; Izard, 2010; Lang, 1995). The second is the measurement of emotion as a discrete entity. In this case, the emotional evaluation process results in concrete emotions, such as happiness or sadness [12] (Harmon-Jones et al., 2017). Specifically, six basic and universal emotions have been identified: happiness, surprise, fear, anger, disgust and sadness [15] (Ekman, 1993).
This emotionality thus underlies both psychological and physiological responses [12] (Harmon-Jones et al., 2017): emotions are normally triggered by a stimulus that is perceived or remembered, provoking physiological reactions such as the contraction of certain facial muscles [16] (Damasio and Carvalho, 2013). Moreover, authors such as LeDoux and Brown (2017) [17] have investigated which brain circuits activate a specific emotion and allow us to become aware of it and express it verbally. Consequently, it is necessary to combine the explicit (textual) study of emotion with tools that capture its most implicit part (neuroscientific and biometric tools).

2.2. Sentiment Analysis and Emotional Engagement in Social Media

Several studies have been devoted to the analysis of sentiment in social networks. Among the most relevant is that of Driscoll (2015) [7], who studied sentiment in 20,189 tweets and 921 replies, concluding that 38% of the replies express a positive emotion while 20% express a negative one. That study also highlights the importance of emotion and its link to the perceived intimacy between senders and receivers. For Turnbull and Jenkins (2016) [8], social media reactions offer marketers the opportunity to better understand how consumers engage emotionally with social media content, enabling greater precision in measuring their emotional response and allowing brands to evaluate their campaigns more effectively.
Social media, and Instagram in particular, are experiential products that continuously reinforce both positive and negative habits [9] (Scheinbaum, 2017). Several studies point to the bias of this online positivity, as most content distributed on social networks is rated more positively than negatively [18,19] (Reinecke and Trepte, 2014; Waterloo, Baumgartner, Peter and Valkenburg, 2018). In the context of emotions and positivity on the internet, there is research that advocates the expression of emotions through networks, from which a direct link between emotional language and online behavior can be inferred [20,21] (Dresner and Herring, 2010; Huffaker, 2010).
Following this last idea, the link between behavior and expressed emotion has been seen in studies investigating emotional state and engagement in social networks. Most notably, Dubovi and Tabac (2021) [22] tried to determine whether the behavioral engagement of views, likes, dislikes and comments coincides with the emotional and cognitive engagement in science dissemination channels on YouTube. They show that, regardless of the valence of emotional engagement, emotion is linked to higher behavioral engagement in posting comments and to higher cognitive engagement in argumentative deliberation. Morgado et al. (2020) [23] studied the emotional engagement of users on the police’s Facebook profile, concluding that the overall engagement is positive and that it mainly came from women. In contrast, Vizcaíno and Aguaded (2020) [24] studied the emotional polarization of children on Instagram accounts. Their results reveal a prominent positivity and subjectivity in the lexical field, with the repeated use of adjectives such as “happy”, “new” or “super”. On the other hand, Kim and Kim (2018) [10] applied computer vision techniques on Instagram to define associations between personality and gender through photography. Their results show that users’ extroversion, agreeableness and openness were partly associated with the emotions expressed by the faces in their photos, along with certain pixel-level features. It was also observed that the Big Five personality traits can be predicted by the above variables, except in the case of gender. Claffey and Brady (2019) [25] empirically tested hypotheses on the effects of key components of consumer engagement (cognitive appraisal, affective states, participation) on consumers’ affective engagement.
Zhan, Tu and Yu (2018) [26] performed a sentiment analysis of library readers on Instagram, identifying three opinion polarities (negative, neutral and positive) and six emotions in the comments (scary, loser, upset, enjoyable, happy and fun). These polarities provide new insights into understanding readers, which helps libraries provide better services. Diayanah-Abdullah and Asnira-Zolkepi (2017) [27] analyzed users’ emotions towards brands on social media; their results show that the feeling of provocation must be managed efficiently to start interaction and a long-term relationship. Finally, Domingo, Jewitt and Kress (2015) [28] stated that, on Instagram, writing is an intrinsic part of the visual element, hence the importance of analyzing the emotional valence of the comments posted by users of the network.
Therefore, the literature found reflects the relevance that the analysis of content has had and continues to have today, in terms of the emotions it reflects and the emotional engagement that can be obtained through social networks.

2.3. Neuroscience at the Service of the Study of Emotions: Implicit and Explicit Measures

The measure of emotional valence is being used as an indicator of the success of social media communication campaigns. This measure is often obtained from automated emotional analyses of user comments, which are considered an explicit measure as they are self-expressed in the form of texts and emojis [7,8,22,29] (Kralj et al., 2015; Driscoll, 2015; Turnbull and Jenkins, 2016; Dubovi and Tabac, 2021). However, neuromarketing techniques can also be used to analyze the unstated (implicit) responses of the target audience. In these cases, the most common way to obtain emotional valence is through technological tools that take biometric measurements, such as skin conductance or facial micro-expressions. Neuroscientific techniques, such as electroencephalography (EEG), can also record emotional valence.
Although there is little empirical evidence, some studies can be found that have used neuroscience tools for the analysis of emotionality in social media. These include the study by Harris, Ciorciari and Gountas (2019) [30], which analyzes social media strategies based on action/challenge/emotion, showing the value of combining neuroscientific techniques (EEG) with traditional market research methods (psychometric survey).
Relevant studies have used a combination of measures of emotional valence to investigate different stimuli related to marketing communications. One study, focusing on aesthetic and utilitarian emotions in response to advertisements, combines neuroscience research techniques (facial electromyography and skin conductance) with subjective self-evaluations of emotion (Lajante et al., 2020) [31]. In the context of social media, a study comparing two measures of emotional valence, one obtained from psychophysiological responses and one resulting from the analysis of user comments, found significant differences between unconscious and verbalized responses (Hernández-Fernández, Mora and Hernández, 2019) [32]. In the field of computer science, a study recorded the physiological reactions and verbalized responses of e-game users to evaluate their experiences; the researchers recognize the potential of physiological analysis to enrich research in entertainment technology (Mandryk, Inkpen and Calvert, 2006) [33].
This shows the potential of combining neuroscientific and biometric devices with self-reported measures in the study of social networks.

2.4. The Use of GIFs in Social Media

GIFs have become culturally relevant in the digital context, especially in social media. They are a good tool for sensory appeal through the use of movement, color and repetition (Ash, 2015) [34]. They are considered a suitable resource to generate emotionality, as they can represent a wide variety of feelings (Bourlai and Herring, 2014) [4]. In addition, due to their simplicity and high number of meanings, they manage to arouse empathy with the content shown (Miltner and Highfield, 2017) [35]. These qualities lead brands to use GIFs to design experiences with affective qualities (Gürsimsek, 2016) [36].
Previous studies have investigated the use of GIFs in social media, showing interesting results. According to one of them, GIFs are the most attractive resource on Tumblr in terms of likes and reblogs (Bakhshi et al., 2016) [5]. Another study stated that the object that appears and the associated emotions are more important than the movement; it also concluded that the interest generated is associated with the number of likes a GIF receives, but does not correlate with reblogging it (Gygli and Soleymani, 2016) [6]. In fact, there is an interesting line of research focused on designing an affective computing tool for the automated analysis of the emotions represented in GIFs based on the facial expressions they contain (Brendan, Bhattacharya and Chang, 2014; Chen, Rudovic and Picard, 2017) [37,38]. Finally, Rua-Hidalgo et al. (2021) [39] conducted a two-phase study on GIFs used by commercial brands. In the first phase, they combined the biometric tools of automated observation of facial expressions, skin conductance and eye position to observe the emotional state that GIFs of well-known brands cause in participants. They also compared them with the effects caused by static images of the same brands, concluding that GIFs achieve user engagement and cause a “state of well-being and pleasure” (Russell’s Circumplex Model, 1980) [40]. In the second phase, they used the implicit association test to observe unconscious associations related to well-known brands; the results show that participants perceived well-known brands as quality brands. The correlation found between the results of the two studies reveals that GIFs, while arousing positive emotions and leading to engagement, do not achieve an enthusiastic state when brands are internalized as quality brands.
In this way, GIFs are an attractive option for generating emotions, which can translate into higher conversions on social networks. For this reason, addressing how they can be communicatively effective by combining explicit and implicit measures can provide information that has not been explored so far.

3. Research Questions and Hypotheses

The following are the research questions and hypotheses derived from previous literature.
RQ1. 
Are there differences between implicit and explicit measures of emotional valence in Instagram users?
Hypothesis 1.
The measure of explicit valence of Instagram users’ response to GIFs will be more positive than the measure of implicit valence.
RQ2. 
What might account for these possible differences?
Hypothesis 2.
The greater the user engagement with GIFs, the smaller the difference between implicit and explicit measures of emotional valence.
Hypothesis 3.
The longer the comment, the smaller the difference between implicit and explicit measures of emotional valence.
Hypothesis 4.
The greater the proportion of emojis in the comments on GIFs, the greater the difference between implicit and explicit measures of emotional valence.
RQ3. 
Are explicit comments and biometric tools equally effective in identifying the emotions felt by subjects?
Hypothesis 5.
Biometric tools are more accurate predictors for assessing the basic type of emotion that is aroused in response to GIFs.
In order to test these hypotheses, this research was divided into two phases focused on inferring the emotional valence derived from a selection of GIFs posted on Instagram (Figure 1). In the first phase, techniques from consumer neuroscience or neuromarketing were applied to obtain a measure of implicit emotional valence and engagement. The second phase complemented the first by analyzing the explicit emotional valence based on the semantic analysis of the comments on each GIF. Finally, a comparison was made between the emotional data obtained implicitly and explicitly.

4. Materials, Methods and Results

4.1. Phase 1. Experimental Study of Neuromarketing Applied to GIFs

In this first phase, neuromarketing devices (face coder, GSR and eye tracker) were applied to analyze and quantify the emotional valence, the engagement generated and the type of basic emotion caused by 18 Instagram GIFs, selected because they were used by renowned brands and had a high number of likes. The selected GIFs are moving images with an approximate duration of 4 s. All the images are high-quality, some taken outdoors and others indoors. Concerning audiovisual treatment, some contain a filmed scene and others constantly repeat a moving image. The visual contents are varied: people, objects and animals, both in the foreground and in the background.
The valence variable allows us to identify the sign of the emotion (positive or negative). The engagement variable indicates the emotional state the person is in when viewing the stimulus, and is extracted through the combination of emotional valence and activation. The type of basic emotion recorded indicates which specific emotions have been experienced through the recording of facial expressions (happiness, surprise, anger, disgust, fear and sadness).
The three variables provide a measure for each subject and each GIF; these individual data are then aggregated across the group, yielding a quantitative value of each variable for each GIF.
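This aggregation step can be sketched as follows. The record structure and names are illustrative assumptions, since the study does not publish its processing scripts:

```python
from collections import defaultdict
from statistics import mean

# (subject_id, gif_id, value) records, e.g. implicit valence per viewing.
# Structure and names are illustrative, not the study's actual format.
records = [
    ("s01", "gif01", 0.12), ("s02", "gif01", 0.20),
    ("s01", "gif02", -0.05), ("s02", "gif02", 0.09),
]

def aggregate_by_gif(records):
    """Average each variable's per-subject values into one value per GIF."""
    by_gif = defaultdict(list)
    for _subject, gif, value in records:
        by_gif[gif].append(value)
    return {gif: mean(values) for gif, values in by_gif.items()}

print(aggregate_by_gif(records))
```

The same pattern applies to any of the three variables: one averaged value per GIF is what enters the later statistical tests.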

4.1.1. Participants

Participants were selected randomly, with the inclusion criterion being regular use of at least one social network.
The sample size was 30 participants. This sample size was considered representative with a probability of error of less than 1% [41,42,43] (Cohen, 1992; Sands, 2009; Hensel et al., 2017). The size is adequate to provide sufficient knowledge about the stimulus for the purpose of the research. Furthermore, the distribution of the sample was based on the Interactive Advertising Bureau and the Elogia study (2018) [44], which described the profiles of social network users. Therefore, the composition was 33% for each age range (16–30; 31–45; 46–55), with 53% women and 47% men.

4.1.2. Stimuli

The stimulus used in this experimental phase was a video of 2 min and 26 s, made from a random combination of 18 GIFs, incorporating distractors between each of them. In order to avoid presentation bias, three different videos were edited with the elements randomly ordered.
The selection of the GIFs was based on a review of all GIFs posted on Instagram (1 March–10 April 2019) by the top 100 international brands (2017 Best Global Brands ranking—Interbrand, 2017 [45]). The 18 GIFs with the highest number of likes were selected.
All subjects viewed one of the three versions of the video. During the presentation of the stimulus, facial micro-expressions, skin conductance level and pupil direction were recorded.

4.1.3. Devices

The GIFs were presented via a laptop (Windows 10 operating system) and a 24-inch monitor. In addition, an external webcam was used to record facial expressions and pupil direction.

4.1.4. Measuring Tools

Face coder. This records the data from the decoding of the person’s face through software that analyzes the image provided by the webcam. The software developed by the company INTERACTÚA+ was used.
The Face Coder tool provided six concrete emotions (joy, surprise, sadness, fear, anger and disgust), with their respective value, recorded from facial micro-expressions that were generated in the presence of each GIF. Subsequently, the values of these emotions registered in all participants were aggregated, obtaining a single value for each emotion per GIF.
Previous research in communication and marketing stimuli using this tool for the analysis of emotion type has been considered as the empirical background (McDuff, El Kaliouby and Picard, 2012; Bellman, Wooley and Varan, 2016; Goyal and Singh, 2018; Mundel et al., 2018) [46,47,48,49].
Galvanic Skin Response (GSR). This records skin conductance levels through electrodes placed on the fingers of the subject’s hand. The eSense® device was used.
Previous studies involving GSR to analyze unconscious responses to marketing stimuli were taken into account in the design (Weibel et al., 2019; Walla, Koller, Brenner and Bosshard, 2017; Guerreiro, Rita and Trigueiros, 2015; Reimann, Castano, Zaichkowsky and Bechara; 2012) [50,51,52,53].
Eye-tracker. This has been included in the experimental study to ensure that the participant is viewing the stimulus. It consists of a device that tracks the eye and records the subject’s gaze while viewing a stimulus.

4.1.5. Data Analysis

All recorded data were first processed in Excel in order to clean them and make them manageable in the SPSS statistical software.
Specifically, the data recorded by the facial emotion identification tool (face coder) determine the valence of the emotions (positive or negative) and the type of emotion according to the classification of the six basic universal emotions (Ekman, 1993) [15]. The recording of the level of skin conductance (GSR) during the visualization of the stimulus was used to obtain the level of arousal. From these two records, the engagement variable was calculated, which indicates the emotional state the participants were in while watching the video, as it combines the type of emotion with the intensity of that emotion. In addition, as a control measure, the position of the pupil (eye tracking) was monitored, which made it possible to verify which stimulus was really generating the emotion.
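As an illustration of how valence and arousal might be combined into an engagement score, the sketch below uses a simple signed product. This is a hypothetical formula: the exact combination used by the recording software is not specified in the text.

```python
# Hypothetical combination of valence and arousal into an engagement score.
# The paper does not disclose the software's formula; a signed product is
# shown purely to illustrate the idea (direction from valence, strength
# from arousal).
def engagement(valence: float, arousal: float) -> float:
    """valence in [-1, 1] (facial coding); arousal in [0, 1] (normalized GSR)."""
    return valence * arousal

print(engagement(0.5, 0.8))   # positive emotion, high activation
print(engagement(-0.5, 0.8))  # negative emotion, high activation
```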
Emotional valence and engagement were tested for normality and homogeneity of variances. The basic emotion type was quantified to get an approximation of how much each emotion was experienced upon exposure to each GIF.

4.1.6. Ethical Issues

The study was approved on 20 April 2018 by the Ethics Committee of the Universidad Internacional de la Rioja, Spain, under the number LABNMKT-001-2018.
The ethical protocols of the World Medical Association Declaration of Helsinki (1964) were followed. In addition, informed consent was obtained from all participants, assuring data confidentiality.

4.1.7. Procedure

After signing the informed consent form, each participant was placed in front of the computer. Both the face coder and the GSR were checked to ensure that they were calibrated and working correctly, and the baseline measurement was taken. Then, one of the three videos with the 18 stimuli was randomly played. Once the video had finished, they were thanked for their participation and accompanied to the exit. Each test lasted about 6 min in total.
In order to obtain the baseline of the GSR tool, sensors were placed on the non-dominant hand of the participants in such a way that it began to collect values prior to the presentation of the stimulus, thus establishing the skin conductance baseline of each participant. At the same time, the computer equipment and the tool software were synchronized. On the other hand, the eye tracker and Face Coder tools shared the same software and implementation. The first step was to regulate the webcam to achieve optimal lighting conditions and get the face of the participant centered on the screen. Then, the calibration software allowed us to obtain the baseline. This involved following a point that moved across the screen and picking up the point where the pupil is located.

4.1.8. Results

The Shapiro–Wilk normality test (see Table 1) shows that the variables implicit emotional valence (VIE) and engagement (Eg) are close to normal (VIE: p = 0.779 > 0.050; Eg: p = 0.772 > 0.050). Furthermore, their variances are homogeneous, according to the results obtained in Levene’s test of equality of variances (VIE: p = 0.873 > 0.050; Eg: p = 0.818 > 0.050). The measures of the variables analyzed are shown in Table 1.
The 18 GIFs selected reflect the six basic emotions. The emotion that appears most frequently is sadness, followed by anger, as shown in Table 2.

4.2. Phase 2. Sentiment Analysis of GIF Content on Instagram

In this phase, a quantitative methodology was implemented and a content analysis was carried out, using data mining and sentiment analysis. To do this, the comments on the Instagram GIFs selected in Phase 1 were obtained.
The explicit emotional valence of the comments was quantified (negative or positive), both the one derived from the text and the one derived from the emojis in the comments (negative and positive), as well as the length of the comment itself (including both text and emojis) and the proportion of emojis in the comments.

4.2.1. Stimuli

A total of 1420 comments were analyzed, coming from the 18 GIFs used in Phase 1. Up to 100 comments were extracted from each GIF (15 April–1 May 2021): if a GIF had fewer than 100 comments, all existing comments were extracted; if it had more than 100, only the first 100 were used.
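The extraction rule can be expressed as a short helper (the function name is illustrative):

```python
def select_comments(comments, cap=100):
    """Extraction rule described above: at most the first `cap` comments,
    or all of them if there are fewer."""
    return comments[:cap]

print(len(select_comments(list(range(250)))))  # capped at 100
```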

4.2.2. Measuring Tools

Sentiment analysis of comments. To obtain information on the length and composition of the comments, a program written in Python 3 was used. This program takes as input an Excel document, generated through the Export Comments application, with the comments on each GIF (the openpyxl library was used). The comments were cleaned up using a function that removes stray punctuation symbols and tags that correspond to other users. The Emoji library was used to identify the emojis present in the comments. Then, a function was created to obtain the number of words, the average length of the comments, and the percentage of words and emojis in each comment. With all the information obtained, a new Excel document was generated (using openpyxl) that includes the following statistics: average number of words in the comments, average number of emojis and percentage of emojis versus words for each GIF.
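A minimal, dependency-free sketch of this cleaning and counting step is shown below. The study used the third-party Emoji library; here a simplified Unicode range check stands in for it, and the tag and punctuation patterns are assumptions:

```python
import re

# Sketch of the comment-cleaning and counting step. A crude Unicode range
# check stands in for the Emoji library; the regex patterns are assumptions.
EMOJI_RANGES = (
    (0x1F300, 0x1FAFF),  # emoticons, pictographs, symbols
    (0x2600, 0x27BF),    # misc symbols and dingbats
)

def is_emoji(ch):
    return any(lo <= ord(ch) <= hi for lo, hi in EMOJI_RANGES)

def clean_comment(text):
    text = re.sub(r"@\w+", "", text)           # drop tags of other users
    text = re.sub(r"[#!?.,;:~*]+", " ", text)  # drop stray punctuation
    return re.sub(r"\s+", " ", text).strip()

def comment_stats(text):
    cleaned = clean_comment(text)
    emojis = [ch for ch in cleaned if is_emoji(ch)]
    words = [w for w in cleaned.split() if not all(is_emoji(c) for c in w)]
    total = len(words) + len(emojis)
    return {
        "words": len(words),
        "emojis": len(emojis),
        "emoji_ratio": len(emojis) / total if total else 0.0,
    }

print(comment_stats("Love it!! 😍😍 @friend look at this"))
```

In the study, these per-comment statistics were then averaged per GIF and written back to Excel via openpyxl.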
The Twinword API (https://www.twinword.com/; accessed on 31 May 2021) was used. This natural language processing (NLP) interface, based on the Python programming language, detects the intentionality of sentences and paragraphs. The measure it provides is the overall score of the analyzed text. Thus, values lower than −0.05 are considered negative, values higher than 0.05 are considered positive, and all values in between are considered neutral. In addition, thanks to this application, a list of the most emotionally charged words and adjectives was obtained from the comments.
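The Twinword cut-offs described above translate directly into a small classifier:

```python
# Cut-offs of the Twinword sentiment score, as described above:
# below -0.05 is negative, above 0.05 is positive, otherwise neutral.
def classify_sentiment(score: float) -> str:
    if score < -0.05:
        return "negative"
    if score > 0.05:
        return "positive"
    return "neutral"

for score in (-0.30, 0.00, 0.41):
    print(score, classify_sentiment(score))
```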
Detection of the basic emotions in the comments. Through Twinword’s Emotion Analysis API, the basic emotions in each of the 18 GIFs were identified.
Analysis of the emotionality of emojis. The emotional valence of emojis was obtained using the Emoji Sentiment Ranking (http://kt.ijs.si/data/Emoji_sentiment_ranking/; accessed on 31 May 2021), developed and validated by Kralj et al. (2015) [29].
Content analysis. We developed our own counting application to obtain the maximum length of the comments, as well as their composition (proportion of text and emojis).

4.2.3. Procedure and Data Analysis

(1)
Compilation of comments on each GIF
Using the URLs of the Instagram GIFs, up to 100 comments were collected for each of the 18 GIFs under study. Using Export Comments (https://exportcomments.com/; accessed on 31 May 2021), an Excel file was obtained for each GIF. If it had more than 100 comments, the tool extracted 100; if this number was lower, all comments were extracted.
Subsequently, all comments were translated into English, as the sentiment analysis tool only evaluates content in that language. This was done via Google Translate (https://translate.google.com/?hl=fr&tab=TT; accessed on 31 May 2021).
(2)
Analysis of the explicit emotionality of each GIF and calculation of the variable VEE (Table 3)
The analysis of the emotionality of the textual content was carried out with Twinword. An emotion value in the range −1 to 1 was obtained for each comment; these values were then averaged to obtain a total emotion value for each GIF.
The emotional valence of the emojis was then obtained by means of the Emoji Sentiment Ranking.
In order to obtain one single standardized value for the explicit valence measure, all text and emoji values were taken together and divided by the number of comments. This resulted in the variable Explicit Emotional Valence (VEE).
(3)
Analysis of the Composition of Comments and Calculation of Variables LgC and Pemj (Table 3)
In order to count the length (LgC) and the proportion of emojis in the total content of the comments (Pemj), we used our own counting application, designed in Python language. Thus, we obtained the average value of the length of the comments in each GIF and the percentage of emojis used in their comments.
(4)
Analysis of the Differences between Explicit and Implicit Emotionality and Calculation of the Variable VD (Table 3)
The variable Valence Difference (VD) was obtained as the difference between the values recorded explicitly (comments, VEE) and implicitly (biometric tools, VIE):
VD = VEE − VIE
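Steps (2) and (4) can be sketched together: VEE sums the text and emoji sentiment values over a GIF's comments and divides by the comment count, and VD is the gap between the explicit and implicit measures. The data below are illustrative:

```python
# Sketch of steps (2) and (4) of the procedure described above.
def explicit_valence(text_scores, emoji_scores, n_comments):
    """VEE: sum of text and emoji sentiment values over a GIF's comments,
    divided by the number of comments."""
    if n_comments == 0:
        return 0.0
    return (sum(text_scores) + sum(emoji_scores)) / n_comments

def valence_difference(vee, vie):
    """VD = VEE - VIE, as in the formula above."""
    return vee - vie

# Illustrative values: three comments with their text and emoji scores.
vee = explicit_valence([0.4, 0.2, 0.6], [0.35, 0.0, 0.5], 3)
print(round(vee, 3), round(valence_difference(vee, 0.148), 3))
```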

4.2.4. Results

Implicit Measure of Valence Versus Explicit Measure of Valence

The first step was to calculate normality for the VIE and VEE through the Shapiro–Wilk test (Table 4). The distributions were found to approach normality for both VIE and VEE (p > 0.050). In addition, the assumptions of homogeneity of variance were met using Levene’s test for equality of variances (p = 0.873 > 0.05).
Since the normality and homoscedasticity criteria were met, a t-test for independent samples was performed to compare the averages of implicit and explicit valence. The results, as shown in Table 5, reveal statistically significant differences between VIE (MVIE = 0.148, SD = 0.036) and VEE (MVEE = 0.407, SD = 0.030; t(34) = −7.178, p < 0.001). Therefore, emotional valence was significantly higher when users expressed themselves through comments on Instagram than when the emotional valence aroused by GIFs was recorded through biometric tools. Consequently, H1 is confirmed. Moreover, in both cases the average emotional valence was positive.
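The comparison reported above corresponds to a standard pooled-variance t-test for independent samples, which can be reproduced with the standard library alone. The data below are illustrative, not the study's:

```python
from statistics import mean, variance
from math import sqrt

# Pooled-variance (Student) t statistic for two independent samples, as
# used to compare implicit (VIE) and explicit (VEE) valence.
def t_independent(a, b):
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    t = (mean(a) - mean(b)) / sqrt(sp2 * (1 / na + 1 / nb))
    return t, na + nb - 2  # statistic and degrees of freedom

# Illustrative per-GIF valence values, not the study's data.
vie = [0.10, 0.15, 0.12, 0.18, 0.14]
vee = [0.38, 0.42, 0.40, 0.44, 0.39]
t, df = t_independent(vie, vee)
print(f"t({df}) = {t:.3f}")
```

A negative t here, as in the paper, means the explicit measure exceeds the implicit one on average.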

Differences between Implicit and Explicit Measures of Emotional Valence for Each GIF

A new continuous variable is defined: the difference between the implicit and the declared measures of emotional valence for each GIF (VD). Additionally, in order to identify which variables could explain this difference between the two measures of valence, correlations were studied through Pearson’s bivariate correlation coefficient. Specifically, the variables were engagement (Eg), comment length (LgC) and proportion of emojis in the comments of the GIFs (Pemj).
(1) Relation between Valence Difference (VD) and Engagement (Eg)
The correlation between VD and Eg was statistically significant and inverse, as shown in Table 6 (r = −0.546, p = 0.019). This indicates that the more engagement a GIF generates, the smaller the difference between the emotional valence measured explicitly (through comments) and implicitly (biometric tools), thus confirming H2.
(2) Relation between Valence Difference (VD) and Comment Length (LgC)
A significant, positive relation between VD and LgC was hypothesized. However, the results indicate no statistically significant relation between the two (r = −0.380, p = 0.120), rejecting H3. The difference found between the psychophysiological and the stated measures of valence is therefore not explained by the length of the comment.
(3) Relation between the Valence Difference (VD) and the Proportion of Emojis (Pemj)
With respect to the relationship between VD and Pemj, a significant, positive correlation was found as hypothesized (r = 0.631, p = 0.005), as shown in Table 7. This indicates that the greater the proportion of emojis used in the comments, the greater the difference between the psychophysiological and the stated measure of valence. Therefore, H4 was accepted.
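The correlation step can be sketched with `scipy.stats.pearsonr`. The per-GIF values below are synthetic and constructed to exhibit an inverse relation; they do not reproduce the study's data.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
eg = rng.uniform(0.0, 0.1, 18)                  # engagement per GIF (synthetic)
vd = 0.4 - 2.0 * eg + rng.normal(0, 0.03, 18)   # built to correlate inversely with eg

# Pearson's bivariate correlation coefficient and its two-tailed p-value
r, p = pearsonr(eg, vd)
print(f"r = {r:.3f}, p = {p:.3g}")
```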
Comparison between the Emotions Generated by the GIFs According to Whether They Were Measured Implicitly or Explicitly
In order to determine which type of measurement is more effective as a predictor (understanding effectiveness here as precision), the specific emotions underlying both the implicit and the explicit emotional valence were obtained (see Table 8).
While the biometric tools detected all six basic emotions (happiness, surprise, anger, disgust, fear and sadness) in the 18 selected GIFs, the analysis of the comments did not detect all of them. Only happiness appeared in all 18 GIFs, while the more negative emotions anger and disgust were not detected in any of them. In fact, most of the words used in the comments alluded to positivity (see Figure 2). Given that the biometric tools were more accurate in detecting the six emotions, we accept H5.
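The comparison in Table 8 amounts to counting, for each measurement method, the GIFs in which each basic emotion was detected. A minimal sketch follows; the detection threshold and the demo scores are illustrative, not the study's data.

```python
EMOTIONS = ["happiness", "surprise", "anger", "disgust", "fear", "sadness"]

def detection_counts(per_gif_scores, threshold=0.0):
    """per_gif_scores: one {emotion: score} dict per GIF.
    Returns, for each basic emotion, the number of GIFs where it was detected."""
    return {
        e: sum(1 for gif in per_gif_scores if gif.get(e, 0) > threshold)
        for e in EMOTIONS
    }

# Two illustrative GIFs: absent keys mean the emotion was not detected at all.
demo = [
    {"happiness": 0.03, "surprise": 0.005, "anger": 0.09},
    {"happiness": 0.05, "sadness": 0.14},
]
print(detection_counts(demo))
```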

5. Discussion

The present study reflects the emotional impact that can be achieved through the use of GIFs in social networks [4,35,36], specifically on Instagram. Moreover, the use of different implicit and explicit techniques in the two phases of the study highlights the need to use biometric and self-reported tools in a complementary manner [31,32] in order to eliminate the biases that inevitably arise if only the comments declared by the subjects are analyzed.
By comparing implicit and explicit emotional valence, we find that social network users tend to express comments with a significantly higher (more positive) emotional charge than what they actually feel (before processing the stimulus rationally). Biometric tools can therefore help to provide a more accurate analysis.
This finding raises the question of what might account for the differences between the two valence measurements. The possible influences of engagement, comment length and the proportion of emojis on the difference in valence were therefore investigated.
As for the relation between the differences in valence and engagement, a significant inverse relation is evident: the greater the engagement with the GIF, the smaller the difference between the valences. This relation can probably be explained by understanding engagement as commitment to the brand. Such commitment could make users less vulnerable to external influences, so that the emotions expressed are more in line with what the subjects really feel towards the brand. In contrast, at lower levels of engagement, the declared emotions are under greater external influence: conditioning due to the brand's perceived quality [39], a sense of belonging, prestige, etc. In short, well-known brands need to surprise in order to activate users; when this does not happen, less engagement is generated [39]. Even so, the comments on social networks may remain positive, moving even further away from the emotion that is really felt.
On the other hand, the length of the comment is not found to influence the differences between implicit and explicit measures of valence. This suggests that the emotionality of the words contained in the comment matters more than the length of the comment itself.
The proportion of emojis, in turn, has not proven to be a reliable indicator of the social network user's emotional level: a higher proportion of emojis in comments produces a greater distortion between implicitly and explicitly measured valence. This could be because text offers greater possibilities for expressing felt emotions (greater semantic richness), whereas emojis offer a narrower range of expression, more limited in its emotional gradation.
Finally, in the comparison of the type of emotion recorded by biometric tools and in the declared comments, a higher number of emotions were detected when recording data with neuromarketing devices.
We suggest that, when there is public exposure, users express more positive emotions (happiness and surprise) through comments on social networks, avoiding more negative ones such as anger or disgust [18,19]. The external influences to which we are subject as social beings (the need to belong, social judgement, status, etc.) may be the cause. These influences disappear when the implicit emotion is recorded through neuromarketing tools, making it possible to identify a greater and more precise variety of emotions.
All of the above indicates that using the comments received on social networks as the only indicator of the level of success of a communication action can lead to a false sense of success for well-known brands.
It also shows that the implicit measurement of emotion is more effective, both in quantifying the emotional level of subjects and in identifying the basic type of emotion that subjects experience, thus achieving greater richness and precision in the analysis.

6. Implications of the Work

Biometric tools have proven very useful for identifying and pinning down the emotions perceived by users. This does not mean abandoning other, traditional methods of analysis, but rather using them in a complementary fashion. In short, biometric tools reveal what users themselves are unable to identify.

7. Limitations and Future Lines of Research

A limitation of the research lies in the use of Google Translate to render the comments into English for sentiment analysis. This basic translation tool can cause context to be lost in translation; however, fewer than 11% of the comments needed to be translated, and most of these were short and easily translated.
In addition, as the GIFs always came from well-known brands, the level of emotionality in the comments may differ from that obtained with lesser-known brands.
It would be interesting to carry out future research to study whether the use of other types of stimuli in social networks (stories, photos or videos) can achieve similar results to those achieved with GIFs.

Author Contributions

Conceptualization, M.G.-C. and I.R.-H.; methodology, I.R.-H., I.A., M.G.-C.; software, I.R.-H.; validation, I.R.-H., I.A. and M.G.-C.; formal analysis, C.C.-R.; investigation, M.G.-C.; resources, M.G.-C., I.A. and C.C.-R.; data curation, I.R.-H.; writing—original draft preparation, I.R.-H., I.A., M.G.-C.; writing—review and editing, C.C.-R.; visualization, I.R.-H.; supervision, I.A.; project administration, M.G.-C.; funding acquisition, M.G.-C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki, and approved by the Ethics Committee of UNIR Neuromarketing Laboratory (protocol code LABNMKT-001-2018; 20 April 2018).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Table 2 provides the links to the publicly archived data analysed in the study (last access on 15 May 2021). GIF1; GIF2; GIF3; GIF4; GIF5; GIF6; GIF7; GIF8; GIF9; GIF10; GIF11; GIF12; GIF13; GIF14; GIF15; GIF16.

Acknowledgments

We would like to thank Interactúa+ for providing us with the software to implement the face coder and the eye tracker.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Smith, A.K.; Bolton, R.N. The effect of customers’ emotional responses to service failures on their recovery effort evaluations and satisfaction judgments. J. Acad. Mark. Sci. 2002, 30, 5–23. [Google Scholar] [CrossRef]
  2. Mauri, M.; Cipresso, P.; Balgera, A.; Villamira, M.; Riva, G. Why is Facebook so successful? Psychophysiological measures describe a core flow state while using Facebook. Cyberpsychol. Behav. Soc. Netw. 2011, 14, 723–731. [Google Scholar] [CrossRef] [PubMed]
  3. Prescott, J. Some considerations in the measurement of emotions in sensory and consumer research. Food Qual. Prefer. 2017, 62, 360–368. [Google Scholar] [CrossRef]
  4. Bourlai, E.; Herring, S.C. Multimodal Communication on Tumblr: “I Have So Many Feels!”. In Proceedings of the 2014 ACM Conference on Web Science, Bloomington, IN, USA, 23–26 June 2014; ACM: New York, NY, USA; pp. 171–175. [Google Scholar] [CrossRef]
  5. Bakhshi, S.; Shamma, D.A.; Kennedy, L.; Song, Y.; De Juan, P.; Kaye, J.J. Fast, cheap, and good: Why animated GIFs engage us. In Proceedings of the 2016 Chi Conference on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; pp. 575–586. [Google Scholar] [CrossRef]
  6. Gygli, M.; Soleymani, M. Analyzing and predicting GIF interestingness. In Proceedings of the 24th ACM International Conference on Multimedia, Amsterdam, The Netherlands, 15–19 October 2016; pp. 122–126. [Google Scholar] [CrossRef]
  7. Driscoll, B. Sentiment analysis and the literary festival audience. Continuum 2015, 29, 861–873. [Google Scholar] [CrossRef]
  8. Turnbull, S.; Jenkins, S. Why Facebook Reactions are good news for evaluating social media campaigns. J. Directdata Digit. Mark. Pract. 2016, 17, 156–158. [Google Scholar] [CrossRef]
  9. Scheinbaum, A. The Dark Side of Social Media: A Consumer Psychology Perspective; Routledge: New York, NY, USA, 2017. [Google Scholar] [CrossRef]
  10. Kim, Y.; Kim, J.H. Using computer vision techniques on Instagram to link users’ personalities and genders to the features of their photos: An exploratory study. Inf. Process. Manag. 2018, 54, 1101–1114. [Google Scholar] [CrossRef]
  11. Anderson, K.J. Arousal and the Inverted-U Hypothesis: A Critique of Neiss’s “Reconceptualizing Arousal”. Psychol. Bull. 1990, 107, 96–100. [Google Scholar] [CrossRef]
  12. Harmon-Jones, E.; Harmon-Jones, C.; Summerell, E. On the Importance of Both Dimensional and Discrete Models of Emotion. Behav. Sci. 2017, 7, 66. [Google Scholar] [CrossRef]
  13. Izard, C.E. The many meanings/aspects of emotion: Definitions, functions, activation, and regulation. Emot. Rev. 2010, 2, 363–370. [Google Scholar] [CrossRef]
  14. Lang, P.J. The Emotion Probe: Studies of Motivation and Attention. Am. Psychol. 1995, 50, 372–385. [Google Scholar] [CrossRef]
  15. Ekman, P.; Freisen, W.V.; Ancoli, S. Facial signs of emotional experience. J. Personal. Soc. Psychol. 1980, 39, 1125–1134. [Google Scholar] [CrossRef]
  16. Damasio, A.; Carvalho, G.B. The nature of feelings: Evolutionary and neurobiological origins. Nat. Rev. Neurosci. 2013, 14, 143–152. [Google Scholar] [CrossRef]
  17. LeDoux, J.E.; Brown, R. A higher-order theory of emotional consciousness. Proc. Natl. Acad. Sci. USA 2017, 114, E2016–E2025. [Google Scholar] [CrossRef] [PubMed]
  18. Reinecke, L.; Trepte, S. Authenticity and well-being on social network sites: A two wave longitudinal study on the effects of online authenticity and the positivity bias in SNS communication. Comput. Hum. Behav. 2014, 30, 95–102. [Google Scholar] [CrossRef]
  19. Waterloo, S.F.; Baumgartner, S.E.; Peter, J.; Valkenburg, P.M. Norms of online expressions of emotion: Comparing Facebook, Twitter, Instagram, and Whatsapp. New Media Soc. 2018, 20, 1813–1831. [Google Scholar] [CrossRef]
  20. Dresner, E.; Herring, S.C. Functions of the nonverbal in CMC: Emoticons and 833 illocutionary force. Commun. Theory 2010, 20, 249–268. [Google Scholar] [CrossRef]
  21. Huffaker, D. Dimensions of leadership and social influence in online 853 communities. Hum. Commun. Res. 2010, 36, 593–617. [Google Scholar] [CrossRef]
  22. Dubovi, I.; Tabak, I. Interactions between emotional and cognitive engagement with science on YouTube. Public Underst. Sci. 2021, 0963662521990848. [Google Scholar] [CrossRef]
  23. Morgado, S.M.; Moniz, T.; Felgueiras, S. Facebook and Polícia de Segurança Pública: An exploratory study of follower’s engagement. In Marketing and Smart Technologies; Springer: Singapore, 2020; pp. 363–376. [Google Scholar]
  24. Vizcaino-Verdu, A.; Aguaded, I. Análisis de sentimiento en Instagram: Polaridad y subjetividad de cuentas infantiles. ZER Rev. Estud. Comun. Komunikazio Ikasketen Aldizka. 2020, 25. [Google Scholar] [CrossRef]
  25. Claffey, E.; Brady, M. An empirical study of the impact of consumer emotional engagement and affective commitment in firm-hosted virtual communities. J. Mark. Manag. 2019, 35, 1047–1079. [Google Scholar] [CrossRef]
  26. Zhan, M.; Tu, R.; Yu, Q. Understanding readers: Conducting sentiment analysis of Instagram captions. In Proceedings of the 2nd International Conference on Computer Science and Artificial Intelligence, Shenzhen, China, 8–10 December 2018; Association for Computing Machinery: Shenzhen, China, 2018; pp. 33–40. [Google Scholar] [CrossRef]
  27. Diyanah-Abdullah, N.S.; Asnira-Zolkepli, I. Sentiment analysis of online crowd input towards brand provocation in Facebook, Twitter, and Instagram. In Proceedings of the International Conference on Big Data and Internet of Thing, London, UK, 20–22 December 2017; BDIOT: New York, NY, USA; pp. 67–74. [Google Scholar] [CrossRef]
  28. Domingo, M.; Jewitt, C.; Kress, G. Multimodal social semiotics: Writing in online contexts. In The Routledge Handbook of Literacy Studies 2015; Rowsell, E.J., Pahl, K., Eds.; Routledge: London, UK; pp. 251–266. Available online: https://bit.ly/2sMlbOE (accessed on 1 May 2021).
  29. Kralj Novak, P.; Smailović, J.; Sluban, B.; Mozetič, I. Sentiment of emojis. PLoS ONE 2015, 10, e0144296. [Google Scholar] [CrossRef] [PubMed]
  30. Harris, J.M.; Ciorciari, J.; Gountas, J. Consumer neuroscience and digital/social media health/social cause advertisement effectiveness. Behav. Sci. 2019, 9, 42. [Google Scholar] [CrossRef]
  31. Lajante, M.; Droulers, O.; Derbaix, C.; Poncin, I. Looking at aesthetic emotions in advertising research through a psychophysiological perspective. Front. Psychol. 2020, 11, 2544. [Google Scholar] [CrossRef] [PubMed]
  32. Hernández-Fernández, A.; Mora, E.; Hernández, M.I.V. When a new technological product launching fails: A multi-method approach of facial recognition and E-WOM sentiment analysis. Physiol. Behav. 2019, 200, 130–138. [Google Scholar] [CrossRef]
  33. Mandryk, R.L.; Inkpen, K.M.; Calvert, T.W. Using psychophysiological techniques to measure user experience with entertainment technologies. Behav. Inf. Technol. 2006, 25, 141–158. [Google Scholar] [CrossRef]
  34. Ash, J. Sensation, Networks, and the GIF: Toward an Allotropic Account of Affect. Networked Affect; The MIT Press: Cambridge, MA, USA, 2015; pp. 119–134. [Google Scholar]
  35. Miltner, K.; Highfield, T. Never Gonna GIF You Up: Analyzing the Cultural Significance of the Animated GIF. Soc. Media Soc. 2017, 3, 2056305117725223. [Google Scholar] [CrossRef]
  36. Gürsimsek, Ö. Animated GIFs as vernacular graphic design: Producing Tumblr blogs. Vis. Commun. 2016, 15, 329–349. [Google Scholar] [CrossRef]
  37. Jou, B.; Bhattacharya, S.; Chang, S. Predicting viewer perceived emotions in animated gifs. In Proceedings of the 22nd ACM International Conference on Multimedia, Orlando, FL, USA, 21–25 October 2014; pp. 213–216. [Google Scholar] [CrossRef]
  38. Chen, W.; Rudovic, O.O.; Picard, R.W. Gifgif+: Collecting emotional animated gifs with clustered multi-task learning. In Proceedings of the 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII), San Antonio, TX, USA, 23–26 October 2017; pp. 510–517. [Google Scholar]
  39. Rúa Hidalgo, I.; Galmés Cerezo, M.; Espinosa Jarrín, M.D.C. El engagement y la sorpresa en la comunicación digital de las marcas: Estudio del impacto emocional de los GIFs en los usuarios de las redes sociales. Adresearch Rev. Int. Investig. Comun. 2021, 25, 26–43. [Google Scholar]
  40. Russell, J.A. A circumplex model of affect. J. Personal. Soc. Psychol. 1980, 39, 1161–1178. [Google Scholar] [CrossRef]
  41. Cohen, J. Statistical power analysis. Curr. Dir. Psychol. Sci. 1992, 1, 98–101. [Google Scholar] [CrossRef]
  42. Sands, S.F. Sample Size Analysis for Brainwave Collection (EEG) Methodologies. 2009. Available online: https://sandsresearch01.worldsecuresystems.com/assets/white-paper.pdf (accessed on 31 May 2021).
  43. Hensel, D.; Iorga, A.; Wolter, L.; Znanewitz, J. Conducting neuromarketing studies ethically-practitioner perspectives. Cogent Psychol. 2017, 4, 1320858. [Google Scholar] [CrossRef]
  44. Interactive Advertising Bureau and Elogia. Estudio Anual de Redes Sociales 2018; IAB Spain, abridged version. Available online: https://iabspain.es/estudio/estudio-anual-de-redes-sociales-2018/ (accessed on 31 May 2021).
  45. Interbrand. Interbrand Releases 2017 Best Global Brands Report. 24 September 2017. Available online: https://www.interbrand.com/newsroom/bgb-report-2017/ (accessed on 31 May 2021).
  46. McDuff, D.; El Kaliouby, R.; Picard, R.W. Crowdsourcing facial responses to online videos. IEEE Trans. Affect. Comput. 2012, 3, 456–468. [Google Scholar] [CrossRef]
  47. Bellman, S.; Wooley, B.; Varan, D. Program-ad matching and television ad effectiveness: A reinquiry using facial tracking software. J. Advert. 2016, 45, 72–77. [Google Scholar] [CrossRef]
  48. Goyal, G.; Singh, J. Minimum Annotation identification of facial affects for Video Advertisement. In Proceedings of the International Conference on Intelligent Circuits and Systems (ICICS), Phagwara, India, 20–21 April 2018; pp. 300–305. [Google Scholar]
  49. Mundel, J.; Huddleston, P.; Behe, B.; Sage, L.; Latona, C. An eye tracking study of minimally branded products: Hedonism and branding as predictors of purchase intentions. J. Prod. Brand Manag. 2018, 27, 146–157. [Google Scholar] [CrossRef]
  50. Walla, P.; Koller, M.; Brenner, G.; Bosshard, S. Evaluative conditioning of established brands: Implicit measures reveal other effects than explicit measures. J. Neurosci. Psychol. Econ. 2017, 10, 24–41. [Google Scholar] [CrossRef]
  51. Weibel, D.; di Francesco, R.; Kopf, R.; Fahrni, S.; Brunner, A.; Kronenberg, P.; Wissmath, B. TV vs. YouTube: TV advertisements capture more visual attention, create more positive emotions and have a stronger impact on implicit long-term memory. Front. Psychol. 2019, 10, 626. [Google Scholar] [CrossRef]
  52. Guerreiro, J.; Rita, P.; Trigueiros, D. Attention, emotions and cause-related marketing effectiveness. Eur. J. Mark. 2015, 49, 1728–1750. [Google Scholar] [CrossRef]
  53. Reimann, M.; Castano, R.; Zaichkowsky, J.; Bechara, A. How we relate to brands: Psychological and neurophysiological insights into consumer-brand relationships. J. Consum. Psychol. 2012, 22, 128–142. [Google Scholar] [CrossRef]
Figure 1. Visualization of the methodology of phases 1 and 2.
Figure 2. Visualization of the most emotionally charged nouns and adjectives extracted from the totality of the GIF comments.
Table 1. Statistic results of the variables in Phase 1.

| Variables | Mean | Standard Deviation | Standard Error |
|---|---|---|---|
| Implicit emotional valence (VIE) | 0.1480 | 0.0356 | 0.0084 |
| Engagement (Eg) | 0.0025 | 0.0309 | 0.0073 |
Table 2. Types of emotions registered in each GIF by the face coder.

| Links 1 | Happiness | Surprise | Anger | Disgust | Fear | Sadness |
|---|---|---|---|---|---|---|
| GIF1 | 0.0307 | 0.005 | 0.0914 | 0.0487 | 0.0097 | 0.1062 |
| GIF2 | 0.0579 | 0.005 | 0.0903 | 0.0310 | 0.0119 | 0.1499 |
| GIF3 | 0.0475 | 0.0121 | 0.0852 | 0.0193 | 0.0198 | 0.1304 |
| GIF4 | 0.0414 | 0.0042 | 0.083 | 0.0272 | 0.0059 | 0.1455 |
| GIF5 | 0.029 | 0.047 | 0.0793 | 0.0092 | 0.0162 | 0.1263 |
| GIF6 | 0.0415 | 0.0536 | 0.0827 | 0.0169 | 0.0093 | 0.1242 |
| GIF7 | 0.0459 | 0.0705 | 0.0649 | 0.0121 | 0.0053 | 0.123 |
| GIF8 | 0.0519 | 0.0628 | 0.0977 | 0.0200 | 0.0096 | 0.1499 |
| GIF9 | 0.0371 | 0.0059 | 0.0628 | 0.0391 | 0.0078 | 0.1013 |
| GIF10 | 0.0528 | 0.0008 | 0.0802 | 0.0238 | 0.0076 | 0.1395 |
| GIF11 | 0.0398 | 0.0665 | 0.0722 | 0.0206 | 0.0322 | 0.1245 |
| GIF12 | 0.0409 | 0.0383 | 0.0800 | 0.0242 | 0.0097 | 0.1415 |
| GIF13 | 0.0576 | 0.0027 | 0.0506 | 0.0189 | 0.0056 | 0.1445 |
| GIF14 | 0.0455 | 0.0029 | 0.0884 | 0.0428 | 0.0039 | 0.1305 |
| GIF15 | 0.0353 | 0.0129 | 0.0778 | 0.0361 | 0.0086 | 0.1136 |
| GIF16 | 0.0394 | 0.0343 | 0.0644 | 0.0083 | 0.0069 | 0.1248 |
| GIF17 | 0.0184 | 0.0328 | 0.0817 | 0.0290 | 0.0366 | 0.1396 |
| GIF18 | 0.0422 | 0.0152 | 0.1021 | 0.0347 | 0.0082 | 0.1379 |
| Mean | 0.0419 | 0.0263 | 0.0797 | 0.0257 | 0.0119 | 0.1307 |
1 Links to GIFs for inclusion in Table 2 were obtained on 15 May 2021.
Table 3. Variables, dimensions and tools of Phases 1 and 2.

| Variable | Dimension | Tool |
|---|---|---|
| Implicit Emotional Valence (VIE) | Value of emotion | Face coder |
| Engagement (Eg) | Emotional state | Face coder + GSR |
| Explicit Emotional Valence (VEE) | Value of emotion | Twinword |
| Difference between VIE and VEE (VD) | Difference between valences | VEE − VIE |
| Comment length (LgC) | Number of elements that appear in the comment | Element Counter (own design) |
| Proportion of Emojis (Pemj) | Percentage of emojis over total number of elements in a comment | Element Counter (own design) |
| Type of basic emotion | Identification of basic emotions | Twinword Emotion Analysis API (explicit method); Face coder (implicit method) |
Table 4. Result for normality test.

| Data Origin | Shapiro–Wilk Statistic | Sig. |
|---|---|---|
| Biometric tool (VIE) | 0.969 | 0.779 |
| Instagram comments (VEE) | 0.943 | 0.323 |

Normality was tested at the 0.05 significance level (bilateral).
Table 5. Statistical results of emotional valences.

| Data Origin | Average | Standard Deviation | Standard Error |
|---|---|---|---|
| Biometric tool (VIE) | 0.1480 | 0.0356 | 0.0084 |
| Instagram comments (VEE) | 0.4068 | 0.0305 | 0.0072 |
Table 6. Correlation between the variable Implicit–Explicit Valence Difference and Engagement.

|  | Engagement (Eg) | Implicit–Explicit Valence Difference (VD) |
|---|---|---|
| Engagement (Eg): Pearson's correlation | 1 | −0.546 * |
| Engagement (Eg): Sig. (bilateral) |  | 0.019 |
| Implicit–Explicit Valence Difference (VD): Pearson's correlation | −0.546 * | 1 |
| Implicit–Explicit Valence Difference (VD): Sig. (bilateral) | 0.019 |  |

* The correlation is significant at the 0.05 level (bilateral).
Table 7. Correlation between the variable Explicit–Implicit Valence Difference and the percentage of emojis in the comment.

|  | Percentage of Emojis in the Comment (Pemj) | Explicit–Implicit Valence Difference (VD) |
|---|---|---|
| Percentage of emojis in the comment (Pemj): Pearson's correlation | 1 | 0.631 * |
| Percentage of emojis in the comment (Pemj): Sig. (bilateral) |  | 0.005 |
| Explicit–Implicit Valence Difference (VD): Pearson's correlation | 0.631 * | 1 |
| Explicit–Implicit Valence Difference (VD): Sig. (bilateral) | 0.005 |  |

* The correlation was significant at the 0.01 level (bilateral).
Table 8. Number of GIFs containing each type of emotion by measurement.

|  | Happiness | Surprise | Anger | Disgust | Fear | Sadness |
|---|---|---|---|---|---|---|
| Biometric tool | 18 | 18 | 18 | 18 | 18 | 18 |
| Instagram comments | 18 | 12 | 0 | 0 | 6 | 1 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.