Article

The Response to Impactful Interactivity on Spectators’ Engagement in a Digital Game

by Raphaëlle Brissette-Gendron 1,*, Pierre-Majorique Léger 1, François Courtemanche 1, Shang Lin Chen 1, Marouane Ouhnana 2 and Sylvain Sénécal 1
1 Tech3Lab, HEC Montréal, Montréal, QC H3T 2A7, Canada
2 Moment Factory, Montréal, QC H2V 4H8, Canada
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2020, 4(4), 89; https://0-doi-org.brum.beds.ac.uk/10.3390/mti4040089
Submission received: 16 October 2020 / Revised: 23 November 2020 / Accepted: 1 December 2020 / Published: 4 December 2020

Abstract

As gaming spectatorship has become a worldwide phenomenon, keeping the spectator in mind while designing games is becoming more important. Here, we explore the factors that influence spectators’ engagement. Using GRiD Crowd, a game akin to life-size Pong, we tested different levels of spectator influence on the game and analyzed their impact on engagement via arousal measures. Spectators influenced the game via smartphone, and 78 participants were tested in different audience compositions (alongside friends or strangers). We found that when spectators had an impact on the game, higher levels of emotional arousal were recorded, which in turn increased engagement. These results suggest a design approach for game designers who wish to engage their spectatorship, a segment of their target market that is becoming impossible to ignore.

1. Introduction

With an estimated 303 million esports spectators worldwide in 2020 (a 163% increase from 2015), the importance of the spectator in digital games is becoming undeniable [1]. Over the last two decades, the human–computer interaction (HCI) literature has shown a shift in interest from the user to the spectator. This shift is also seen in the industry, with broadcasters such as Twitch recently allowing viewers to vote on what happens next in a game using the chat function. It has been observed in interactive installations in public spaces [2], in video games [3], and in games in public spaces [4,5]. The literature on digital games suggests that spectators can play a variety of roles that influence, in different ways, the experience of players [6]. Downs et al. recommend that game designers be more aware of the importance of the spectator when creating digital games. Moreover, the importance of finding new ways to design user interactions in digital environments has been emphasized in recent research [7]. Research gives us a better idea of how spectators behave when watching other people interact with systems [3,6,8]. Although these studies about digital games provide information about behavior, they do not recommend a specific type of design that would enhance spectators’ engagement with the game. This is why we investigated the factors of engagement of spectators of a social digital game.
To investigate these factors of engagement, we used GRiD Crowd [9], a game akin to life-size Pong [10]. The game is played with one player on each side of the playground and with spectators on the sidelines. In the version used in this research, the spectators have access to an interface with which they can modify the gameplay. Audience interaction has been studied in the entertainment industry [11], and interactivity has been found to affect engagement [12]. We manipulated the spectators’ use of the interactive feature to investigate whether influencing the game would increase their engagement. We hypothesize that adding interactivity through the use of a smartphone should increase arousal, which, in turn, should increase engagement.
There were 78 participants in our study. A within-subject design was used for the interactivity condition, with each member of the public taking part in three games in total. They were randomly assigned to two games with access to their smartphone to influence the gameplay and one game without access to their smartphone. A subsample of spectators was randomly selected to assess their physiological arousal.

1.1. Spectator Experience

A very influential study on spectators’ experience was carried out by Reeves et al. [8]. They raise the importance of manipulations and effects in relation to interactions with a digital system. Manipulations are the actions taken by the user that are perceived by the system. Effects are the system’s responses to these manipulations. The extent to which these two phenomena are perceived by the spectator influences their experience. Their taxonomy lists four types of design that can influence the spectators’ appreciation: “secretive” design, where the spectators are aware of neither the manipulations nor their effects; “expressive” design, where the spectators are aware of both the manipulations and the effects; “magical” design, where only the effects are revealed; and, finally, “suspenseful” design, where the manipulations are perceivable but not the effects [8]. Cheung and Huang built on that work with a shift from interactions in public space to games in the private space. They suggest that information asymmetry between the player and the spectator is key to creating suspense in the spectator’s experience [3]. Tekin and Reeves also studied games played in the home [13]. In light of their research, they invite game designers to offer information that is only available to spectators but that contributes to the game [13]. This recommendation comes from their identification of a “dual vision”, meaning that the spectator can analyze not only the player’s moves but also the game itself. The spectator becomes an “assistant” to the player by giving playing advice. This role is similar to the ephemeral “coach” role described by Downs et al. after observing spectators giving verbal advice to players in a study about social video gaming [6]. Downs et al. then recommended game designs where spectators identified as “coaches” could have access to information that the players do not have.

1.2. Engagement

User engagement is a concept that has been omnipresent in the HCI literature for the last two decades [14]. Industry leaders are also trying to find new ways to gain sustained interest from their users in a highly competitive market. Many authors have tried to define and measure engagement [14]. One area that has been important in engagement research is education. Student engagement is defined as having three components: behavior, emotion, and cognition [15]. This conceptualization has been used in other areas such as digital systems. For O’Brien et al., engagement is “a quality of user experience characterized by the depth of an actor’s cognitive, temporal, affective, and behavioral investment when interacting with a digital system” [14].
Because engagement is a multidimensional construct, it is difficult to simply ask users if they felt “engaged”. This term may be confusing and some people mistake it for appreciation or simply do not know how to rate their level of engagement [16]. To resolve this, many authors have used multiple dimensions to measure engagement. For instance, Webster and Ho used the following dimensions: challenge, feedback, control, variety, attention focus, curiosity, intrinsic interest, and overall engagement [17]. Mayes and Cotton suggested five factors: interest, authenticity, curiosity, involvement, and fidelity [18]. Brockmyer et al. also used multiple engagement dimensions (immersion, presence, flow, and psychological absorption) to study the impact of violent video games [19].
More recently, O’Brien et al. proposed the User Engagement Scale Short Form (UES-SF) [14]. It was based on the Long Form (UES-LF), which has been used in many fields of research, including video games [20]. The robustness of the UES-LF was re-evaluated in 2016 with regard to its dimensionality, reliability, validity, and generalizability [21]. This re-evaluation found that many authors did not use the scale in its entirety, which led to the creation of a shorter version. Since then, authors have used it to measure engagement in interactive media [22], brand recognition [23], chatbot interaction [24], and other digital systems.
The scale has been validated for Western adults in the context of evaluating digital technologies, and its short length makes it suitable for within-subject studies [14]. It can be administered more than once according to the needs of the research [14].
Following O’Brien’s definition of engagement, the UES-SF is multidimensional. It comprises perceived usability, aesthetic appeal, focused attention, and reward [14]. The perceived usability subscale covers the “negative affect experienced as a result of the interaction and the degree of control and effort expended” [11]. Aesthetic appeal is the “attractiveness and visual appeal of the interface” [11]. Focused attention means “feeling absorbed in the interaction and losing track of time” ([11], p. 30). Lastly, reward is “a single set of items made up of the endurability, novelty and felt involvement components in the original UES” [11].
The use of a combination of self-report questionnaires and physiological measures is a common and validated approach for measuring game experience [25]. Engagement has been found to correlate with physiological arousal [26], which refers to the activation of the body’s systems as opposed to sleepiness [27]. A widely used method of measuring physiological arousal in games is electrodermal activity (EDA) [28]. EDA is measured through the conductivity of the skin, which varies with sweat gland activity [29]. This gives access to real-time emotional variations. It also correlates with perceived measures of arousal, which can be assessed with a self-report scale [30].
While this combination of methods is common for measuring player experience, it has rarely been used to study spectators’ experience. One study that did use it was that of Latulipe et al., who found a strong correlation between self-reported arousal and physiological arousal [16]. Their findings also validated that spectators’ engagement is reflected in physiological measures of arousal.

1.3. Hypothesis Development

Building upon research that used simple gameplay to study audience participation [31] and studies on different technologies that can trigger audience participation [32], we used the game mechanics of the well-known arcade game Pong [10]. One of the main conclusions of these previous studies is that “the greatest challenge lies not in developing the technology for audience interaction, but in designing engaging activities” [32]. That is why we focused on finding the most “engaging” design for spectators. The literature suggests that promoting social interactions or giving many interactive options can lead to higher engagement [12]. As Sid Meier, a renowned video game designer, said, “a (good) game is a series of interesting choices” [33]. This suggests that making choices or influencing a game are crucial components of a good gaming experience. A previous study that also used the game mechanics of Pong found that if users are not aware that the effects of their actions are linked to the gameplay, they will not continue playing [32]. A study about museums and learning also suggests that adding an overt interactive component to a digital experience enhances cognitive engagement [34]. Other studies in the field of learning and education found that an important correlate of engagement is arousal [26]. This was also stated in a study about audience engagement [16], which suggests that temporal physiological arousal reflects audience engagement. Because these studies highlight the relationships between interactivity and engagement and between arousal and engagement, our research investigates the relationship between interactivity and engagement through the mediation of arousal. Thus, we hypothesized that giving spectators access to interactivity would increase their arousal, which, in turn, would increase their engagement.

2. Materials and Methods

2.1. Experimental Design and Sample

The version of Pong [10] used for this research was developed by Moment Factory [9] and has been scaled to be human-sized. GRiD Crowd [9], as this version is named, is projected on the ground, and the players use their bodies as controllers. Movement detection technology allows the paddle to follow each of the two players, who are situated at opposite sides of the playground. An image of the two players is presented in Figure 1. As opposed to Loren and Rachel Carpenter’s and Maynes-Aminzade’s experiments, which used similar game mechanics, here the spectators are not actively playing the game. Rather, they modify the experience of the players by using an interface that either facilitates or hinders the gameplay. They are given access to a mobile web application on their smartphone on which they can vote for certain power-ups or obstacles (see the vote list below). The option with the most votes wins. Spectators do not become players when they are given a means to interact; instead, they influence the parameters of the game. In the terms of Reeves et al., the spectators manipulate the digital system to send effects to the game [8]. The players are only aware of the effects, not the manipulations used by the spectators, similar to the “magical” design concept, but pertaining to the players [8].
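To make the paddle-follow mechanic concrete, the following is a minimal sketch of how a detected player position could drive the paddle on that player’s side of the playground; the dimensions, names, and coordinate convention are illustrative assumptions, not Moment Factory’s implementation:

```python
# Minimal sketch of the paddle-follow mechanic: the paddle on each side
# tracks the lateral (x) position of its player, clamped to the playground.
# Names, units, and dimensions are illustrative, not the GRiD Crowd code.

PLAYGROUND_WIDTH = 10.0   # metres (hypothetical)
PADDLE_WIDTH = 2.0        # metres (hypothetical; votes can enlarge it)

def paddle_position(player_x: float, paddle_width: float = PADDLE_WIDTH) -> float:
    """Centre the paddle on the player's x position, keeping it inside the field."""
    half = paddle_width / 2.0
    return min(max(player_x, half), PLAYGROUND_WIDTH - half)

# Example: a player standing near the right edge
print(paddle_position(9.8))  # -> 9.0, paddle stops at the playground boundary
```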
There were 8 sessions with a maximum of 12 people each, and each session took part in 3 games. We limited each session to three games so that the experiment lasted no more than an hour and did not lose the attention of our participants. Two people per group were randomly assigned as players. Two or three spectators per group (18 in total) were randomly selected to wear a device assessing their physiological arousal (i.e., EDA).
A within-subject design was used for the interactivity variable in order to assess the difference between having an impact on the game and not. Each member of the public was randomly assigned to two games with access to their smartphone to influence the gameplay and one game without access to their smartphone. When they had access to their smartphone, votes appeared at regular intervals and the spectators had 10 s to vote. An example of a vote is shown in Figure 2. The other votes that appeared were:
  • Fast ball or slow ball?
  • Enlarge paddle of blue player or red player?
  • Accelerate the ball for blue player or red player?
  • Bigger or smaller ball?
  • Which one do you prefer between Godzilla and King Kong?
When a vote pertains to two players, they both experience the effect. For enlarging the paddle or accelerating the ball, only one player (blue or red) experiences the effect, with the colours of each player being indicated on the projected ground. When a spectator answers, they see how many people voted for the same choice as them, as shown in Figure 3. Each spectator voted approximately 5 times per game, which lasted for 5 minutes on average.
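As an illustration of the voting mechanic described above, the following minimal sketch tallies a single 10 s voting round and returns the winning option; the data structures and function names are illustrative assumptions, not the actual GRiD Crowd web application:

```python
from collections import Counter

# Minimal sketch of one voting round: each spectator submits one choice within
# the 10 s window, and the option with the most votes is applied to the game.
# Structure and names are illustrative, not the GRiD Crowd implementation.

def tally_round(votes: list[str]) -> tuple[str, int]:
    """Return the winning option and its vote count for one round."""
    counts = Counter(votes)
    winner, n_votes = counts.most_common(1)[0]
    return winner, n_votes

# Example round: "Fast ball or slow ball?"
round_votes = ["fast", "slow", "fast", "fast", "slow"]
winner, n = tally_round(round_votes)
print(f"Winning option: {winner} ({n} votes)")  # Winning option: fast (3 votes)
```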
There were a total of 78 spectators in our study. The majority of the participants were aged between 18 and 25 years old. None of them had previously played the game tested (GRiD Crowd). Each data collection session consisted of either a full group of strangers or a full group of friends. The strangers were recruited through the research partner’s official social media page and our institution’s participant panel; we then ensured that they did not know each other. Friends were recruited among social groups known by the research team, and we verified that they were friends when booking them into time slots. All participants signed a participation consent form. The research was approved on 18 April 2019 by our institution’s ethics review board, the HEC Montréal Comité d’éthique de la recherche (project code: 2020-3490). Participants were compensated with a draw for two show tickets. To be part of the research, participants needed to be 18 years old or over, be able to stand for 20 minutes, possess a recent smartphone (less than 5 years old), not have skin allergies or sensitivities, not have a pacemaker, not suffer from epilepsy, and not have a diagnosed health problem.

2.2. Procedure

After signing the consent form, participants filled in a paper questionnaire. The participants assigned the numbers 1 or 2 were provided with a physiological device installed on the palm of their hand. This assignment was random, since their numbers referred to the numbers on their jerseys and they could choose any jersey. Then, all participants entered the studio where the game was played and were free to position themselves anywhere around the playground. They were asked to relax and fixate on a point in front of them for 2 min; this was used as a baseline for the physiological devices. They were then asked to imagine that they were walking by this game in a public space and had time to participate. They were told it was similar to ping-pong and that they would easily understand the gameplay. Two people were randomly assigned as players and were asked to stand at either side of the playground (Figure 1). The researcher then stated which spectators would use their smartphone for that game; they accessed the voting interface through a short URL. Each game lasted for 3 points, with a point registered when the ball missed the paddle. Two further games were then played with the same players, but different participants were told they could use their smartphone. Between each game, all spectators filled in a short questionnaire, the Self-Assessment Manikin (SAM) Scale, on their smartphones. At the end of the three games, participants filled in two other paper questionnaires, the User Engagement Scale and a qualitative questionnaire, and were then free to leave. Questions were asked in the form of statements that participants answered using a 5-point Likert scale. Some statements were: “I lost myself in this experience”, “I was absorbed in the experience”, and “This experience appealed to my senses”. The participants who had the physiological device stayed to answer interview questions. They were asked to talk about their appreciation of the experience with and without the smartphone, which was also addressed in the qualitative questionnaire.

2.3. Measures and Apparatus

Since familiarity may have an effect on EDA [35], we controlled for the composition of the audience, with four sessions composed of friends and four sessions composed of strangers. The game was projected on the ground via Barco F90-W13 projectors. A Velodyne LiDAR VLP-16 (Morgan Hill, CA, USA) was used to detect the movements of the players, which made the paddle move accordingly. The spectators used their own smartphones to vote on the power-ups and obstacles.
EDA was measured with a portable apparatus. Previous research using similar apparatuses showed that ecological validity was not affected, owing to the non-intrusive nature of the equipment [36,37,38]. Portability was necessary since the participants were standing and moving. The device consisted of a BITalino (r)evolution Freestyle Kit (PLUX Wireless biosignals S.A.) [39] installed in a 3D-printed box that hung on the belt of the participants (see Figure 4). Three cameras filmed the right side, the left side, and an overview of the playground. Two of the cameras were Sony AS7 models and the other was a GoPro 5. One of the three cameras recorded sound with a plugged-in Rode microphone. Every participant had their number on their chest.
To synchronize each game to variations in the EDA recordings, we used a synchronization technique developed by Courtemanche et al. Bluetooth Low-Energy (BLE) signals were sent from a sync box simultaneously to the EDA recording 3D-printed boxes, and to light boxes. The signals were incrementing numbers that started at one. The light boxes showed the synchronization numbers to the cameras, and the same numbers were also logged in the EDA data file (see Figure 5).
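To illustrate how the shared synchronization numbers can be used after recording, the sketch below estimates the clock offset between the EDA log and the video from the markers visible in both streams and then maps EDA timestamps onto the video timeline; the linear offset model and the variable names are illustrative assumptions, not the exact procedure of Courtemanche et al.:

```python
# Minimal sketch of post hoc alignment using the shared sync numbers:
# the same incrementing marker (1, 2, 3, ...) appears in the EDA log and,
# via the light box, in the video. A simple constant-offset model maps EDA
# time to video time. Names and the offset model are illustrative assumptions.

def estimate_offset(eda_marker_times: dict[int, float],
                    video_marker_times: dict[int, float]) -> float:
    """Average clock offset (video time minus EDA time) over shared markers."""
    shared = sorted(set(eda_marker_times) & set(video_marker_times))
    diffs = [video_marker_times[m] - eda_marker_times[m] for m in shared]
    return sum(diffs) / len(diffs)

def eda_to_video_time(t_eda: float, offset: float) -> float:
    """Map an EDA timestamp onto the video timeline."""
    return t_eda + offset

# Example: markers 1-3 seen at these times (seconds) in each recording
eda_markers = {1: 12.0, 2: 72.1, 3: 132.0}
video_markers = {1: 45.5, 2: 105.6, 3: 165.5}
offset = estimate_offset(eda_markers, video_markers)
print(round(eda_to_video_time(200.0, offset), 1))  # EDA t=200 s -> ~233.5 s in video
```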
A baseline of two minutes was recorded before starting the game recordings. Physiological measures were taken from a total of 18 participants (23% of all participants). However, the recordings of 6 of these 18 participants contained too many movement artefacts to be analyzed and were thus discarded.
Arousal can also be confounded with frustration or disorientation when measured with EDA [40]. To avoid misinterpreting arousal, the literature suggests crossing the data with questionnaires and interviews to understand why variance may occur [29]. Therefore, combining perceived arousal, measured with a self-report scale [29], and lived arousal, measured with physiological data, contributes to more accurate results [16]. Only the spectators’ experience was measured; the players were not asked to fill in questionnaires and were not assigned a physiological device.
The arousal dimension of the Self-Assessment Manikin (SAM) Scale was used to assess perceived arousal [41]. Following the procedure used in a study of games in arcade halls [42], the SAM Scale was administered directly after each game, using a link that participants could access on their smartphones. They were asked to answer on a visual 9-point Likert scale ranging from calm to excited.
After the three games ended, the User Engagement Scale Short Form [14] was administered two times: one time thinking about the experience with the smartphone and one time thinking about the experience without the smartphone. A 5-point Likert scale ranging from “Strongly agree” to “Strongly disagree” was used.

2.4. Analysis

The reliability of the UES-SF was assessed. Its Cronbach’s alpha was 0.71, which is acceptable [43]. Thus, the overall mean of all scale items was used in the analysis as the dependent variable.
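As an illustration of this scoring step, the following is a minimal sketch, assuming the UES-SF responses are stored as a participants-by-items matrix of Likert ratings (an assumption about the data layout, not the authors’ actual code):

```python
import numpy as np

# Minimal sketch of the reliability check and scoring described above,
# assuming the UES-SF items are columns of a (participants x items) array.

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def ues_score(items: np.ndarray) -> np.ndarray:
    """Overall engagement score per respondent: mean of all scale items."""
    return items.mean(axis=1)

# Example with toy 5-point Likert data (4 respondents x 3 items)
responses = np.array([[4, 5, 4], [3, 3, 2], [5, 4, 5], [2, 2, 3]])
print(round(cronbach_alpha(responses), 2))  # ~0.89 on this toy data
print(ues_score(responses))                 # per-respondent engagement means
```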
EDA values were standardized and then baselined. To standardize, we subtracted the overall EDA mean from each EDA value and divided the result by the overall standard deviation, both computed over the entire dataset [44]. For the baseline correction, we subtracted the mean of the baseline EDA from each standardized EDA value, where this mean was computed from the baseline data of the participant in question.
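The two preprocessing steps can be sketched as follows for a single participant; the array layout, the per-participant handling, and the names are our assumptions for illustration only:

```python
import numpy as np

# Minimal sketch of the EDA preprocessing described above, for one participant.
# Step 1: z-score standardization using the mean and SD of the full EDA series.
# Step 2: baseline correction by subtracting the mean of the (standardized)
# baseline segment. Array names and layout are illustrative assumptions.

def preprocess_eda(eda: np.ndarray, baseline_mask: np.ndarray) -> np.ndarray:
    """Return standardized, baseline-corrected EDA for one participant."""
    z = (eda - eda.mean()) / eda.std(ddof=1)   # standardize over the whole series
    return z - z[baseline_mask].mean()         # subtract the baseline mean

# Example: 2 min baseline followed by gameplay (toy values, arbitrary units)
eda = np.array([0.8, 0.9, 0.85, 1.4, 1.6, 1.5, 1.7])
baseline_mask = np.array([True, True, True, False, False, False, False])
print(np.round(preprocess_eda(eda, baseline_mask), 2))
```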
Baron and Kenny’s procedure [45] was used to assess the mediation role of arousal in the relationship between interactivity and spectator engagement. We tested two mediation models, one for each type of arousal: physiological and self-reported. Three steps are required for this procedure. First, we tested the relationship between the independent variable and the mediators. Second, we tested the relationship between the independent and dependent variables. Third, we assessed the relationship between the combination of the independent variable and the mediators, and the dependent variable. A linear regression with a random intercept and a two-tailed level of significance was used to detect the relationships between independent and dependent variables and the possible mediators.
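A minimal sketch of these three steps with a random intercept per participant is given below, using statsmodels mixed linear models; the column names (participant, interactivity, arousal, engagement) and formulas reflect our assumptions about the dataset rather than the authors’ exact analysis scripts:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Minimal sketch of the Baron and Kenny (1986) mediation steps with a random
# intercept per participant, as described above. Column names are assumptions.

def baron_kenny(df: pd.DataFrame, mediator: str = "arousal"):
    # Step 1: interactivity -> mediator
    m1 = smf.mixedlm(f"{mediator} ~ interactivity", df, groups=df["participant"]).fit()
    # Step 2: interactivity -> engagement
    m2 = smf.mixedlm("engagement ~ interactivity", df, groups=df["participant"]).fit()
    # Step 3: interactivity + mediator -> engagement
    m3 = smf.mixedlm(f"engagement ~ interactivity + {mediator}", df,
                     groups=df["participant"]).fit()
    return m1, m2, m3

# Usage (assuming one row per spectator per game, hypothetical file name):
# df = pd.read_csv("spectator_data.csv")
# for model in baron_kenny(df, mediator="perceived_arousal"):
#     print(model.summary())
```

Mediation is then judged from the fixed-effect coefficients and two-tailed p-values of the three fitted models, in line with the three relationships described above.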

3. Results

Table 1 shows the perceived arousal (self-reported), physiological arousal (EDA), and engagement means per experimental condition. Perceived arousal was measured on a nine-point Likert scale and engagement on a five-point Likert scale. Figure 6 is a visual representation of our model with the p-values for each variable relationship.
In the first step of the Baron and Kenny procedure, the results suggest that there is a significant relationship between interactivity and the two arousal mediators (perceived arousal: β = 0.927, t(150) = 4.42, p < 0.001, 95% CI (0.58, 1.274) and physiological arousal: β = 0.172, t(3666) = 5.16, p < 0.001, 95% CI (0.117, 0.227)). In the second step, results suggest that there is also a significant relationship between interactivity and engagement (β = 0.171, t(153) = 3.58, p < 0.001, 95% CI (0.092, 0.250)). Third, the relationship between interactivity and engagement remains significant (β = 0.0746, t(147) = 5.08, p = 0.039, 95% CI (0.050, 0.099)) when perceived arousal is included in the model, suggesting a partial mediation of perceived arousal. In addition, the relationship between interactivity and engagement remains significant (β = −0.1591, t(3665) = −26.06, p < 0.001, 95% CI (−0.17, −0.149)) when physiological arousal is included in the model, again suggesting a partial mediation of physiological arousal. These results support our hypothesis.
The control variable, which was familiarity (with friends or strangers), was not significant.

4. Discussion

We found that physiological and perceived arousal were higher with interactivity, which, in turn, increased engagement. This means that when the spectators could influence the game, their arousal was higher, which increased their engagement in the game.
Our first contribution is responding to Downs et al.’s [6] and Tekin and Reeves’ [13] suggestion to offer information that is only available to the spectator but also contributes to the game. To analyze the potential benefits of this suggestion, we used Maynes-Aminzade’s focus on engagement and found that it resulted in an increase in engagement [32]. We also build on Reeves et al.’s taxonomy, which focused on revealing and hiding information from the spectator. According to them, hiding or revealing certain manipulations and effects on the player can influence the spectator’s experience with the digital system. In addition, we suggest that hiding some elements of the spectator’s manipulations is a good way to enhance the spectator’s engagement [8]. Further research could investigate whether having spectators influence the game also changed the players’ experience.
Second, our results are also in agreement with the museum engagement literature [34] and the player engagement literature [12], which state that interactivity is a factor of engagement, by showing that this can also apply to spectators of a game.
Third, building on the performing arts audience literature that states that arousal is a reflection of engagement, we further show the underlying process of the impact of interactivity on engagement for spectators [16]. Being active rather than passive, or activation generated by interactivity, was spontaneously mentioned by 24% of the participants when asked why they preferred the experience with smartphone interactivity—again suggesting that arousal was an important factor for the participants.
These findings could also be useful for game designers that wish to create games that are engaging for the players but also their spectators. As esport spectatorship is growing in popularity, industry leaders will need to take into account this segment of their target market. Adding interactivity could mean higher levels of engagement for a larger number of people.
It is important to note that the survey-based method is subjective and relies on the memory of the participants. Because the UES-SF questions were asked after the three games ended, some information may have been lost [25]. Having participants fill in the questionnaire on their smartphone between games might have prevented such memory effects.
This study could benefit from a new data collection in an actual public space where people come and go without a role being imposed upon them. This could take into account the importance of an organic transition from spectator without a smartphone, to spectator with a smartphone, to player, as mentioned in other terms by Wouters et al. [2]. This would also give the study more ecological validity, since the game is intended to be deployed in a public space.

5. Conclusions

Spectator engagement is highly relevant to the gaming industry, which is facing the growing popularity of esport spectatorship. Spectator involvement in games has been increasingly studied in the literature over the past two decades. As Downs et al. [6] and Tekin and Reeves [13] suggested, we gave some information that was only accessible to the spectator and found that, in our particular setting, spectators felt more arousal and therefore more engagement in a game they were watching when they could influence its parameters. Further research could take into account the motivations of the spectators to identify whether there is a difference between helping and hindering the players’ game, and also at what point these effects on the game put the spectator in the position of a gamer. Many other factors, such as demographics, could be assessed to determine what influences engagement. Our aim with this research is to suggest a way to design experiences that take into account the engagement of the spectator. As the importance of designing new forms of interaction in digital contexts has been emphasized in the recent literature [7], our study opens up a world of opportunities in the types of impacts spectators could have on games.

Author Contributions

Conceptualization, R.B.-G., P.-M.L., M.O. and S.S.; data curation, R.B.-G.; formal analysis, R.B.-G., P.-M.L., S.L.C. and S.S.; funding acquisition, R.B.-G. and P.-M.L.; investigation, R.B.-G.; methodology, R.B.-G., P.-M.L., F.C., S.L.C. and S.S.; project administration, R.B.-G.; resources, R.B.-G., P.-M.L., F.C. and S.S.; software, R.B.-G., P.-M.L., F.C. and S.S.; supervision, P.-M.L. and S.S.; validation, R.B.-G., P.-M.L., S.L.C. and S.S.; visualization, R.B.-G.; writing—original draft, R.B.-G.; writing—review and editing, R.B.-G., P.-M.L., F.C., S.L.C., M.O. and S.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was made possible because of the MITACS “acceleration” funding program.

Acknowledgments

We thank Moment Factory for letting us use their game “GRiD Crowd”.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. Warman, P. 2017 Newzoo Global eSports Market Report; Newzoo: Amsterdam, The Netherlands, 2017; p. 25. [Google Scholar]
  2. Wouters, N.; Downs, J.; Harrop, M.; Cox, T.; Oliveira, E.; Webber, S.; Vetere, F.; Vande Moere, A. Uncovering the honeypot effect: How audiences engage with public interactive systems. In Proceedings of the 2016 ACM Conference on Designing Interactive Systems, Brisbane, QLD, Australia, 4–6 June 2016; pp. 5–16. [Google Scholar] [CrossRef]
  3. Cheung, G.; Huang, J. Starcraft from the stands: Understanding the game spectator. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada, 7–12 May 2011; pp. 763–772. [Google Scholar] [CrossRef]
  4. Benford, S.; Crabtree, A.; Reeves, S.; Sheridan, J.; Dix, A.; Flintham, M.; Drozd, A. Designing for the opportunities and risks of staging digital experiences in public settings. In Proceedings of the Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 22–27 April 2006; pp. 427–436. [Google Scholar] [CrossRef]
  5. Tan, L.; Chow, K.K.N. An embodied approach to designing meaningful experiences with ambient media. Multimodal Technol. Interact. 2018, 2. [Google Scholar] [CrossRef] [Green Version]
  6. Downs, J.; Vetere, F.; Smith, W. Differentiated participation in social videogaming. In Proceedings of the OzCHI ’15: Proceedings of the Annual Meeting of the Australian Special Interest Group for Computer Human Interaction, Parkville, VIC, Australia, 7–10 December 2015; pp. 92–100. [Google Scholar] [CrossRef]
  7. Rubio-Tamayo, J.L.; Barrio, M.G.; García, F.G. Immersive environments and virtual reality: Systematic review and advances in communication, interaction and simulation. Multimodal Technol. Interact. 2017, 1, 21. [Google Scholar] [CrossRef] [Green Version]
  8. Reeves, S.; Benford, S.; O’Malley, C.; Fraser, M. Designing the spectator experience. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI’05), Portland, OR, USA, 2–7 April 2005; pp. 741–750. [Google Scholar]
  9. GRiD Crowd; Moment Factory: Montréal, QC, Canada, 2016.
  10. Alcorn, A. Pong; Atari, Inc.: Sunnyvale, CA, USA, 1972. [Google Scholar]
  11. Portalés, C.; Casas, S.; Vidal-González, M.; Fernández, M. On the use of ROMOT—A robotized 3D-movie theatre—To enhance romantic movie scenes. Multimodal Technol. Interact. 2017, 1, 7. [Google Scholar] [CrossRef] [Green Version]
  12. Rozendaal, M.C.; Braat, B.A.L.; Wensveen, S.A.G. Exploring sociality and engagement in play through game-control distribution. AI Soc. 2010, 25, 193–201. [Google Scholar] [CrossRef] [Green Version]
  13. Tekin, B.S.; Reeves, S. Ways of spectating: Unravelling spectator participation in Kinect play. In Proceedings of the CHI ’17: CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; pp. 1558–1570. [Google Scholar] [CrossRef] [Green Version]
  14. O’Brien, H.L.; Cairns, P.; Hall, M. A practical approach to measuring user engagement with the refined user engagement scale (UES) and new UES short form. Int. J. Hum. Comput. Stud. 2018, 28–39. [Google Scholar] [CrossRef]
  15. Fredricks, J.A.; Blumenfeld, P.C.; Paris, A.H. School engagement: Potential of the concept, state of the evidence. Rev. Educ. Res. 2004, 74, 59–109. [Google Scholar] [CrossRef] [Green Version]
  16. Latulipe, C.; Carroll, E.A.; Lottridge, D. Love, hate, arousal and engagement. In Proceedings of the CHI ‘11: CHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada, 7–12 May 2011; pp. 1845–1854. [Google Scholar] [CrossRef]
  17. Webster, J.; Ho, H. Audience Engagement in Multimedia Presentations. Data Base Adv. Inf. Syst. 1997, 28, 63–77. [Google Scholar] [CrossRef]
  18. Mayes, D.K.; Cotton, J.E. Measuring Engagement in Video Games: A Questionnaire. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2001. [Google Scholar] [CrossRef]
  19. Brockmyer, J.H.; Fox, C.M.; Curtiss, K.A.; McBroom, E.; Burkhart, K.M.; Pidruzny, J.N. The development of the Game Engagement Questionnaire: A measure of engagement in video game-playing. J. Exp. Soc. Psychol. 2009, 45, 624–634. [Google Scholar] [CrossRef]
  20. O’Brien, H.L.; Toms, E.G. The Development and Evaluation of a Survey to Measure User Engagement. J. Am. Soc. Inf. Sci. 2010, 61, 271–287. [Google Scholar] [CrossRef] [Green Version]
  21. O’Brien, H.L.; Cairns, P. Theoretical Perspectives on User Engagement; Springer: Berlin, Germany, 2016; ISBN 9783319274447. [Google Scholar]
  22. Carlton, J.; Jay, C.; Brown, A.; Keane, J. Inferring user engagement from interaction data. In Proceedings of the CHI EA ’19: Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, Scotland, UK, 4–9 May 2019; pp. 1–6. [Google Scholar] [CrossRef]
  23. Arce-Lopera, C.; Rodríguez, B.; Avendaño, G.; Victoria, D. In store shelf display technology for enhancing customer brand recognition. ACM Int. Conf. Proc. Ser. 2018, 416–420. [Google Scholar] [CrossRef]
  24. Ruan, S.; Jiang, L.; Xu, J.; Tham, B.J.K.; Qiu, Z.; Zhu, Y.; Murnane, E.L.; Brunskill, E.; Landay, J.A. QuizBot: A Dialogue-based Adaptive Learning System for Factual Knowledge. In Proceedings of the CHI ’19: 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, Scotland, UK, 4–9 May 2019; pp. 1–13. [Google Scholar] [CrossRef] [Green Version]
  25. Wiemeyer, J.; Nacke, L.; Moser, C.; ‘Floyd’ Mueller, F. Player Experience. In Serious Games; Dőrner, R., Gőbel, S., Effelsberg, W., Wiemeyer, J., Eds.; Springer: Cham, Switzerland, 2016; pp. 243–271. [Google Scholar] [CrossRef]
  26. Charland, P.; Léger, P.M.; Sénécal, S.; Courtemanche, F.; Mercier, J.; Skelling, Y.; Labonté-Lemoyne, E. Assessing the multiple dimensions of engagement to characterize learning: A neurophysiological perspective. J. Vis. Exp. 2015, 1–8. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  27. Lang, P.J. The Emotion Probe. Am. Psychol. Assoc. 1995, 50, 372–385. [Google Scholar] [CrossRef]
  28. Martey, R.M.; Kenski, K.; Folkestad, J.; Feldman, L.; Gordis, E.; Shaw, A.; Stromer-Galley, J.; Clegg, B.; Zhang, H.; Kaufman, N.; et al. Measuring Game Engagement: Multiple Methods and Construct Complexity. Simul. Gaming 2014, 45, 528–547. [Google Scholar] [CrossRef]
  29. Nacke, L.E. Games user research and physiological game evaluation. In Game User Experience Evaluation; Springer: Cham, Switzerland, 2015; pp. 63–86. [Google Scholar]
  30. Lang, P.J.; Bradley, M.M.; Hamm, A.O. Looking at Pictures: Evaluative, Facial, Visceral, and Behavioral Responses. Psychophysiological Res. 1993, 30, 261–273. [Google Scholar] [CrossRef]
  31. Carpenter, L. Video Imaging Method and Apparatus for Audience Participation. U.S. Patent US5365266A, 15 November 1993. [Google Scholar]
  32. Maynes-Aminzade, D.; Pausch, R.; Seitz, S. Techniques for interactive audience participation. In Proceedings of the 4th IEEE International Conference on Multimodal Interfaces, ICMI 2002, Pittsburgh, PA, USA, 16 October 2002; pp. 15–20. [Google Scholar] [CrossRef]
  33. Natkins, S. Interactivity in games: The player’s engagement. In Interactivity in Games: The Player’s Engagement; IFIP Advances in Information and Communication Technology; Springer: Berlin/Heidelberg, Germany, 2010; pp. 160–168. [Google Scholar]
  34. Pallud, J. Impact of interactive technologies on stimulating learning experiences in a museum. Inf. Manag. 2017, 54, 465–478. [Google Scholar] [CrossRef]
  35. Mandryk, R.L.; Atkins, M.S.; Inkpen, K.M. A continuous and objective evaluation of emotional experience with interactive play environments. In Proceedings of the CHI ’06: SIGCHI Conference on Human Factors in Computing System, Montreal, QC, Canada, 22–23 April 2006; Volume 2, pp. 1027–1036. [Google Scholar]
  36. Passalacqua, M.; Léger, P.M.; Nacke, L.E.; Fredette, M.; Labonté-Lemoyne, É.; Lin, X.; Caprioli, T.; Sénécal, S. Playing in the backstore: Interface gamification increases warehousing workforce engagement. Ind. Manag. Data Syst. 2020. [Google Scholar] [CrossRef]
  37. Nacke, L.E.; Grimshaw, M.N.; Lindley, C.A. More than a feeling: Measurement of sonic user experience and psychophysiology in a first-person shooter game. Interact. Comput. 2010. [Google Scholar] [CrossRef]
  38. Léger, P.M.; Davis, F.D.; Cronan, T.P.; Perret, J. Neurophysiological correlates of cognitive absorption in an enactive training context. Comput. Human Behav. 2014, 34, 273–283. [Google Scholar] [CrossRef]
  39. Batista, D.; da Silva, H.P.; Fred, A.; Moreira, C.; Reis, M.; Ferreira, H.A. Benchmarking of the BITalino biomedical toolkit against an established gold standard. Healthc. Technol. Lett. 2019, 6, 32–36. [Google Scholar] [CrossRef] [PubMed]
  40. O’Brien, H.L.; Toms, E.G. What is User Engagement? A Conceptual Framework for Defining User Engagement with Technology. Int. Rev. Res. Open Distance Learn. 2013, 14, 90–103. [Google Scholar] [CrossRef]
  41. Bradley, M.M.; Greenwald, M.K.; Petry, M.C.; Lang, P.J. Remembering Pictures: Pleasure and Arousal in Memory. J. Exp. Psychol. Learn. Mem. Cogn. 1992, 18, 379–390. [Google Scholar] [CrossRef] [PubMed]
  42. Mehrabian, A.; Wixen, W.J. Preferences for Individual Video Games as a Function of Their Emotional Effects on Players. J. Appl. Soc. Psychol. 1986. [Google Scholar] [CrossRef]
  43. Nunnally, J.C.; Bernstein, I.H. The Assessment of Reliability. Psychom. Theory 1967. [Google Scholar] [CrossRef]
  44. Braithwaite, J.J.; Watson, D.G.; Jones, R.; Rowe, M. A Guide for Analysing Electrodermal Activity (EDA) & Skin Conductance Responses (SCRs) for Psychological Experiments. Psychophysiology 2015, 49, 1017–1034. [Google Scholar]
  45. Baron, R.M.; Kenny, D.A. The Moderator-Mediator Variable Distinction in Social Psychological Research. Conceptual, Strategic, and Statistical Considerations. J. Pers. Soc. Psychol. 1986. [Google Scholar] [CrossRef]
Figure 1. The game GRiD Crowd by Moment Factory (2019) [9].
Figure 2. Voting page of the spectator’s interface with the timer bar below.
Figure 3. Page with the number of voters per choice.
Figure 4. The electrodermal activity (EDA) device. The electrodes on the hand are connected to sensor cables, which are connected to the portable EDA device. A black armband on the arm secures the sensor cable.
Figure 5. A light box placed in the frame of a camera.
Figure 6. Visual representation of our model.
Table 1. Descriptive statistics per group.

Conditions | Perceived Arousal | Physiological Arousal (Standardized EDA) | Engagement
With interactivity | 5.54 | 0.0295 | 3.49
Without interactivity | 4.64 | −0.1262 | 3.31
N | 78 | 12 | 78
p-value | <0.001 | <0.001 | <0.001
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
