Article

Design and Evaluation of a Collaborative Educational Game: BECO Games

1 Electrical and Computer Engineering Department, Universidad Nacional de Educación a Distancia, 28040 Madrid, Spain
2 Department of Computer Science, Universidad de Cadiz, 11001 Cadiz, Spain
3 Department of Methods of Research and Diagnosis in Education II, Universidad Nacional de Educación a Distancia, 28040 Madrid, Spain
4 Department of Economic Theory and Mathematical Economy, Universidad Nacional de Educación a Distancia, 28040 Madrid, Spain
5 Behaviour & Law Foundation, 28040 Madrid, Spain
* Author to whom correspondence should be addressed.
Sustainability 2020, 12(20), 8471; https://doi.org/10.3390/su12208471
Submission received: 19 August 2020 / Revised: 3 October 2020 / Accepted: 10 October 2020 / Published: 14 October 2020
(This article belongs to the Special Issue Technology-Enhanced Learning, Open Science and Global Education)

Abstract

This paper describes the design and validation of a game built on a platform for the easy deployment of collaborative educational games, the BECO Games platform. As an example of its potential, a learning experience for an Economics subject was created through a collaborative game for understanding the concept of common goods. The effectiveness of the game was tested by comparing the performance of Bachelor students who used the platform with that of those who did not (137 vs. 92 students). In addition, we verified that in previous years, when students played the game through forums and an Excel sheet, these differences did not exist. Results indicate that the performance differences between students who participated in the online game and those who did not were greater than in previous years. Finally, a satisfaction survey was delivered to the students to better understand their impressions. This survey assessed student opinion about the platform, the educational experience, and their behavior during the game.

1. Introduction

Serious games stimulate learning through mechanisms that avoid the standard concept of teaching, using games and new forms of interaction [1]. Neuroimaging studies conducted over the past few years have shown how rewards influence dopamine levels and, consequently, the speed at which learning occurs [2,3].
Games include simulations of life with enforced rules, defined roles, and scoring mechanisms for measuring performance [4,5]. The assumption of new roles and risks allows the students to learn new concepts [6].
According to the Horizon Reports [7], educational games encourage critical thinking and teamwork and foster creative problem solving. However, one of the main problems that arises when implementing gamification strategies in educational contexts is the creation and development of games that convey the concepts, skills, and values that we intend to teach [8].
According to Kreps, game theory facilitates the study of behavior in situations in which the actions of one player affect the others [9]. In particular, many of the games created for Economics teaching use citizens’ contributions to universal public services, such as education, health, or highways, to help students understand how cooperative economics works. A citizen can behave as a free-rider, trying to avoid paying taxes while enjoying the service. Other citizens, in contrast, think more about the future of society and pay their taxes to maintain and improve those services.
One of the best examples of this kind of game is the common goods game [10]. It poses a decision between a selfish choice, which maximizes personal gains at the expense of others, and a cooperative choice, which maximizes the overall benefit for all players [11].
This is typically a difficult topic, since “homo economicus” is based on self-interest [12]. Therefore, in the experience of Economics teachers, it is not easy to convince economics students that collaborating is better than cheating. Moreover, the topic is related to other essential issues in Economics. Students must learn that if citizens do not contribute to public goods (by not paying taxes, by trying to cheat other community members, etc.), it is impossible to have good public services. Being a free-rider (avoiding paying taxes as much as possible) is not good citizenship.
A comprehensive analysis of the state of the art of online versions of the common goods game was performed on Google Scholar. We used the terms “online common ‘goods game’ education” without a temporal filter, obtaining 2370 results. Note that the words “goods” and “game” must appear together to filter out unrelated papers. This search also finds papers about the public goods game, which is the same game, and the term “education” narrows the search to papers related to education. After analyzing the 200 most relevant works, we concluded that most articles focus on personality analysis and behavioral economics, such as the work from Mussel and Hewig [13], the Pizzagame from Keil [14], the work of Theodorou [15], or the works from Lopez [16], Ulrich [17], and [18]. However, none of the systems found focused on analyzing improvements in student learning thanks to the use of technology.
For that reason, the authors developed an online version of the common goods game to be evaluated from a learning point of view. Both this game and the supporting platform, BECO Games, were developed by the authors and released as open-source. The game was used across several academic years as an activity of the subject Economic Analysis of Tourism of the Degree in Tourism of a Distance Learning University. The subject is taught through a distance learning, or e-learning, methodology, which conditions the game organization and mechanics: this kind of game could be played in a classroom in real time, but playing a cooperative game with a distance learning methodology, where no synchronous activities are performed due to students’ professional and family commitments, requires more technology and longer execution periods.
This work may be of interest to researchers and teachers of similar subjects or those using this kind of cooperative game. As the platform and the game are open-source, any researcher can download the source code and deploy their own instance of BECO Games with all its functionalities (personality definition, authentication, learning analytics, multiplayer, multi-device, notifications, logs, charts, etc.), and in particular the common goods game. This could help researchers make use of previously validated online games instead of non-digital or rudimentary versions. The fact that this software has been validated from technical, usability, and learning points of view is important to support quality research built on top of it.
This paper analyses the use of this online game in this Economics-related subject. To validate the level of contribution of the developed game, the following research questions are established:
  • RQ1: Does the use of this online game provide any improvement in student performance (assessment marks)?
  • RQ2: Are students satisfied with the game?
The paper is structured in five main parts: an introduction; a game and platform description, which introduces the game; a methodology section, which describes the stages of the study; a results section, which includes a compilation of the data obtained from the student performance analysis, software validation and a student satisfaction analysis; a discussion and conclusions section, which contains a critical analysis of the results, findings, highlights, constraints, benefits, and direction for future research.

2. Platform and Game Description

2.1. Platform Description

The BECO Games platform was created to ease the development of cooperative games. The first game developed on top of it was an online version of the common goods game.
The platform was designed following the teacher’s needs for deploying online, collaborative, and responsive versions of behavioral economics games. The teacher of the subject, an author of this manuscript, was involved in the definition of all the requirements for both the BECO Games platform and the common goods game. The Behavior-Driven Development (BDD) approach [19] was used during the design of the platform to foster collaboration between the software developer and the domain expert (the teacher). In this vein, part of the platform and game behavior was specified as a set of example interactions through statements in natural language.
According to these defined requisites, the platform provides the following functional features:
  • Users (both teachers and students) can register on the platform via a two-step process through a sign-up form and an email verification.
  • Admin users can manage users and organizations.
  • Teachers can create (and play) games and customize game-specific parameters for each game instance.
  • Teachers can invite students to available games (Figure 1). The students can be manually or randomly grouped according to certain filters, such as gender, age, organization, number of games played, and prosocial behavior profile. The groups are anonymous, so the participants do not know the identity of other group members.
  • Users can provide their behavior profile by filling in an adapted version of the Ten-Item Personality Inventory to measure the Big Five personality traits [20], namely agreeableness, conscientiousness, extroversion, openness to experience, and neuroticism. The numerical values are classified into trait levels relative to those provided by the rest of the registered students. The trait levels are then mapped to animal names (lion, tiger, dog, etc.) to facilitate filtering during the user invitation process.
  • Once all the students have responded to the joining invitations (Figure 2), the creator will be able to start the game. However, he/she can launch the game before all invitations have been responded to, as long as at least two participants have joined.
  • Users can play several games asynchronously; games are resolved on creator demand or when certain conditions are reached.
  • The platform provides immediate feedback to the involved users via in-app notifications: every time a user receives an invitation or joins a game, when there are no pending game invitations left to respond to, when the game status changes, or when users perform actions during the game.
  • The game results and the user action logs can be easily viewed and exported in a spreadsheet file to enable further analysis, such as the study that revealed that the students with the lowest levels of the conscientiousness and agreeableness traits adopted an anti-social strategy to obtain the most benefits [20].
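As an illustration of the relative trait classification and animal mapping described above, the sketch below classifies one trait score against the scores of the other registered students; the tercile thresholds and the trait-to-animal mapping are hypothetical, since the article does not specify them.

```python
# Sketch of classifying a trait score relative to the other registered
# students; the tercile thresholds and animal labels are hypothetical.

def classify(score, cohort_scores):
    """Return 'low', 'medium', or 'high' depending on the share of
    cohort scores strictly below the given score."""
    below = sum(1 for s in cohort_scores if s < score)
    fraction = below / len(cohort_scores)
    if fraction < 1 / 3:
        return "low"
    if fraction < 2 / 3:
        return "medium"
    return "high"

# Hypothetical mapping of an extroversion level to an animal name.
EXTROVERSION_ANIMALS = {"low": "dog", "medium": "tiger", "high": "lion"}

cohort = [2.0, 3.5, 4.0, 4.5, 5.5, 6.0]
level = classify(5.8, cohort)
print(level, EXTROVERSION_ANIMALS[level])  # high lion
```

A percentile-based classification like this adapts automatically as more students register, which matches the text’s description of traits being computed relative to the rest of the cohort.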
Several non-functional requirements were taken into account during the development of the platform:
  • Portability. BECO Games is a progressive web application based on web standards so that the games can be played from mobile phones, tablets, laptops, and computers, independently of users’ operating systems.
  • Usability. The platform provides a consistent look and feel by using global fonts, styles, and colors as well as a responsive layout to make sure the app stays easily readable on all devices. Students’ comments collected during the first trials were applied to polish the user interface.
  • Security. An authentication and authorization mechanism based on credentials and roles was included to prevent illegal access to user data. In addition, users can trigger the erasure of their personal data according to the data protection regulation. During the registration process on the platform, users have to give their informed consent to allow researchers to process their activity data. Regarding privacy, only admin users and teachers are allowed to know the real identities of the players.
  • Interoperability. BECO Games supports the eXperience API (xAPI) specification [21], thus the game logs can be sent in the form of noun-verb-object statements to any compatible learning record system, such as Learning Locker. This data transfer is performed on teacher demand via the user interface.
  • Notifications. The platform uses Google SMTP servers to send batches of email messages with app notifications.
  • Maintainability. The architecture of the platform follows the three-layer design pattern, and its base code provides generic functionality in the form of abstract classes which can be extended to integrate new kinds of games.
The system was developed using Java technologies, namely Spring Framework and Vaadin. The platform was released as open-source through a GitHub repository [22], and there is a running instance on an Amazon AWS server [23].

2.2. Game Description

Based on the BECO Games platform, the Common Goods Game [24] was developed. Its development was also based on teacher needs. For that reason, a teacher of Economics was involved in the definition of pre-requisites and validation of the final version of the game.
This is a strategy game in which a player’s performance depends not only on their own behavior but also on their partners’ behavior. The economic topic that students learn is Behavioral Economics: Common Goods.
An incentive of up to one extra point in the total final subject grade is offered to participant students to increase participation. The subject is graded from 0 to 10, so an extra point is an interesting reward for participation in a short experience like this one. Participants will have to individually contribute with a given amount of points (from an initial allowance) to a common fund, and then, this common good will be distributed equally between all the players.
Depending on each player’s strategy, a student starts the game with 0.25 points and can end up with less if he/she is “very supportive” while the rest of the peers are not, or can reach 0.65 points by playing the free-rider strategy well: contributing nothing while the rest contribute their full 0.25 points. Thus, depending on both the student’s and the group’s behavior, the student obtains a better grade or not.
The student should be conscious that their results depend not only on their own strategy but also on the other group members’ strategies. If the student is a free-rider and the other members are also free-riders, then nobody improves (everybody keeps 0.25); if the student is a free-rider, then their result depends on how many of the other group members are “good citizens”; if he/she is a good citizen, their result also depends on the other group members’ strategies. Finally, if everybody contributes their initial endowment to the common good, then all of them double it (0.50 points). Thanks to this game, students learn how public services are funded by putting one extra point of their own final grade at risk.
In essence, students have to learn how to work cooperatively, providing as much as possible to the common good trusting that the peers will act in the same way to get the most in return. However, in the game, they realize that if some student plays as a free-rider, not contributing to the common good, all the participants get less in return.
The economic mechanism is as follows:
  • The student receives 0.25 units to invest in a common good and should decide how much to invest from 0 to 0.25.
  • The student is a member of an anonymous group. Thus, the student does not know the identity of other group members, and they do not know theirs.
  • The contributions to the common good are multiplied by an amount depending on the number of players.
  • At the end of the game, the common good is distributed between the players. The student can finish with more than the initial 0.25 units or with less.
A student’s final result is the sum of two parts: the amount he/she decided not to contribute (0.25 minus their contribution) plus twice the mean of the contributions accumulated in the common good at the end of the game.
The final points obtained in the game, the performance (P), of each participant are computed by the following formula:
P = I − X_i + w · (Σ_{j=1}^{N} X_j) / N,
I being the initial endowment (0.25), Xi the individual contribution (ranging from 0 to 0.25), w the weight factor (2 is the factor to multiply the common good mean), Xj the contribution of each group member, and N the number of participants.
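As a quick sanity check of the formula, the payoffs quoted in the text (0.25, 0.50, and 0.65 points) can be reproduced with a short script; the group size of five players is an illustrative assumption, while I = 0.25 and w = 2 follow the text.

```python
# Sketch of the common goods payoff: P = I - X_i + w * mean(contributions).
# Group size N = 5 is an illustrative assumption; I = 0.25 and w = 2
# are the values used in the article.

def performance(i, contributions, initial=0.25, weight=2):
    """Payoff of player i given every player's contribution."""
    mean = sum(contributions) / len(contributions)
    return initial - contributions[i] + weight * mean

# Everybody contributes the full endowment: each player doubles it to 0.50.
all_in = [0.25] * 5
print(round(performance(0, all_in), 2))  # 0.5

# One free-rider among full contributors reaches the 0.65 maximum.
free_rider = [0.0, 0.25, 0.25, 0.25, 0.25]
print(round(performance(0, free_rider), 2))  # 0.65

# Nobody contributes: everyone keeps the initial 0.25.
nobody = [0.0] * 5
print(round(performance(0, nobody), 2))  # 0.25
```

Note that the free-rider maximum depends on the group size: with five players, the free-rider earns 0.25 + 2 × (4 × 0.25)/5 = 0.65, matching the scenario in the platform’s test specifications.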
The game mechanics are as follows:
  • Teachers and students must register on the platform and, optionally, fill in their behavior profile.
  • Teachers must create a new game and define the game parameters: initial allowance (I) assigned to the participants and the weight factor (w). In addition, they can decide if the platform should automatically resolve the game when all the players have invested and if the participants will be authorized to see the investments and benefits obtained by the other partners.
  • Teachers must invite students, according to their own criteria. The students will receive invitations by email to join the game, which have to be accepted or rejected.
  • Once there are enough players to play, the teacher can start the game, and consequently, the participants will have to decide how much to invest (up to 0.25) in the common goods (Figure 3, left).
Once the game is resolved, every student is able to check their individual benefit, their position in the ranking of user benefits for that game, and the group behavior. For that, a bar chart (Figure 3, right) shows the user investment, the partners’ average investment, the user performance, and the partners’ average performance for different scenarios: (i) the current one, (ii) the user was the only contributor in the group, (iii) the user acted as a free-rider who contributed nothing whereas everyone else contributed the maximum, and (iv) everyone contributed the totality of their allowance. This feedback is very important to help students consolidate knowledge.
Students started playing the game four weeks before the exam, taking into account that they played two rounds. All the students were grouped into a single group. The reason behind one big group is that large groups make it easier for free-rider students to “hide” and contribute nothing with a low impact on the global result, whereas in a small group each individual contribution is significant, so the student knows that the final result will clearly depend on his/her contribution.
Students played the first round and, after learning the result, played the second round. They had a maximum of two weeks to play each round. The reason for this long period of time is that, as previously mentioned, the game was designed to be played in a distance learning environment, where most students have family and professional obligations that require maximum flexibility with educational deadlines.
Regarding the learning objectives, students should learn at least two important things as economists and as future professionals in the tourism sector: first, what a common good is and how one should contribute to it (for instance, you must pay taxes if you want good public services such as highways, public education, or public health); second, that you must think about your strategy before taking part in a game. If you look only after your self-interest, your results may be worse than if you collaborate.

2.3. Support for Learning Analytics

Learning analytics are aimed at understanding and optimizing learning and the environments in which it occurs by conducting a set of activities to measure, collect, analyze, and report data about learners and their contexts [25]. In this sense, BECO Games provides a data source suitable for these purposes.
On the one hand, the teacher can export game results in a tabular format containing the following data fields: ranking, username, user id, the amount invested, and the amount gained. This data can be analyzed with any spreadsheet tool, enabling further studies, such as the one which revealed that the students with the lowest levels of the conscientiousness and agreeableness traits adopted an anti-social strategy to obtain the most benefits [20]. In addition, BECO Games enables users to export detailed game logs to spreadsheets with the following fields: timestamp, user id, username, action type, and data. The action-type field collects the different user events that happen during the game: status change events (Created, Started, Paused, Resumed, Resolved, and Stopped), invitation events (Sent_Invitation, Received_Invitation, Accepted_Invitation, Rejected_Invitation, and Removed), and game-specific events (Invested and Gained). The data field contains event-dependent information, such as the game parameters for the “Start” event and the user performance for the “Gained” event. Since data are exported in a simple event log format, it is possible to use process mining techniques [26] for analyzing other factors, such as user waiting and reaction times.
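As an illustration of the kind of analysis the exported event log enables, the sketch below computes each user’s reaction time between receiving and accepting an invitation. Only the field names (timestamp, user id, username, action type, data) come from the text; the log rows and timestamp format are hypothetical.

```python
from datetime import datetime

# Hypothetical rows in the exported log format described above:
# (timestamp, user id, username, action type, data).
log = [
    ("2019-03-01 10:00:00", 7, "student7", "Received_Invitation", ""),
    ("2019-03-01 18:30:00", 7, "student7", "Accepted_Invitation", ""),
    ("2019-03-02 09:00:00", 8, "student8", "Received_Invitation", ""),
    ("2019-03-03 09:00:00", 8, "student8", "Accepted_Invitation", ""),
]

def reaction_times(rows):
    """Hours elapsed between each user's invitation and acceptance."""
    received = {}
    times = {}
    for ts, uid, _name, action, _data in rows:
        moment = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
        if action == "Received_Invitation":
            received[uid] = moment
        elif action == "Accepted_Invitation" and uid in received:
            times[uid] = (moment - received[uid]).total_seconds() / 3600
    return times

print(reaction_times(log))  # {7: 8.5, 8: 24.0}
```

The same pairing of status-change events (e.g., Started to Invested) would give the waiting and reaction times mentioned as candidates for process mining techniques [26].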
On the other hand, game data could be combined with user data collected from other kinds of sources, such as learning management systems, for conducting more extensive studies. With that aim, BECO Games supports the xAPI eLearning standard, facilitating the creation of data integration solutions focused on learners’ digital experiences. In addition, to maximize interoperability, the detailed log of each game is exported using an xAPI profile [27] which was specifically designed for serious games. Listing 1 shows an example of the xAPI statements corresponding to the results of a given student.
		"actor": {
		"mbox": "mailto:[email protected]",
		"name": "student123",
		"objectType": "Agent"
		},
		"verb": {
		"id": "http://adlnet.gov/expapi/verbs/completed"
		},
		"object": {
		"definition": {
		"type": "https://w3id.org/xapi/seriousgames/activity-types/serious-game"
		},
		"id": "http://purl.org/becogames/TheCommonGoods/44",
		"objectType": "Activity"
		 },
		"result": {
		"response": "0.15",
		"score": {
		"max": 0.65,
		"min": 0.10,
		"raw": 0.45
		}}
            
Listing 1. An xAPI statement describing that a given user (actor) completed (verb) the game (activity) with a certain result.

3. Materials and Methods

To analyze the effectiveness of the game and student satisfaction with its use, the following process was established:
  • Analysis of software validity. For this task a testing plan was conducted in two stages:
    • Manual and JUnit-based automated tests, which allowed us to check that all the possible game scenarios, ranging from poor or null investments to the highest ones, and the user invitation system were correctly managed. Additionally, they checked whether non-functional requirements were properly implemented, such as the authentication and authorization mechanisms, concurrency control, communication with external mail servers, asynchronous web notifications, and the responsive interface.
    • Internal validation. This was carried out with a group of five teachers specialized in behavioral sciences, who offered personal feedback on the improvements to be made to the game. These participants played two rounds of the game and provided written feedback regarding graphical appeal, usability, and game mechanics. This feedback was taken into account in the final improvement stage of the development.
  • Evaluation of the effectiveness of the intervention. Validation is an imperative step when creating a new game to check whether it meets its objectives. This is especially relevant for educational online games that intend to be beneficial to the players, whether to practice, learn, or change their perceptions [28,29,30]. In this evaluation, performance differences between the students who played the game and those who did not were analyzed. To guarantee that the differences observed were due to the designed application and not to other possible variables (for example, a greater motivation of students who decide to participate in the game, the content of the game itself, etc.), the results obtained by the students in the academic year 2018/2019 were compared with those of the two previous academic years. In both academic years 2016/2017 and 2017/2018, students played the game through a rudimentary activity, in contrast with 2018/2019, when they played the online BECO Games version. In the rudimentary format, the teacher posted the game instructions in a forum of the virtual course; students then sent their contributions to the common goods through forum messages; finally, the teacher manually collected all the contributions in an Excel file, calculated the results, and sent them back to the students through the forum.
    The hypothesis to be tested, therefore, is whether the performance of students who participated in the game was higher than that of those who did not, and whether these differences are larger in the academic year in which the new application was deployed, since this was the only instructional design element that changed.
  • Analysis of satisfaction with the game. Currently, the most accepted method to investigate students’/players’ perception of serious games is the use of external questionnaires [31]. Therefore, after participating, students received a questionnaire to assess their satisfaction with the game.

3.1. Sample

The sample of the study was composed of the students of the Tourism Degree enrolled in the subject Economic Analysis of Tourism during the academic years 2016/2017, 2017/2018, and 2018/2019. The researchers recorded their scores on the final exam and their participation in the game (Table 1). It should be noted that, in all academic years, the percentage of students who participated in the game was around 60%.
Regarding the satisfaction survey, the instrument was completed by 33 of the 137 students who used the app during the 2018/2019 academic year. Students received an email inviting them to answer an anonymized questionnaire. They were informed of the purpose of the study, and its completion was optional without extra incentives.

3.2. Instruments

Instruments applied for the collection of the information were a test (final exam) aimed at evaluating the level of knowledge in the subject and a questionnaire of satisfaction with the game.
The students played the game before the exam, knew the results of the game before taking the final exam, and knew that those results would be added to their final exam grades.
The final exam contained quiz questions and problems related to economic behavior, common goods, and other Economic Theory topics. The results of the game were added to the final exam score to obtain the total grade. The game was therefore important, since the final score partly depended on its results (up to around 1 point out of the 10 of the final grade).
The satisfaction questionnaire was based on the Technology Acceptance Model (TAM). The TAM allows analyzing users’ acceptance of a new technological innovation [31], and it considers the perceived usefulness and perceived ease of use as key factors when users decide on the adoption of a new technology [32]. The questionnaire consisted of 26 six-point Likert-type questions (1: strongly disagree, 6: strongly agree) covering the following dimensions: 5 questions referred to satisfaction with the game, 14 to satisfaction with the educational experience, and 7 items were aimed at evaluating the behavior of the students during the game. However, taking into account the purpose of this work, only the questions related to perceived satisfaction with the game and with the educational experience were considered. The final instrument contained the following items (Table 2):

3.3. Analysis

In order to analyze the differences in performance between students who participated in the game and those who did not, Student’s t-test was applied. As a previous step, it was confirmed that the grades in the final exam were normally distributed in all academic years: 2016/2017 (Kolmogorov–Smirnov: 0.037; p = 0.772), 2017/2018 (Kolmogorov–Smirnov: 0.038; p = 0.878), and 2018/2019 (Kolmogorov–Smirnov: 0.041; p = 0.881). The magnitudes of the differences were estimated by means of Cohen’s d [32].
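For readers who want to reproduce the effect-size estimate, Cohen’s d with a pooled standard deviation can be computed in a few lines; the exam marks below are invented for illustration and are not the study’s data.

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d with a pooled standard deviation for two
    independent samples (sample variances, n - 1 denominator)."""
    na, nb = len(group_a), len(group_b)
    pooled_var = ((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

# Invented exam marks for players vs. non-players, for illustration only.
players = [6.1, 7.0, 6.5, 7.4, 6.8]
non_players = [5.5, 6.2, 5.9, 6.6, 5.8]
print(round(cohens_d(players, non_players), 2))  # 1.66
```

By the usual rule of thumb, values around 0.2 are small, 0.5 moderate, and 0.8 large, which is the scale against which the effect sizes in Section 4.2 are interpreted.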
The analysis of the satisfaction questionnaire was carried out from a descriptive approach that reports the distribution of responses to the different questions stated.

4. Results

4.1. Software Validity

As previously mentioned, the software validation was performed in two stages: manual and automated tests, and internal validation with five experts. Functional requirements were tested first, to verify that the system behaves as expected. Thanks to the BDD approach, the high-level requirement specifications are not only readable by end-users but also machine-executable, which enabled us to verify whether the current system implementation provides the desired behavior. Some of the platform specifications (using the Gherkin syntax [19]) devised for BECO Games are shown in Listing 2.
Feature: Start game
  As a teacher, I want to start a game with my students
  Scenario: Nobody accepts the joining invitation
    Given I created a new game with an initial allowance of 0.25 and a weight of 2
    And I have sent game invitations to 5 users
    But 0 users accepted to play
    When I select the option to start the game
    Then I got an error message with the text "NOT ENOUGH USERS"
	
Feature: Resolve game
  As a teacher, I want to resolve the game
  Scenario: One user acts as a free rider, that is one user invests nothing and the rest invests at maximum
    Given I created a new game with an initial allowance of 0.25 and a weight of 2
    And I have sent game invitations to 5 users
    And 5 users accepted to play
    When I select the option to start the game
    And user 1 invested 0.0
    And user 2 invested 0.25
    And user 3 invested 0.25
    And user 4 invested 0.25
    And user 5 invested 0.25
    And I select the option to resolve the game
    Then the user 1 gains 0.65
            
Listing 2. A snippet of runnable feature specifications.
The features are provided with test scenarios composed of a set of steps defining the initial context (given), the event (when), and the expected outcome (then). These natural language sentences are then mapped to Java test methods by using the Cucumber framework and eventually run by JUnit. The remaining test specifications and their results are available on the GitHub repository.
In addition to the above, we created a set of additional tests directly as Java methods. Figure 4 shows a snapshot of the Eclipse JUnit plug-in showing the success of the conducted tests.
Besides the Cucumber and JUnit tests, we conducted a code static analysis to check the maintainability degree of the software by using the SonarQube platform. The results [33] show that the current codebase of BECO Games had a technical debt under 3% and an A score, according to the SCALE method [34], which reflects a good starting point to further evolve platform features.
Finally, we checked that non-functional requirements were properly implemented, such as the authentication and authorization mechanisms, concurrency control, communication with external mail servers, asynchronous web notifications, and the responsive interface. To this end, manual tests were conducted in several mobile and desktop browsers.
Once the above validation was complete, ensuring the correct operation of the game, a further internal validation stage with experts was carried out to minimize problems in the game process. Five experts played the game for one round and offered personal feedback on possible improvements. This stage allowed us to correct the following problems:
  • User registration now accepts names with special characters, such as the letter “ñ”.
  • The downloadable instructions for participants were simplified, and confusing paragraphs were rewritten.
  • The download link for the instructions was moved to a more easily identifiable place.
After making these changes, the experts were invited to play a new round of the game, in which no further problems were found.

4.2. Evaluation of the Effectiveness of the Intervention

Figure 5 shows the average performance of students on the final exam over the three academic years. As can be seen, in all cases, the performance of the students who participated in the game is higher than that of those who did not.
However, the analysis of the significance of these observed differences (Table 3) shows that they are statistically significant only in the 2018/2019 academic year (−3.089; p = 0.002).
These results are consistent with the estimated effect sizes. The Cohen’s d values calculated for the academic years 2016/2017 and 2017/2018 indicate small effects, while in the academic year 2018/2019 a moderate effect size is observed.
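For reference, Cohen's d for two independent groups is the difference of means divided by the pooled standard deviation. The following sketch plugs in the 2018/2019 figures from Table 3 (the pooling formula shown is the textbook one and may not match the authors' exact procedure) and lands very close to the reported value of 0.457:

```java
public class EffectSize {

    /** Cohen's d using the pooled standard deviation of two independent samples. */
    static double cohensD(double mean1, double sd1, int n1,
                          double mean2, double sd2, int n2) {
        double pooledVar = ((n1 - 1) * sd1 * sd1 + (n2 - 1) * sd2 * sd2)
                / (n1 + n2 - 2);
        return (mean2 - mean1) / Math.sqrt(pooledVar);
    }

    public static void main(String[] args) {
        // 2018/2019 data from Table 3: non-players (n = 92) vs players (n = 137).
        double d = cohensD(3711, 2475, 92, 4775, 2235, 137);
        System.out.printf("Cohen's d = %.3f%n", d);  // approx. 0.456
    }
}
```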

4.3. Satisfaction Analysis

In general terms, students value the game positively. The aspects of the game with the highest scores were: ease of playing, being able to play at any time, and clarity of instructions. In these three cases, more than 75% of the students that answered the survey were quite (5) or totally satisfied (6), as can be seen in Figure 6.
On the contrary, the lowest-rated aspect was the user interface, although even in this case 66% of the students who participated in the survey rated it positively (values 4, 5, and 6).
Regarding the satisfaction of the students with the educational experience (see Figure 7), the majority of the students answering the survey indicated that they enjoyed doing this activity (more than 75% of the subjects were quite or totally in agreement with this statement).
This is consistent with their positive assessment of having more similar experiences, both in this subject and in other subjects of the degree, and with their satisfaction with the experience (approximately 72% of the respondents gave values of 5 or 6). At the lower end, although still with a high degree of agreement (between 66% and 73% of the respondents gave positive values of 4, 5, or 6), are the perceptions that the game allowed them to clarify the theoretical concepts of the subject and that it improved their understanding of the contents under study.

5. Discussion and Conclusions

After the analysis of the state of the art, we can conclude that no cooperative digital game with educational purposes has been found for the topic of common goods. Thus, the game presented in this work represents a step forward in the state of the art: students who volunteered to play the game performed better in the exam, and students’ perceived satisfaction was very high. In addition, releasing both the platform and the game source code as open source opens the door for many researchers, such as those already working with this kind of game (e.g., Mussel [13], Keil [14], Theodorou [15], or Ulrich [17]), to conduct their own studies on validated software with many functionalities already implemented (e.g., personality definition, authentication, authorization, concurrency, multi-user and multi-device support, learning analytics with xAPI, charts, logs, and notifications).
Regarding the analysis of student performance, threats to validity were taken into account. To maximize the internal validity and construct validity, we maintained a detailed statistical protocol for the analysis of student satisfaction and the validation of student performance. Furthermore, every course edition was led by the same instructor (who is also one of the authors of this work), teaching the same contents and assessing with the same evaluation criteria.
From the analysis of student performance, taking into account not only the significance of the differences but also their magnitude, it was observed that in the academic years 2016/2017 and 2017/2018 the effect sizes (Cohen’s d) were around 0.20 and, consequently, the differences in performance can be considered small.
However, in the 2018/2019 academic year, the effect size was 0.457, meaning that the average performance of the students who participated in the game was almost half a standard deviation above that of the students who did not participate; these differences can therefore be considered moderate.
From these findings, we can conclude that the tool increased the performance of the participating students to a greater extent than when the game was delivered through more traditional methods, whether paper-based or digital but less engaging.
On the other hand, student satisfaction with the online game is, in general terms, very high. Students ask for similar tools in other subjects, as they increase their motivation.
Regarding limitations, our platform is designed only for short cooperative multiplayer games; single-player games are not supported.
As future work, we will deploy new collaborative online games based on the BECO platform. The software was designed in such a way that new similar games can easily be created just by changing the game mechanics.
Another limitation of the study is that it was not possible to establish an experimental research design to ensure that the higher performance of students participating in the game is due only to their participation and not to other possible variables such as the motivation or the commitment to learning. Similarly, additional studies are required to analyze whether the increase in differences observed during the 2018/2019 academic year is due exclusively to the platform used. Although the findings point in this direction, new research is required to provide empirical evidence on the significant effect of this tool on student learning and to overcome the limitations of internal and external validity present in this study.
In successive studies, strategies should also be adopted to increase the sample of students who respond to the satisfaction questionnaire. Since the questionnaire is optional, the answers may be conditioned by the respondents’ reasons for participating; a larger sample would reduce the likelihood that only the students most (or least) satisfied with their participation in the game responded.
We are aware of the importance of gathering more information on the effectiveness of these platforms and on student satisfaction, not only in the teaching of economics but also in the learning of other subjects. Nevertheless, the results achieved encourage further development of educational games on the BECO Games platform, since its characteristics and functionalities make it an ideal tool to be adapted to online or blended teaching.

Author Contributions

Conceptualization, J.L.C., I.R.-R., and S.M.; methodology, E.L.-M. and S.M.; software, I.R.-R.; validation, J.L.C. and R.L.; formal analysis, E.L.-M.; investigation, S.M.; writing—original draft preparation, S.M. and E.L.-M.; writing—review and editing, J.L.C., R.L., and I.R.-R. All authors have read and agreed to the published version of the manuscript.

Funding

This work was co-funded by the Madrid Regional Government, through the project e-Madrid-CM (S2018/TCS-4307) with structural funds (FSE and FEDER), and the Spanish National Research Agency (AEI), through the project VISAIGLE (TIN2017-85797-R) with ERDF funds. The authors also acknowledge the support from the Industrial Engineering School of UNED with the 2020-IEQ13 grant.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Martin, S.; Lopez-Martin, E.; Lopez-Rey, A.; Cubillo, J.; Moreno-Pulido, A.; Castro, M. Analysis of New Technology Trends in Education: 2010–2015. IEEE Access 2018, 6, 36840–36848. [Google Scholar] [CrossRef]
  2. Wilkinson, P. A Brief History of Serious Games. Lect. Notes Comput. Sci. 2016, 17–41. [Google Scholar] [CrossRef]
  3. Shohamy, D.; Adcock, R.A. Dopamine and adaptive memory. Trends Cogn. Sci. 2010, 14, 464–472. [Google Scholar] [CrossRef] [PubMed]
  4. Caillois, R. Man, Play, and Games; Free Press: New York, NY, USA, 1961. [Google Scholar]
  5. Epstein, J. Bacon Brains: Video Games for Teaching the Science of Addiction; Missouri Institute of Mental Health: St. Louis, MO, USA, 2015. [Google Scholar]
  6. Garris, R.; Ahlers, R.; Driskell, J. Games, motivation, and learning: A research and practice model. Simul. Gaming 2002, 33, 441–467. [Google Scholar] [CrossRef]
  7. Johnson, L.; Adams, S.; Estrada, V.; Martín, S. Technology Outlook for STEM+ Education 2013–2018: An NMC Horizon Project Sector Analysis; The New Media Consortium: Austin, TX, USA, 2013. [Google Scholar]
  8. Martin, S.; Lopez-Martin, E.; Moreno-Pulido, A.; Meier, R.; Castro, M. A Comparative Analysis of Worldwide Trends in the Use of Information and Communications Technology in Engineering Education. IEEE Access 2019, 7, 113161–113170. [Google Scholar] [CrossRef]
  9. Kreps, D.M. Game Theory and Economic Modelling; Oxford U Press: Oxford, UK, 1990. [Google Scholar]
  10. Kurzban, R.; Houser, D. Individual differences in cooperation in a circular public goods game. Eur. J. Pers. 2001, 15, S37–S52. [Google Scholar] [CrossRef]
  11. Arnab, S.; Lim, T.; Carvalho, M.B.; Bellotti, F.; de Freitas, S.; Louchart, S.; Suttie, N.; Berta, R.; De Gloria, A. Mapping learning and game mechanics. Br. J. Educ. Technol. 2015, 46, 391–411. [Google Scholar] [CrossRef] [Green Version]
  12. Dawes, R.M. Social dilemmas. Annu. Rev. Psychol. 1980, 31, 169–193. [Google Scholar] [CrossRef]
  13. Mussel, P.; Hewig, J. The life and times of individuals scoring high and low on dispositional greed. J. Res. Personal. 2016, 64, 52–60. [Google Scholar] [CrossRef]
  14. Keil, J.; Michel, A.; Sticca, F.; Leipold, K.; Klein, A.M.; Sierau, S.; Klitzing, K.; White, L.O. The Pizzagame: A virtual public goods game to assess cooperative behavior in children and adolescents. Behav. Res. 2017, 49, 1432–1443. [Google Scholar] [CrossRef] [Green Version]
  15. Theodorou, A.; Bandt-Law, B.; Bryson, J.J. The Sustainability Game: AI Technology as an Intervention for Public Understanding of Cooperative Investment. In Proceedings of the 2019 IEEE Conference on Games (CoG), London, UK, 20–23 August 2019; pp. 1–4. [Google Scholar] [CrossRef]
  16. López, R.; Calvo, J.L.; Ruiz, I.; Martin, S. Are people with high psychoticism the true homo economicus? Personality as a variable to be taken into account in behavioral economics. Int. J. Game Theory 2020, 38. [Google Scholar]
  17. Ulrich, J.F. Cooperative strategies outside the laboratory—Evidence from a long-term large-N-study in five countries. Evol. Hum. Behav. 2017, 38, 109–116. [Google Scholar]
  18. Fischbacher, R.; Gächter, S.; Fehr, E. Are people conditionally cooperative? Evidence from a public goods experiment. Econ. Lett. 2001, 71, 397–404. [Google Scholar] [CrossRef] [Green Version]
  19. Wynne, M.; Hellesoy, A.; Tooke, S. The Cucumber Book: Behaviour-Driven Development for Testers and Developers; Pragmatic Bookshelf: Raleigh, NC, USA, 2017. [Google Scholar]
  20. Gosling, S.D.; Johnson, J.A. Advanced Methods for Behavioral Research on the Internet; American Psychological Association: Washington, DC, USA, 2010. [Google Scholar]
  21. Kevan, J.M.; Ryan, P.R. Experience API: Flexible, decentralized and activity-centric data collection. Technol. Knowl. Learn. 2016, 21, 143–149. [Google Scholar] [CrossRef]
  22. Ruiz, I. BECO Games Source Code. 2019. Available online: https://github.com/ruizrube/becogames (accessed on 12 October 2020).
  23. Ruiz, I. BECO Games Demo Instance. 2019. Available online: http://purl.org/becogames (accessed on 12 October 2020).
  24. Ruiz, I. Common Goods Game Video-Demo Instance. 2020. Available online: https://youtu.be/RTV7MPMGrQk (accessed on 12 October 2020).
  25. Ferguson, R. Learning analytics: Drivers, developments and challenges. Int. J. Technol. Enhanc. Learn. (IJTEL) 2012, 4, 304–317. [Google Scholar] [CrossRef]
  26. Bogarín, A.; Cerezo, R.; Romero, C. A survey on educational process mining. Wiley Interdiscip. Rev. Data Min. Knowl. Discov. 2018, 8, e1230. [Google Scholar] [CrossRef] [Green Version]
  27. Serrano-Laguna, Á.; Martínez-Ortiz, I.; Haag, J.; Regan, D.; Johnson, A.; Fernández-Manjón, B. Applying standards to systematize learning analytics in serious games. Comput. Stand. Interfaces 2017, 50, 116–123. [Google Scholar] [CrossRef] [Green Version]
  28. Calderón, A.; Ruiz, M. A systematic literature review on serious games evaluation: An application to software project management. Comput. Educ. 2015, 87, 396–422. [Google Scholar] [CrossRef]
  29. Gamo, J. Assessing a Virtual Laboratory in Optics as a Complement to On-Site Teaching. IEEE Trans. Educ. 2019, 62, 119–126. [Google Scholar] [CrossRef]
  30. Pérez-Colado, I.J.; Calvo-Morata, A.; Alonso-Fernández, C.; Freire, M.; Martínez-Ortiz, I.; Fernández-Manjón, B. Simva: Simplifying the Scientific Validation of Serious Games. In Proceedings of the 2019 IEEE 19th International Conference on Advanced Learning Technologies (ICALT), Maceió, Brazil, 15–18 July 2019; pp. 113–115. [Google Scholar]
  31. Davis, F.D. User acceptance of information technology: System characteristics user perceptions and behavioral impacts. Int. J. Man Mach. Stud. 1993, 38, 475–487. [Google Scholar] [CrossRef] [Green Version]
  32. Cohen, J. Statistical Power Analysis for the Behavioral Sciences, 2nd ed.; Hillsdale, N.J., Ed.; Lawrence Erlbaum Associates Publishers: Mahwah, NJ, USA, 1988. [Google Scholar]
  33. SonarQube. BECO Games Quality Dashboard. 2020. Available online: http://vedilsanalytics.uca.es/sonarqube/dashboard/index/2009 (accessed on 12 October 2020).
  34. Letouzey, J.L.; Ilkiewicz, M. Managing technical debt with the sqale method. IEEE Softw. 2012, 29, 44–51. [Google Scholar] [CrossRef]
Figure 1. Screen of the BECO Games platform to send joining invitations to the registered users.
Figure 2. BECO Games invitation screen (student).
Figure 3. Game view for register investments (left). Screen showing game results according to different scenarios (right).
Figure 4. Snapshot of the Eclipse JUnit plug-in showing the success of the conducted tests.
Figure 5. Performance on the final exam in the different academic courses indicating students who participated in the game and those who did not.
Figure 6. Satisfaction with the game.
Figure 7. Satisfaction with the educational experience.
Table 1. Sample size.
Participated in the Game | 2016/2017 Academic Year | 2017/2018 Academic Year | 2018/2019 Academic Year
No | 131 (41.85%) | 131 (41.46%) | 92 (40.17%)
Yes | 182 (58.15%) | 185 (58.54%) | 137 (59.83%)
Table 2. Technology Acceptance Model (TAM) survey.
If you played, please rate the following items from 1 (totally disagree) to 6 (totally agree):
  • The topic is interesting to me.
  • The user interface is attractive.
  • The game instructions are easy to follow.
  • The game is easy to play.
  • The game flexibility allows me to play whenever I want.
If you played, please rate the following items about the educational experience from 1 (totally disagree) to 6 (totally agree):
  • I think participating in this game has been a good opportunity.
  • The game has allowed me to self-assess my knowledge.
  • The game has motivated me with the study.
  • The game has motivated me with the subject.
  • I have enjoyed doing this activity.
  • I would like collaborative games to be more present in my studies.
  • I am satisfied with the experience.
  • I would like to have more experiences like this in this subject.
  • I would like to have more experiences like this in other subjects.
  • Seeing my position in the ranking stimulates me.
  • Reviewing the summary of the game with what I invested and what I earned helps me to better understand the concepts of the subject.
  • The game has increased my interest in this field.
  • The game has helped me to improve the understanding of the contents of the subject.
  • The theoretical concepts of the subject are clarified thanks to the game.
If you played, please rate the following items about your behavior from 1 (totally disagree) to 6 (totally agree):
  • I think I’m good at this game.
  • I was wishing that the game would start so I could play.
  • Thinking about the contribution has made me hesitate.
  • Thinking about the contribution has made me feel anxious.
  • I think I played well compared to the rest of my teammates.
  • I think my personality has influenced my contribution.
  • I have found this experience fun.
Table 3. Comparison of students’ performance according to their participation in the game.
Academic Year | Participated in the Game | Performance (Mean and S.D.) | Levene’s Test | Student’s t-Test | Cohen’s d
2016/2017 | No | 4489 (2715) | 21.345 (p = 0.000) | −1.936 (p = 0.541) | 0.232
2016/2017 | Yes | 5032 (2048) | | |
2017/2018 | No | 4375 (2368) | 0.201 (p = 0.654) | −1.478 (p = 0.141) | 0.199
2017/2018 | Yes | 4829 (2220) | | |
2018/2019 | No | 3711 (2475) | 2.959 (p = 0.087) | −3.089 (p = 0.002) | 0.457
2018/2019 | Yes | 4775 (2235) | | |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Martin, S.; Ruiz-Rube, I.; López-Martín, E.; Calvo, J.L.; Lopez, R. Design and Evaluation of a Collaborative Educational Game: BECO Games. Sustainability 2020, 12, 8471. https://0-doi-org.brum.beds.ac.uk/10.3390/su12208471


