Article

Differential Efficacy of an Intelligent Tutoring System for University Students: A Case Study with Learning Disabilities

by
Rebeca Cerezo
1,
Maria Esteban
1,*,
Guillermo Vallejo
1,
Miguel Sanchez-Santillan
2 and
Jose Carlos Nuñez
1
1
Psychology Department, University of Oviedo, 33003 Oviedo, Spain
2
Computer Science Department, University of Oviedo, 33003 Oviedo, Spain
*
Author to whom correspondence should be addressed.
Sustainability 2020, 12(21), 9184; https://0-doi-org.brum.beds.ac.uk/10.3390/su12219184
Submission received: 23 September 2020 / Revised: 2 November 2020 / Accepted: 2 November 2020 / Published: 4 November 2020
(This article belongs to the Special Issue Academic Motivation, Performance and Student Well-Being)

Abstract:
Computer-Based Learning Environments (CBLEs) have emerged as an almost limitless source of education, challenging not only students but also education providers; teaching and learning in these virtual environments requires greater self-regulation of learning. More research is needed to assess how self-regulated learning strategies can contribute to better performance. This study reports how an Intelligent Tutoring System can help students both with and without learning difficulties to self-regulate their learning processes. A total of 119 university students with and without learning difficulties took part in an educational experiment; they spent 90 min learning in a CBLE specifically designed to assess and promote self-regulated learning strategies. Results show that, as a consequence of the training, the experimental group applied more self-regulation strategies than the control group, not only in response to system prompts but also on their own initiative. In addition, there were some differences in how learning processes improved between students with and without learning difficulties. Our results show that when students with learning difficulties have tools that facilitate applying self-regulated learning strategies, they do so even more than students without learning difficulties.

1. Introduction

Computer-Based Learning Environments (CBLEs) have emerged as a ubiquitous source of education, able to overcome the spatiotemporal constraints of classroom education [1], and institutions of higher education have incorporated CBLEs as a means of expanding their activity [2]. In the beginning, virtual university campuses complemented institutional activities, but in recent years, they have become a core component of university work [3]. Nowadays, every university has an e-campus, not only to complement its educational activity, but also to manage relationships within the academic community and to serve as a resource for research [4]. Digital literacy has therefore gone from being a helpful skill to being a compulsory requirement for effective participation in higher education [5].
Within this context, diverse learning experiences take place, presenting challenges not only to students, but also to education providers [6]. CBLEs encompass all those applications and services, based on Advanced Learning Technologies, that use computers as cognitive tools and technology-rich learning environments to facilitate the learning process [7]. However, learning in a CBLE places a great deal of responsibility on the student. In general, students have to decide when, what, and how they learn, and for how long, because they are often asked to complete learning tasks with little or no support. In other words, CBLEs require students to make additional effort to self-regulate their learning [8,9,10,11]. There is abundant empirical evidence suggesting that learners do not successfully adapt their behavior to the self-regulatory demands of CBLEs [3,12,13,14], and that students lacking self-regulation skills might experience cognitive overload, usability problems, and distractions, potentially resulting in lower learning gains [15]. In this scenario, little is known about the specific difficulties that Students with Learning Disabilities (SLDs) face in learning processes involving CBLEs, and there are no specific interventions to reduce the effect of these self-regulatory requirements [16], even though those disabilities share some commonality in terms of metacognitive and self-regulatory malfunctioning [17,18,19,20]. This is worth addressing because of the increasing number of SLDs accessing post-compulsory education (vocational training, university, etc.), and specifically higher education [21,22,23]. It is essential to recognize that if self-regulation is crucial for learning in virtual learning environments for typical students, it is even more important for students who have some kind of learning disability [24,25].
Research has shown that although SLDs might overcome some of their difficulties, most continue to manifest behaviors characteristic of learning difficulties as adults [26].
In this regard, the pedagogical design of virtual learning environments plays a key role in student performance, but as the European Commission has noted, not all of these environments are properly designed [27]. Research plays a key role in producing standards and guidelines to design effective CBLEs and theoretically grounded and empirically based aids [28].
Within this context, research on self-regulation during learning is one of the most notable trends in the field, together with work on the CBLE design features that can support self-regulated learning processes [15,28,29]; however, this research has largely left SLDs aside.
Research findings highlight the fact that metacognitive and self-regulation skills can be developed by using embedded scaffolding in virtual environments, helping students to maintain motivated and self-regulated behavior [3,30,31]. In this sense, one current line of study is focused on the learning process that takes place in virtual environments where some kind of scaffolding to improve self-regulated learning (SRL) behavior is provided.
Scaffolding consists of providing students with specific guidelines that, acting in their zone of proximal development [32], can assist them in autonomously carrying out certain activities in pursuit of a learning goal, making it possible for them to control and regulate their cognitive processes [29]. Scaffolding was the term used by Wood, Bruner, and Ross to describe support and control over the learning process [33]. Narrowing the subject down to CBLEs, many researchers have included scaffolding as a teaching strategy in virtual learning environments, and research findings highlight the specific beneficial role of dynamic adaptive scaffolding, as providing more accurately targeted support leads students to more effective use of learning strategies and to better learning gains [9,34,35].
One of the most popular forms of scaffolding in Virtual Learning Environments (VLEs) is prompts [36]. A prompt is any kind of stimulus presented to the learner to increase the likelihood of producing a response [37]. Prompts are very popular in the SRL research field, since they have been shown to be the most effective type of SRL training in CBLEs such as hypermedia, as a systematic review by Devolder, Van Braak, and Tondeur [15] concluded.
In this study, we implemented an Intelligent Tutoring System (ITS) designed to model, trace, and foster students’ self-regulated learning in CBLEs. The system provides conceptual, procedural, metacognitive, and strategic scaffolds through prompts provided by animated pedagogical agents. Pedagogical agents are virtual characters equipped with artificial intelligence in order to facilitate students’ learning processes in CBLEs [38]. They can be in various forms, from 2D characters to 3D human-like characters, with the latter being more effective in promoting learner engagement [39] and included in the current ITS. We used 3D human-like avatar technology and an indirect approach in which pedagogical agents track students’ SRL behavior and provide prompts on this basis [40,41,42].
This study examines the use of SRL strategies in two different groups of students when learning with an ITS: in a normative group of university students, where self-regulatory scaffolding is known to be beneficial, and in a university SLD group. Despite the ITS not being specifically designed for this purpose, our intention was to explore the gap in remedial action for this population, with the goal of promoting self-regulatory processes in CBLEs. Research has consistently documented SLDs’ deficits in deploying effective self-regulation. In current reviews of research on reading, writing, mathematics, and subject-area learning for SLDs [43], authors emphasize how these students struggle not just with “basic processing” problems, but also with the higher-order processes involved in self-regulation that are so essential to successful performance. When these students enter higher education, they face significant difficulties: impaired executive function hampers their planning, inhibition, and time organization and management [44,45,46]. They exhibit low self-regulation and self-efficacy [47,48,49], apply ineffective learning strategies [50], and hold harmful self-perceptions [51,52]. Remedial intervention with this group is possible, because research has shown that high-achieving students with disabilities compensate for their difficulties by applying self-regulated learning strategies [53] and could benefit significantly from remedial education provided through CBLEs [54]. Paradoxically, unlike at earlier educational levels and with younger students, there is hardly any evidence-based, self-regulatory scaffolding intervention for SLDs learning in CBLEs [55]; we therefore consider them a group of particular interest for the results of ITS implementation.
At the same time, a recent literature review found that most of the information collected about SLDs in higher education comes from interviews and self-report questionnaires [56]. Both techniques, although very valuable, are not sufficient to accurately assess self-regulation. The importance of scale and interview methodologies for measuring these processes is undeniable [57], but so are the associated problems of validity [57,58] and their incongruence with more innovative methods of assessment, such as the one used in this study, which is designed to assess learning as it unfolds.
Specifically, we intend to respond to the following research questions:
  • Does the Intelligent Tutoring System help students to self-regulate their learning process?
  • Does the Intelligent Tutoring System help SLDs to self-regulate their learning process even though it is not specifically designed for this purpose?
  • Additionally, is there any difference between Students with a Learning Disability (SLD) and Students with No Learning Disabilities (SNLD) in terms of the use of SRL strategies during learning with the ITS?
Based on previous research, we hypothesize that the ITS will increase the deployment of SRL strategies, but only in the group of SNLD, due to the additional self-regulatory demands that hypermedia virtual environments involve for students and because SLDs struggle with essential self-regulatory strategies in general.

2. Materials and Methods

2.1. Sample

The research was carried out using an experimental approach. A total of 119 higher education students voluntarily participated in the study. They were randomly assigned to the Experimental Group (N = 59) or the Control Group (N = 60). Almost two-thirds (65.5%) were women, and the remaining 34.5% were men. The mean age of the sample was 23.35 years (SD = 8.184), and the mean score on the university entrance exam was 8.432 (SD = 1.814). Most students in the sample were first-year undergraduates (58%), but there were also second-year (5.9%), third-year (12.6%), master’s (14.3%), Ph.D. (0.8%), and vocational training students (8.4%). The sample was spread across different knowledge areas: 64.7% corresponded to education, 16.8% to psychology, and 18.5% to a variety of subjects (economics, law, philosophy, nursing, telecommunications, electrical engineering, geomatics, physics, and merchant navy).
Since we were interested in the self-regulation of learning in students with learning disabilities, we included a subgroup of SLD (N = 9) to determine their particular profile. They were randomly assigned to the experimental condition (N = 5) or the control condition (N = 4).
These SLDs were recruited through the University Office for People with Specific Needs. Their learning disability diagnoses were confirmed using previous clinical reports and an additional assessment protocol that included a structured interview collecting biographical information along with the presence of symptoms related to learning disabilities referred to in the DSM-5 [59] (APA, 2013). Following that, we used the reference intellectual ability test, the WAIS-IV [60], in case we needed to apply exclusion criteria due to low intellectual functioning. Additionally, we used the PROLEC SE-Revised [61] to evaluate reading disabilities. We screened for symptoms of ADHD via the World Health Organization Adult ADHD Self-Report Scale [62], and finally, we included the Autism Spectrum Quotient (AQ-Short) [63], the short version of the reliable AQ-Adult [64], in the protocol.

2.2. Intelligent Tutoring System and Measures

A Spanish adaptation of MetaTutor [65] was used as both a hypermedia learning environment [66] and an assessment instrument [67,68] in the present study. This system allows observation of users’ deployment of metacognitive and cognitive strategies while they are learning and is part of a new trend in the measurement of SRL—the so-called third wave—which is characterized by the combined use of measurement and Advanced Learning Technologies [64]. This tool is based on extensive research by Azevedo and colleagues and overcomes the limitations of self-report methodology, making it possible to detect, model, trace, and foster students’ learning about different science topics [69,70,71,72].
The environment is composed of information in text, charts, and images, in which students have to learn about a complex scientific topic for approximately 90 min. In the beginning, students must set their own learning goals and subgoals out of seven possibilities with the help of a pedagogical agent. Three additional pedagogical agents help students through the learning session. The four agents each have specific tasks: planning, guiding the user through the session, prompting the use of learning strategies, and helping to manage users’ learning subgoals. The system log files record every action of the user’s interaction with the learning environment and the study variables are extracted from these logs.
In this study, we focused on SRL variables grouped into deep SRL strategies and surface SRL strategies. This division somewhat oversimplifies matters to two key aspects; however, it has been extensively used as a convenient shorthand in the literature, and, in particular, the deep–surface distinction has been found to be important in analyzing the learning approaches adopted and learning process outcomes [72]. Based on this distinction and on the involvement of metacognition and elaboration in student strategy implementation, we grouped the strategies into deep and surface strategies. Reading (opening and reading the content of a page for more than 15 s), re-reading (for example, opening and reading the same page twice), and taking notes (for instance, reading a page, opening the note-taking tool, and writing down the main ideas of the text) are considered surface strategies. We classified reading this way because it is a necessary but not sufficient strategy for accessing the learning content; re-reading because it has been shown to have minimal benefit for text comprehension; and taking notes because, in our data, the notes taken were limited to merely copying the learning content that students were viewing when they took them. Planning, prior knowledge activation, summarization, content evaluation, coordination of informational resources (text and diagram), inferences, judgments of learning, feelings of knowing, and monitoring progress towards goals are considered deep SRL strategies because they all contribute to a strategic approach to learning. These strategies let students take stock of what they already know, what they need to work on, and how best to approach learning new material. In Table 1, we describe each variable in our learning environment and provide an example:
Additionally, we considered strategies self-initiated and agent-initiated based on who asked to apply the strategy during the learning process; the student alone or the student prompted by the pedagogical agent.
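To make the grouping concrete, the reduction of the log files to these four counts (surface vs. deep, self- vs. agent-initiated) can be sketched as follows. This is a minimal illustration only: the event names are hypothetical and do not reflect MetaTutor’s actual log schema.

```python
from collections import Counter

# Illustrative mapping from logged actions to strategy depth.
# These event names are hypothetical, not MetaTutor's real schema.
SURFACE = {"read", "re_read", "take_notes"}
DEEP = {"planning", "prior_knowledge_activation", "summarization",
        "content_evaluation", "coordination_of_sources", "inference",
        "judgment_of_learning", "feeling_of_knowing", "monitor_toward_goals"}

def tally_strategies(log_events):
    """Count strategy uses by (depth, initiator) from (action, initiator)
    pairs, where initiator is 'self' or 'agent'."""
    counts = Counter()
    for action, initiator in log_events:
        if action in SURFACE:
            counts[("surface", initiator)] += 1
        elif action in DEEP:
            counts[("deep", initiator)] += 1
    return counts

events = [("read", "self"), ("summarization", "agent"),
          ("re_read", "self"), ("judgment_of_learning", "agent")]
print(tally_strategies(events))
```

A per-student tally of this kind yields the four continuous variables analyzed in the Results section.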

3. Methodology

3.1. Procedure

Participants attended two sessions carried out individually in the educational psychology laboratory.
In the first session, the researcher explained the ethical and confidentiality aspects of the study to each participant and asked them to read and sign the informed consent form. In addition, participants completed a sociodemographic questionnaire and a pretest on the hypermedia content. Moreover, for students who suspected or were aware of having an LD, the learning disabilities assessment protocol described in the sample section was applied.
In the second session, we reminded participants that the session would last approximately 2 h and that they were going to work in the learning environment while several devices recorded their performance throughout the session. Participants were randomly assigned to the experimental or control condition and completed the learning session, followed by the posttest. The learning session lasted between two and three hours: participants had to study for 90 min, but the timer paused whenever they applied self-regulation strategies. During the learning session, the agents guided the processes in both conditions, but only students in the experimental condition received prompts from the agents to use self-regulation strategies, together with adaptive feedback on their use.
As stated before, each of the four pedagogical agents plays a different role during the session: Guille, the guide, informs participants about the system characteristics and interface in order to help them navigate through the learning environment. Guille is also in charge of administering the pretest and posttest knowledge assessments and the self-report measures. Nora, the planner, helps learners set appropriate subgoals and manage them. Mery, the monitor, helps students assess their understanding of the content they read during the learning session (for instance, through judgments of learning or feeling-of-knowing expressions). Finally, Ortega, the strategizer, is in charge of supporting students’ use of learning strategies.
In the control condition, the activity of the agents was limited to responding to student-selected actions (depending on the type of action, one agent or another intervenes). In contrast, in the experimental condition, agents appear at the students’ request or on the basis of the adaptive rules embedded in the system. One example of the difference in treatment concerns the Goal Setting strategy: the system does not correct the goal choices of control group students, whereas for experimental group students it suggests working on goals corresponding to the pretest questions on which they scored lowest. Another example: when a student takes a quick look to monitor their progress towards their goals, the system provides feedback only for the experimental group, not the control group. The descriptive statistics for each group and variable studied are given in Table 2:
These rules provide adaptive scaffolding through the action of the pedagogical agents based on learners’ behavior and responses, since they are designed to scaffold students’ self-regulatory learning processes and understanding of content. In addition, by replying to student responses, the pedagogical agents—in the experimental condition only—provide students with immediate directive feedback about their SRL strategies.
Once a pedagogical agent finishes interacting with the student, it remains visible until a new interaction begins with either the same pedagogical agent or a different one. It is worth noting that learners cannot choose which pedagogical agent they want to interact with, as this is a consequence of the kind of actions they perform.

3.2. Study Design and Data Analysis

In this study, we analyzed the performance of continuous random variables (i.e., self-initiated surface SRL strategies, agent-initiated surface SRL strategies, self-initiated deep SRL strategies, and pedagogical agent-initiated deep SRL strategies) using a multivariate two-way factorial design (people with and without learning difficulties and people with and without prompts).
One of the most commonly used statistical procedures for examining the relationship between several response variables and one or more categorical predictor variables is multivariate analysis of variance (MANOVA). One of the limitations of MANOVA is that the ordinary least squares estimates, and therefore, subsequent hypothesis tests, are very sensitive to deviations of the underlying assumptions of multivariate normality and homogeneity of the variance–covariance matrices. Therefore, before applying MANOVA, we used tests based on multivariate skewness and kurtosis [73] and Box’s M test [74] for homogeneity of covariance matrices to examine the suitability of the analysis. Given that these assumptions were not satisfied by our data and sample sizes were very unequal across groups, we needed some robust procedures for hypothesis testing of the two main effects and the interaction effect on the combination of the four response variables of non-orthogonal multivariate two-way factorial design.
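As an illustration of this assumption-checking step, Mardia’s multivariate skewness and kurtosis tests can be computed directly from the data matrix. The following is a minimal numpy sketch on simulated data (the study itself used standard statistical software); the sample size and number of variables merely mirror the design described above.

```python
import numpy as np
from scipy import stats

def mardia(X):
    """Mardia's multivariate skewness and kurtosis tests (minimal sketch).
    Returns the p values of the skewness and kurtosis tests."""
    X = np.asarray(X, float)
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / n                    # biased covariance, as in Mardia (1970)
    D = Xc @ np.linalg.inv(S) @ Xc.T     # pairwise Mahalanobis cross-products
    b1 = (D ** 3).sum() / n ** 2         # multivariate skewness
    b2 = (np.diag(D) ** 2).mean()        # multivariate kurtosis
    skew_stat = n * b1 / 6               # ~ chi2 with p(p+1)(p+2)/6 df
    skew_df = p * (p + 1) * (p + 2) / 6
    p_skew = stats.chi2.sf(skew_stat, skew_df)
    z_kurt = (b2 - p * (p + 2)) / np.sqrt(8 * p * (p + 2) / n)
    p_kurt = 2 * stats.norm.sf(abs(z_kurt))
    return p_skew, p_kurt

rng = np.random.default_rng(0)
X = rng.normal(size=(119, 4))            # n and p mirror the study's design
p_skew, p_kurt = mardia(X)
print(f"Mardia skewness p = {p_skew:.3f}, kurtosis p = {p_kurt:.3f}")
```

Small p values on either test argue against multivariate normality, which is what motivated the robust MBF procedure below.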
In order to avoid the negative impact of the violation of the assumptions of MANOVA on multivariate test criteria, in this study, we used a multivariate version of the modified Brown–Forsythe (MBF) test statistic developed by Vallejo and Ato [75] to address the question of how to analyze several centroids or multivariate mean vectors. Practical implementation of the MBF procedure requires estimation of the degree of freedom of the approximate central multidimensional Wishart distribution, which can be easily derived by equating the first two moments of the quadratic form associated with the source of variation of interest in the multivariate linear model to those of the central Wishart distribution. In addition, to deal with missing observations in the response variables for one or more subjects, we used the combining rules developed by Vallejo and colleagues [76] for obtaining multiple imputation inferences.
On the other hand, semiparametric methods, such as generalized linear models (GLM), provide an attractive alternative to parametric methods when the response is bounded, data are missing at random, and the assumption of multivariate normality is deemed untenable [77]. A GLM with a binomial error structure was conducted to assess the ability of treatment conditions to predict the proportion of ocular fixations data (proportions being between 0.02 and 0.67). This method allows us to estimate and directly test the effects of the interaction between people with and without learning difficulties and people with and without help pop-ups (prompts).
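A binomial GLM with a logit link of the kind described here can be fitted by iteratively reweighted least squares (IRLS). The sketch below uses simulated data with a hypothetical design (intercept, LD status, treatment, and their interaction); it illustrates the model family only, not the study’s actual analysis, and every data value in it is made up.

```python
import numpy as np

def binomial_glm_irls(X, y, trials, n_iter=25):
    """Fit a binomial GLM with logit link by IRLS (minimal sketch).
    y: observed proportions; trials: number of trials per observation."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        mu = 1.0 / (1.0 + np.exp(-eta))        # inverse logit
        w = trials * mu * (1 - mu)             # IRLS working weights
        z = eta + (y - mu) / (mu * (1 - mu))   # working response
        WX = X * w[:, None]
        beta = np.linalg.solve(X.T @ WX, X.T @ (w * z))
    return beta

# Hypothetical design: intercept, LD status, treatment, interaction.
rng = np.random.default_rng(1)
n = 119
ld = rng.integers(0, 2, n)
treat = rng.integers(0, 2, n)
X = np.column_stack([np.ones(n), ld, treat, ld * treat])
true_beta = np.array([-1.0, 0.3, 0.5, -0.2])   # made-up coefficients
trials = np.full(n, 200)
p = 1 / (1 + np.exp(-(X @ true_beta)))
y = rng.binomial(trials, p) / trials           # simulated fixation proportions
print(binomial_glm_irls(X, y, trials).round(2))
```

The interaction coefficient plays the role of the LD-by-treatment effect tested in the Results section.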

4. Results

4.1. Preliminary Analyses

Table 3 provides basic descriptive statistics on the following four continuous performance variables concerning the use of SRL strategies for the total sample: self-initiated surface SRL strategies, agent-initiated surface SRL strategies, self-initiated deep SRL strategies, and agent-initiated deep SRL strategies. Table 2 shows descriptive statistics for the SNLD and SLD control and experimental groups. In general, the statistics in these tables suggest that the data collected on the use of SRL strategies may follow univariate normal distributions. An examination of the skewness and kurtosis statistics indicated that all values were within the range of ±2. In addition, the standard z scores for each measure were within ±3.5, indicating no extreme cases or outliers in the data. However, Mardia’s tests of multivariate skewness and multivariate kurtosis provided sufficient evidence of deviation from multivariate normality. Although not explicitly shown, using Box’s M test, the assumption of homogeneity of covariance matrices must also be rejected.
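The univariate screening criteria applied here (skewness and kurtosis within ±2, standardized z scores within ±3.5) can be expressed as a small check. The sketch below runs on deterministic toy data; only the thresholds come from the text above.

```python
import numpy as np
from scipy import stats

def univariate_screen(x, skew_kurt_bound=2.0, z_bound=3.5):
    """Check one measure against the screening rules described above:
    |skewness| and |excess kurtosis| <= 2, all |z| scores <= 3.5."""
    x = np.asarray(x, float)
    z = (x - x.mean()) / x.std(ddof=1)
    return {
        "skewness_ok": abs(stats.skew(x)) <= skew_kurt_bound,
        "kurtosis_ok": abs(stats.kurtosis(x)) <= skew_kurt_bound,  # excess kurtosis
        "no_outliers": bool(np.all(np.abs(z) <= z_bound)),
    }

# Deterministic toy data standing in for a strategy-use measure.
x = np.arange(119.0)
print(univariate_screen(x))
```

Such a screen is run per variable; the multivariate checks (Mardia, Box’s M) are still needed before MANOVA, as noted above.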

4.2. Principal Analyses

The results for use of SRL strategies, obtained with an SAS/IML program implementing the MBF procedure, are shown in Table 4. Our results showed a marginally significant multivariate main effect between the SLD group and the SNLD group for the four dependent variables considered simultaneously (F (4, 5.213) = 4.767; p = 0.0551). The definition of marginal significance raises some controversy in the statistical field; we have not been able to find any formal criterion in the literature for when a marginal p value is too large, though in our opinion, a range from p = 0.05 to p = 0.075 can be considered acceptable.
We also found that students in the Experimental Group were significantly different from the Control Group in the set of the four dependent variables related to the use of SRL strategies considered simultaneously (F (4, 5.213) = 11.461; p = 0.0087).
However, there was no evidence of a statistically significant multivariate interaction between the two independent variables (F (4, 5.213) = 3.525; p = 0.0956). Only as a trend, the differences between the SLD and SNLD groups tended to be smaller in the experimental condition than in the control condition. From a univariate perspective, the only statistically significant interaction occurred when the dependent variable was agent-initiated surface SRL strategies. Specifically, in the SNLD group, the performance of subjects receiving prompts was similar to that of subjects not receiving prompts, whereas in the SLD group, subjects receiving prompts performed better than those not receiving prompts.
Table 4 displays the adjusted (for bias) multivariate ω2 as the effect size estimate for the MANOVA. Grissom and Kim explain how to compute the multivariate ω2 [78]. According to Cohen [79], the effect size estimates for the effects of interest in this study were moderate (LD: SLD vs. SNLD) to large (Treatment: Control Group vs. Experimental Group).
Since the interaction was not significant, we focused our interpretation on the main effects. Consequently, the next step was a univariate MBF analysis to examine whether the change differed for the two main effects (LD status and treatment) across each dependent variable of the adjusted additive model.
Table 5 shows that there were significant differences between SLD and SNLD groups in the use of agent-initiated surface SRL strategies (F (1, 5.50) = 7.62; p = 0.0348) and self-initiated deep SRL strategies (F (1, 5.56) = 7.37; p = 0.0376), with a moderate effect size in both cases. However, there were no significant differences between SLD and SNLD groups in the use of self-initiated surface strategies (F (1, 5.71) = 0.046; p = 0.8367) and agent-initiated deep SRL strategies (F (1, 4.33) = 0.035; p = 0.8595). Comparisons between experimental and control groups showed significant differences in the use of agent-initiated surface SRL strategies (F (1, 5.50) = 28.87; p = 0.0022) and agent-initiated deep SRL strategies (F (1, 4.33) = 69.00; p = 0.0008), with a large effect size in both cases. There were no statistically significant differences between experimental and control groups in the use of self-initiated surface SRL strategies (F (1, 5.71) = 0.14; p = 0.7192) and self-initiated deep SRL strategies (F (1, 5.56) = 1.66; p = 0.2481).
Finally, Table 6 shows the GLM results obtained using SAS Proc Glimmix for the ocular fixations variable. The p values in Table 6 show that neither the main effects (learning difficulties and treatment group) nor the interaction was statistically significant at the 0.05 level.

5. Discussion

This study aims to contribute to SRL research by exploring the use of different kinds of SRL strategies in students with different characteristics. Deekens and colleagues concluded in their literature review that research in the area has mainly focused on the predictive capacity of these processes in isolation, ignoring their interrelationships [80].
The literature has shown how “the depth of strategy learners employ, ranging from surface strategies (e.g., re-reading) to more complex deep strategies (e.g., knowledge elaboration), is also predictive of learning outcomes across contexts and academic domains” [80] (p. 64). Greene and colleagues obtained results suggesting that the depth of strategy use predicts different academic results, with surface strategy use corresponding to poorer results [81]. Similarly, Dinsmore and Alexander found that applying deeper strategies, such as prior knowledge activation, is more effective than applying surface strategies such as re-reading [82].
The analysis performed for this study examined the use of surface and deep SRL strategies in university students with and without LD. Additionally, we also compared the results from the Experimental Group to the Control Group. The data suggest that the differences we saw in the comparisons between groups (SLD vs. SNLD and EG vs. CG) occurred differently according to the level of the strategies used (Deep vs. Surface), or whether the use of the strategies was with or without ITS scaffolding.
In general terms, we saw that when surface strategies were used, the differences between SLD and SNLD groups were only significant when their use depended on the help of pedagogical agents, but not if they were self-initiated. SNLD deployed more agent-initiated surface strategies (moderate effect). Something similar happened in the comparison between Experimental and Control groups. The differences were only significant when they were related to agent-initiated surface strategies, with the Experimental Group demonstrating a higher rate (large effect).
The picture changed when it came to the use of deep SRL strategies. In this case, the differences were statistically significant, and of moderate size, between SLD and SNLD only when the behavior was self-initiated (without the help of the pedagogical agents), with more deep SRL strategies displayed by the SNLD group. These differences were non-existent when it was the external agent initiating and directing the use of the strategies. This means that the training provided by MetaTutor helps SLD students to develop a study process similar to SNLD in terms of using a high number of deep SRL strategies. In contrast, when the Experimental Group and the Control Group were compared, there were differences only when pedagogical agent help was available (the size of the effect was large), but not when this help was not available. This means that the actions of pedagogical agents are useful in promoting deep SRL strategy use, as the Experimental Group deployed more deep SRL strategies.
Addressing the three initial research questions: First, is the Intelligent Tutoring System helping students to self-regulate their learning process? The results show that students in the experimental condition deployed more SRL strategies, so the answer to this first question is yes. Similar results have been obtained using the same software in different educational contexts (universities in North America) [41,83,84,85,86]. In addition, researchers testing other tools have verified that SRL training through CBLEs is effective in increasing and improving SRL strategy use [3,87,88,89,90,91,92,93,94].
Our second question was “Is the ITS helping SLD to self-regulate their learning process even though it is not specifically designed for this purpose?” The results show that SLD increased their use of deep SRL strategies when prompted by the pedagogical agents, so the answer to this second question is also yes. These results agree with those from Reed and colleagues [51], who developed a preparation course for first-year students with and without LD (a course that, likewise, was not specifically designed for students with LD) and compared the results from SLD with those from SNLD. They found that both groups benefited from the intervention, showing increased attentiveness and increased academic and general resourcefulness after the course.
Our last question was “Is there any difference between SLD and SNLD in terms of use of SRL strategies during learning with an Intelligent Tutoring System?” We found that SLD used more surface strategies but fewer self-initiated deep SRL strategies than SNLD. Our results agree with findings from Chevalier and colleagues, who compared metacognitive study and learning strategy use in SLD vs. SNLD; those authors found that the two groups had different profiles of strategy use, which were predictive of their GPA [53]. SLD demonstrated lower strategy use and less metacognitive study than SNLD. Similar results were found by Andreassen and colleagues, who compared students with and without dyslexia, finding that students with dyslexia used more visual and social strategies, and found them more useful, than students without dyslexia [47]. Nevertheless, in our study, when prompted, SLD used more deep SRL strategies, engaging in a learning process more similar to that of SNLD. Other studies have shown that SLD face difficulties in reading, writing, processing information, and organizing content, so deploying surface learning strategies such as reading or re-reading is not enough to learn particular content [95]. SLD students need to develop other cognitive and metacognitive learning strategies; an intervention designed to foster deep learning strategies therefore helps SLD engage in a more complex learning process that leads them towards better academic achievement.
Finally, we must reject our initial hypothesis: contrary to our expectation that the Intelligent Tutoring System would increase the use of SRL strategies only in the SNLD group, it also increased the use of SRL strategies in the SLD group. In this regard, findings from Reaser and colleagues [96] and Chatzara and colleagues [97] also support the idea that SLD students can benefit greatly from training in the development of SRL processes.

6. Conclusions

We can draw two main conclusions from the results discussed above. On the one hand, the software is effective at providing SRL scaffolding, as it led participants in the experimental condition to use more SRL strategies than participants in the control condition. This agrees with results from other authors [98,99,100] and supports Graesser and colleagues’ conclusion that, in order to develop effective SRL strategies, most students need some kind of scaffolding [42].
On the other hand, our results agree with other studies showing the suitability of using prompts to support and promote SRL processes in VLEs. Prompts have proven to be the most effective SRL training method in VLEs [15]. Many other authors have successfully used them in different ways, for instance Moos and Bonde, who embedded SRL prompts in videos in a flipped classroom context [99]. Other authors, such as Bannert and colleagues, gave students the chance to design their own metacognitive prompts before learning in a hypermedia learning environment [100]. They found that students in the experimental condition visited relevant pages more often and devoted more time to those pages than students in the control condition.
In our study, the SRL training was provided by human-like pedagogical agents, which led students in the experimental condition to apply more SRL strategies than students in the control condition, not only in response to prompts but also on their own initiative (self-initiated SRL strategies). This agrees with results from Azevedo and colleagues, who, using a CBLE that supported SRL while learning, found that participants in the external regulatory condition applied more, and more diverse, SRL strategies than participants in the SRL condition [28]; participants in the externally regulated condition were prompted to use SRL strategies, whereas those in the SRL condition were not. In a similar study using the same CBLE, Azevedo and colleagues also showed that participants in the externally regulated condition had greater learning efficacy than participants in the self-regulated condition [40]. Nonetheless, although these kinds of prompts have been shown to be effective, other authors believe that combining this kind of scaffolding with other types (for instance, procedural scaffolds or metacognitive feedback) could lead to better results [15].
To summarize, our intervention seems to be appropriate for SNLD students (first conclusion).
In addition, the evidence presented leads us to state that this kind of learning environment is even more helpful for SLD (second conclusion). Our results show that when SLD have tools that facilitate applying SRL strategies, they do so even more than SNLD. This result is all the more notable given the findings of Goroshit and colleagues, who, in the context of face-to-face education, found lower levels of self-regulation in SLD [48]. Along similar lines, Andreassen and colleagues, in a study with a sample of 34 SLD and 34 SNLD, used web-based diaries to record study activities and SRL behaviors [47]. Their results showed a restricted repertoire of SRL strategies used by SLD compared with SNLD.
Those authors worked in traditional learning environments, which leads us to suspect that CBLEs can be helpful for SLD who are digitally literate; in our study, these students increased their use of deep SRL strategies with the help of the pedagogical agents. In this regard, as highlighted by Comby, Standen, and Brown [101], CBLEs have three characteristics that make them particularly suitable for SLD: 1. They allow students to make mistakes without public consequences; 2. Students can manipulate the learning environment in a way that is not possible in traditional learning environments; and 3. The rules of the system can be inferred without symbolic systems.
Other authors have obtained results similar to ours. For instance, Erikson and Larwin carried out a meta-analysis of teaching methods for SLD [102]. Comparing online methods with traditional methods, they found that students perform significantly better in online learning environments than with offline methods. In addition, Chatzara and colleagues examined the influence of virtual pedagogical agents on SLD [97], carrying out a study which found that students in the experimental condition achieved better results in learning assessments than students in the control condition.
Since authors such as Reed and colleagues have shown that SLD often feel unprepared for higher education, remedial actions such as ITSs that promote SRL are strongly recommended [51].

7. Limitations and Future Research Directions

The results of this study must be considered in light of its limitations, the main one being that the sample of SLD was quite small. To recruit the SLD sample, we contacted the University Office for People with Specific Needs, which collaborated on the project by informing students in its SLD database about the study. In addition, we posted leaflets advertising the study in most university buildings. Although several attempts were made to increase the response rate, both samples (SLD and SNLD) remained small. We therefore offered participation in the study in exchange for course credit in two subjects in two different study programs: Educational Psychology in the Teaching program and Social Development in the Psychology program. This allowed us to collect a moderate sample, but it included few SLD.
It would also be very interesting to broaden the variables studied. Motivation, self-efficacy, approach to learning, personal epistemology, prior knowledge of the learning content, and other variables have been shown to play a role in the deployment of SRL strategies. In the near future, we plan to extend the scope of the study to gain a broader perspective on the use of SRL strategies during learning with an ITS.
Our findings have some implications for future research. On the one hand, it is necessary to run the same studies with larger samples (both SLD and SNLD), so that the results presented here can be compared with a more heterogeneous group, improving generalizability. On the other hand, similar studies are also needed in other contexts, including other academic domains (such as law, languages, and engineering), to assess whether students’ performance is domain-general or domain-specific.

Author Contributions

Conceptualization, J.C.N., R.C., M.E., M.S.-S.; data curation, M.E., M.S.-S.; formal analysis, G.V., J.C.N.; funding acquisition, J.C.N., M.E.G.; investigation, J.C.N., R.C., M.E., M.S.-S.; methodology, J.C.N., G.V.; project administration, J.C.N.; resources, J.C.N., G.V., R.C., M.E., M.S.-S.; software, IBM; supervision, J.C.N., R.C.; validation, J.C.N., G.V., R.C., M.E., M.S.-S.; visualization, J.C.N., G.V., R.C., M.E., M.S.-S.; writing—original draft, J.C.N., G.V., R.C., M.E., M.S.-S.; writing—review and editing, R.C., M.E. and J.C.N. All authors have read and agreed to the published version of the manuscript.

Funding

This research was performed thanks to funds from the Spanish Ministry of Education [EDU2014-57571-P]; the Spanish Ministry of Economy and Competitiveness [BES-2015-072470]; the Ministry of Science and Innovation I+D+i [PID2019-107201GB-100]; and the European Union on behalf of the Principality of Asturias [FC-GRUPIN-IDI/2018/000199].

Conflicts of Interest

The authors declare no conflict of interest.

Ethics Statement

This research was carried out in accordance with international protocols for scientific research, in particular the requirements of the Declaration of Helsinki for research with human beings and Organic Law 3/2018, of 5 December, on the Protection of Personal Data and the guarantee of digital rights. In addition, we had the explicit permission of each participant to use their data for scientific research, with their anonymity and confidentiality assured. No ethics committee existed at our university before this research was undertaken.

References

  1. Burbules, N. Ubiquitous learning and the future of teaching. Encount. Educ. 2012, 13, 3–14. [Google Scholar] [CrossRef] [Green Version]
  2. Gaebel, M.; Kupriyanova, V.; Morais, R.; Colucci, E. E-Learning in European Higher Education Institutions: Results of a Mapping Survey Conducted in October-December 2013, 1st edition. Eur. Univ. Assoc. 2014, 1–92. [Google Scholar]
  3. Cerezo, R.; Esteban, M.; Sánchez-Santillán, M.; Núñez, J.C. Procrastinating Behavior in Computer-Based Learning Environments to Predict Performance: A Case Study in Moodle. Front. Psychol. 2017, 8, 1403. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  4. Fernández-Pampillón, A.M. Las Plataformas E-Learning Para la Enseñanza y el Aprendizaje Universitario en Internet. In Las Plataformas de Aprendizaje. Del Mito a la Realidad, 1st ed.; López, M.C., Matesanz, M., Eds.; Biblioteca Nueva: Madrid, España, 2009; pp. 45–73. ISBN 978-84-9742-944-3. [Google Scholar]
  5. Parker, D.R.; Banerjee, M. Leveling the digital playing field: Assessing the learning technology needs of college-bound students with LD and/or ADHD. Assess. Eff. Interv. 2007, 33, 5–14. [Google Scholar] [CrossRef]
  6. Chakraborty, M.; Nafukho, F.M. Strategies for Virtual Learning Environments: Focusing on Teaching Presence and Teaching Immediacy. Internet Learn. J. 2015, 4, 8–37. [Google Scholar] [CrossRef]
  7. Martín Hernández, A. Conceptos en La Formación Sin Distancia, 1st ed.; Servicio Público de Empleo Estatal: Madrid, Spain, 2006. [Google Scholar]
  8. Azevedo, R. Defining and measuring engagement and learning in science: Conceptual, theoretical, methodological, and analytical issues. Educ. Psychol. 2015, 50, 84–94. [Google Scholar] [CrossRef]
  9. López, O.; Hederich, C.; Camargo, Á. Logro de aprendizaje en ambientes hipermediales: Andamiaje autorregulador y estilo cognitivo. Revista Latinoamericana de Psicología 2012, 44, 13–26. [Google Scholar]
  10. Järvelä, S.; Hadwin, A.F. New frontiers: Regulating learning in CSCL. Educ. Psychol. 2013, 48, 25–39. [Google Scholar] [CrossRef]
  11. Sánchez-Santillán, M.; Paule-Ruiz, M.; Cerezo, R.; Alvarez-García, V. MeL: Modelo de adaptación dinámica del proceso de aprendizaje en eLearning. Anales de Psicología 2016, 32, 106–114. [Google Scholar] [CrossRef] [Green Version]
  12. Azevedo, R.; Aleven, V. International Handbook of Metacognition and Learning Technologies; Springer: Amsterdam, The Netherlands, 2013; ISBN 978-1-4419-5545-6. [Google Scholar] [CrossRef]
  13. Bogarín, A.; Cerezo, R.; Romero, C. Discovering learning processes using inductive miner: A case study with learning management systems (LMSs). Psicothema 2018, 30, 322–329. [Google Scholar] [CrossRef]
  14. Cerezo, R.; Sánchez-Santillán, M.; Paule-Ruiz, M.P.; Núñez, J.C. Students’ LMS interaction patterns and their relationship with achievement: A case study in higher education. Comp. Educ. 2016, 96, 42–54. [Google Scholar] [CrossRef]
  15. Devolder, A.; van Braak, J.; Tondeur, J. Supporting self-regulated learning in computer-based learning environments: Systematic review of effects of scaffolding in the domain of science education. J. Comp. Assist. Learn. 2012, 28, 557–573. [Google Scholar] [CrossRef]
  16. Gómez, C.; Fernández, M.E.; Cerezo, R.; Núñez, J.C. Learning Disabilities in Higher Education: A Challenge for the University (Dificultades de Aprendizaje en Educación Superior: Un reto Para la Comunidad Universitaria); Publicaciones de la Facultad de Educación y Humanidades del Campus de Melilla: Melilla, España, 2018; pp. e2530–9269. [Google Scholar]
  17. Crane, N.; Zusho, A.; Ding, Y.; Cancelli, A. Domain-specific metacognitive calibration in children with learning disabilities. Contemp. Educ. Psychol. 2017, 50, 72–79. [Google Scholar] [CrossRef]
  18. Harris, K.R.; Reid, R.R.; Graham, S. Self-regulation among students with LD and ADHD. In Learning about learning disabilities, 1st ed.; Wong, B., Ed.; Academic Press: Cambridge, MA, USA, 2004; pp. 167–195. ISBN 9780123884145. [Google Scholar]
  19. National Joint Committee on Learning Disabilities. Collective Perspectives on Issues Affecting Learning Disabilities, 1st ed.; PRO-ED: Austin, TX, USA, 1994. [Google Scholar]
  20. Sawyer, A.C.; Williamson, P.; Young, R. Metacognitive processes in emotion recognition: Are they different in adults with Asperger’s disorder? J. Autism Dev. Disord. 2014, 44, 1373–1382. [Google Scholar] [CrossRef]
  21. Faggella-Luby, M.; Gelbar, N.; Dukes, L., III; Madaus, J.; Lalor, A.; Lombardi, A. Learning Strategy Instruction for College Students with Disabilities: A Systematic Review of the Literature. J. Postsecond. Educ. Disabil. 2019, 32, 63–81. [Google Scholar]
  22. Gokool-Baurhoo, N.; Asghar, A. I can’t tell you what the learning difficulty is: Barriers experienced by college science instructors in teaching and supporting students with learning disabilities. Teach. Teach. Educ. 2019, 79, 17–27. [Google Scholar] [CrossRef]
  23. Hadley, W. Students with Learning Disabilities Transitioning from College: A One-Year Study. Coll. Stud. J. 2018, 52, 421–430. [Google Scholar]
  24. Godovnikova, L.V.; Gerasimova, A.S.; Galchun, Y.V.; Shitikova, E.V. The Competency Levels of Disabled Students Who Study in University. Cypriot J. Educ. Sci. 2019, 14, 99–110. [Google Scholar] [CrossRef] [Green Version]
  25. Rouhani, Y.; Nafchi, A.M.; Ziaee, S.M. Applying different interventions to teach writing to students with disabilities: A review study. Theor. Pract. Lang. Stud. 2016, 6, 733–741. [Google Scholar]
  26. Ferrari, M. A comparative assessment of the cognitive functioning of adults with childhood histories of learning disability and adults with noncognitive disorders. J. Dev. Phys. Disabil. 2009, 21, 355–368. [Google Scholar] [CrossRef]
  27. European Commission. New Modes of Learning and Teaching in Higher Education, 1st ed.; European Union: Luxembourg, Luxembourg, 2014; ISBN 978-92-79-39789-9. [Google Scholar]
  28. Azevedo, R.; Jacobson, M.J. Advances in scaffolding learning with hypertext and hypermedia: A summary and critical analysis. Educ. Technol. Res. Dev. 2008, 56, 93–100. [Google Scholar] [CrossRef]
  29. Huertas, A.; Vesga, G.; Vergara, A.; Romero, M. Effect of a computational scaffolding in the development of secondary students’ metacognitive skills. Int. J. Technol. Enhanc. Learn. 2015, 7, 143–159. [Google Scholar] [CrossRef]
  30. Grothérus, A.; Jeppsson, F.; Samuelsson, J. Formative Scaffolding: How to alter the level and strength of self-efficacy and foster self-regulation in a mathematics test situation. Educ. Act. Res. 2019, 27, 667–690. [Google Scholar] [CrossRef]
  31. Molenaar, I.; Roda, C.; van Boxtel, C.; Sleegers, P. Dynamic scaffolding of socially regulated learning in a computer-based learning environment. Comp. Educ. 2012, 59, 515–523. [Google Scholar] [CrossRef]
  32. Vigotsky, L. Pensamiento y Lenguaje. In Teoría Del Desarrollo Cultural de las Funciones Psíquicas, 1st ed.; Plévade: Buenos Aires, Argentina, 1983; ISBN 9788449329357. [Google Scholar]
  33. Wood, D.; Bruner, J.S.; Ross, G. The role of tutoring in problem solving. J. Child Psychol. Psych. 1976, 17, 89–100. [Google Scholar] [CrossRef]
  34. Poitras, E.; Mayne, Z.; Huang, L.; Udy, L.; Lajoie, S. Scaffolding student teachers’ information-seeking behaviours with a network-based tutoring system. J. Comp. Assist. Learn. 2019, 35, 731–746. [Google Scholar] [CrossRef]
  35. Wu, C.H.; Chen, Y.S.; Chen, T.G. An adaptive e-learning system for enhancing learning performance: Based on dynamic scaffolding theory. Eurasia J. Mathem. Sci. Technol. Educ. 2017, 14, 903–913. [Google Scholar] [CrossRef]
  36. Daumiller, M.; Dresel, M. Supporting self-regulated learning with digital media using motivational regulation and metacognitive prompts. J. Exp. Educ. 2019, 87, 161–176. [Google Scholar] [CrossRef] [Green Version]
  37. Pieger, E.; Bannert, M. Differential effects of students’ self-directed metacognitive prompts. Comp. Human Behav. 2018, 86, 165–173. [Google Scholar] [CrossRef]
  38. Martha, A.S.D.; Santoso, H.B. The Design and Impact of the Pedagogical Agent: A Systematic Literature Review. J. Educ. Online 2019, 16, n1. [Google Scholar] [CrossRef]
  39. Schroeder, N.L.; Traxler, A.L. Humanizing instructional videos in physics: When less is more. J. Sci. Educ. Technol. 2017, 26, 269–278. [Google Scholar] [CrossRef]
  40. Azevedo, R.; Landis, R.S.; Feyzi-Behnagh, R.; Duffy, M.; Trevors, G.; Harley, J.M.; Yeasin, M. The Effectiveness of Pedagogical Agents’ Prompting and Feedback in Facilitating Co-Adapted Learning with MetaTutor. In International Conference on Intelligent Tutoring Systems; Cerri, S.A., Clancey, W.J., Papadourakis, G., Panourgia, K., Eds.; Springer: Berlin, Germany, 2012; pp. 212–221. [Google Scholar] [CrossRef] [Green Version]
  41. Azevedo, R.; Martin, S.A.; Taub, M.; Mudrick, N.V.; Millar, G.C.; Grafsgaard, J.F. Are pedagogical agents’ external regulation effective in fostering learning with intelligent tutoring systems. In International Conference on Intelligent Tutoring Systems; Micarelli, A., Stamper, J., Panourgia, K., Eds.; Springer Cham: Zagreb, Croatia, 2016; pp. 197–207. [Google Scholar] [CrossRef]
  42. Graesser, A.; McNamara, D. Self-regulated learning in learning environments with pedagogical agents that interact in natural language. Educ. Psychol. 2010, 45, 234–244. [Google Scholar] [CrossRef]
  43. Wong, B.Y.L.; Butler, D.L. Learning About Learning Disabilities, 4th ed.; Elsevier Academic Press: Amsterdam, The Netherlands, 2012; ISBN 9780123884145. [Google Scholar]
  44. Grinblat, N.; Rosenblum, S. Why are they late? Timing abilities and executive control among students with learning disabilities. Res. Dev. Disabil. 2016, 59, 105–114. [Google Scholar] [CrossRef]
  45. Sharfi, K.; Rosenblum, S. Executive Functions, Time Organization and Quality of Life among Adults with Learning Disabilities. PLoS ONE 2016, 11, e0166939. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  46. Roth, R.M.; Isquith, P.K.; Gioia, G.A. Behavior Rating Inventory of Executive Function—Adult Version (BRIEF-A), 1st ed.; Psychological Assessment Resources: Lutz, FL, USA, 2005. [Google Scholar]
  47. Andreassen, R.; Jensen, M.S.; Bråten, I. Investigating self-regulated study strategies among postsecondary students with and without dyslexia: A diary method study. Read. Writ. 2017, 30, 1891–1916. [Google Scholar] [CrossRef]
  48. Goroshit, M.; Hen, M. Academic procrastination and academic performance: Do learning disabilities matter? Curr. Psychol. 2019, 1–9. [Google Scholar] [CrossRef]
  49. Klassen, R.M. Using predictions to learn about the self-efficacy of early adolescents with and without learning disabilities. Contemp. Educ. Psychol. 2007, 32, 173–187. [Google Scholar] [CrossRef]
  50. Heiman, T.; Fichten, C.S.; Olenik-Shemesh, D.; Keshet, N.S.; Jorgensen, M. Access and perceived ICT usability among students with disabilities attending higher education institutions. Educ. Inform. Technol. 2017, 22, 2727–2740. [Google Scholar] [CrossRef]
  51. Reed, M.J.; Kennett, D.J.; Lewis, T.; Lund-Lucas, E. The relative benefits found for students with and without learning disabilities taking a first-year university preparation course. Act. Learn. High. Educ. 2011, 12, 133–142. [Google Scholar] [CrossRef]
  52. Heiman, T.; Precel, K. Students with learning disabilities in higher education: Academic strategies profile. J. Learn. Disabil. 2003, 36, 248–258. [Google Scholar] [CrossRef] [Green Version]
  53. Chevalier, T.M.; Parrila, R.; Ritchie, K.C.; Deacon, S.H. The role of metacognitive reading strategies, metacognitive study and learning strategies, and behavioral study and learning strategies in predicting academic success in students with and without a history of reading difficulties. J. Learn. Disabil. 2017, 50, 34–48. [Google Scholar] [CrossRef]
  54. Brown, D.J.; Standen, P.J.; Proctor, T.; Sterland, D. Advanced design methodologies for the production of virtual learning environments for use by people with learning disabilities. Presence Teleoperat. Virt. Environ. 2001, 10, 401–415. [Google Scholar] [CrossRef]
  55. Lajoie, S.P. Metacognition, self regulation, and self-regulated learning: A rose by any other name? Educ. Psychol. Rev. 2008, 20, 469–475. [Google Scholar] [CrossRef] [Green Version]
  56. Santos, C.G.; Fernández, E.; Cerezo, R.; Núñez, J.C. Dificultades de aprendizaje en Educación Superior: un reto para la comunidad universitaria. Publicaciones 2018, 48, 63–75. [Google Scholar] [CrossRef]
  57. Pike, G.R.; Kuh, G.D. A typology of student engagement for American colleges and universities. Res. High. Educ. 2005, 46, 185–209. [Google Scholar] [CrossRef]
  58. American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders, 5th ed.; American Psychiatric Association: Washington, DC, USA, 2013. [Google Scholar] [CrossRef]
  59. Theiling, J.; Petermann, F. Neuropsychological profiles on the WAIS-IV of adults with ADHD. J. Attent. Disord. 2016, 20, 913–924. [Google Scholar] [CrossRef]
  60. Cuetos, F.; Arribas, D.; Ramos, J.L. Prolec-SE-R, Batería Para la Evaluación de los Procesos Lectores en Secundaria y Bachillerato—Revisada, 1st ed.; TEA: Madrid, Spain, 2016. [Google Scholar]
  61. Kessler, R.C.; Adler, L.; Ames, M.; Demler, O.; Faraone, S.; Hiripi, E.; Howes, M.J.; Jin, R.; Secnik, K.; Spencer, T.; et al. The World Health Organization Adult ADHD Self-Report Scale (ASRS): A short screening scale for use in the general population. Psychol. Med. 2005, 35, 245–256. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  62. Murray, A.L.; Booth, T.; McKenzie, K.; Kuenssberg, R. What range of trait levels can the Autism-Spectrum Quotient (AQ) measure reliably? An item response theory analysis. Psychol. Assessm. 2016, 28, 673–683. [Google Scholar] [CrossRef] [Green Version]
  63. Baron-Cohen, S.; Wheelwright, S.; Skinner, R.; Martin, J.; Clubley, E. The autism-spectrum quotient (AQ): Evidence from Asperger syndrome/high-functioning autism, males and females, scientists and mathematicians. J. Autism Dev. Disord. 2001, 31, 5–17. [Google Scholar] [CrossRef]
  64. Azevedo, R.; Witherspoon, A.; Chauncey, A.; Burkett, C.; Fike, A. MetaTutor: A MetaCognitive Tool for Enhancing Self-Regulated Learning. In 2009 AAAI Fall Symposium Series; AAAI: Arlington, VA, USA, 2009. [Google Scholar]
  65. Azevedo, R.; Johnson, A.; Chauncey, A.; Burkett, C. Self-regulated learning with MetaTutor: Advancing the science of learning with MetaCognitive tools. In New science of learning, 1st ed.; Khine, M., Saleh, I., Eds.; Springer: New York, NY, USA, 2010; pp. 225–247. [Google Scholar] [CrossRef]
  66. Greene, J.A.; Azevedo, R. The measurement of learners’ self-regulated cognitive and metacognitive processes while using computer-based learning environments. Educ. Psychol. 2010, 45, 203–209. [Google Scholar] [CrossRef]
  67. Harley, J.M.; Bouchet, F.; Papaioannou, N.; Carter, C.; Trevors, G.J.; Feyzi-Behnagh, R.; Landis, R.S. Assessing learning with MetaTutor: A Multi-Agent Hypermedia Learning Environment. In Proceedings of the annual meeting of the American Educational Research Association, Philadelphia, PA, USA, 3–7 April 2014. [Google Scholar]
  68. Panadero, E.; Klug, J.; Järvelä, S. Third wave of measurement in the self-regulated learning field: When measurement and intervention come hand in hand. Scand. J. Educ. Res. 2016, 60, 723–735. [Google Scholar] [CrossRef]
  69. Azevedo, R. Theoretical, methodological, and analytical challenges in the research on metacognition and self-regulation: A commentary. Metacognit. Learn. 2009, 4, 87–95. [Google Scholar] [CrossRef]
  70. Feyzi-Behnagh, R.; Trevors, G.; Bouchet, F.; Azevedo, R. Aligning Multiple Sources of SRL Data in MetaTutor: Towards Interactive Scaffolding in Multi-Agent Systems. In Proceedings of the 18th biennial meeting of the European Association for Research on Learning and Instruction (EARLI), Munich, Germany, 27–31 August 2013. [Google Scholar]
  71. Beattie IV, V.; Collins, B.; McInnes, B. Deep and surface learning: A simple or simplistic dichotomy? Account. Educ. 1997, 6, 1–12. [Google Scholar] [CrossRef]
  72. Mardia, K.V. Measures of multivariate skewness and kurtosis with applications. Biometrika 1970, 57, 519–530. [Google Scholar] [CrossRef]
  73. Box, G.E.P. A general distribution theory for a class of likelihood criteria. Biometrika 1949, 36, 317–346. [Google Scholar] [CrossRef]
  74. Vallejo, G.; Ato, M. Robust tests for multivariate factorial designs under heteroscedasticity. Behav. Res. Methods 2012, 44, 471–489. [Google Scholar] [CrossRef] [Green Version]
  75. Vallejo, G.; Fernández, M.P.; Livacic-Rojas, P.E.; Tuero-Herrero, E. Data analysis of incomplete repeated measures using a multivariate extension of the Brown-Forsythe procedure. Psicothema 2018, 30, 434–441. [Google Scholar] [CrossRef]
  76. Vallejo, G.; Fernández, M.P.; Livacic-Rojas, P.E.; Tuero-Herrero, E. Comparison of modern methods for analyzing unbalanced repeated measures data with missing values. Multivar. Behav. Res. 2011, 46, 900–937. [Google Scholar] [CrossRef]
  77. Grissom, R.J.; Kim, J.J. Effect Sizes for Research: Univariate and Multivariate Applications, 2nd ed.; Routledge: New York, NY, USA, 2012. [Google Scholar]
  78. Cohen, J. Statistical Power for the Behavioral Sciences, 2nd ed.; Erlbaum: Hillsdale, NJ, USA, 1988; pp. 1–579. ISBN 0-8058-0283-5. [Google Scholar]
  79. Deekens, V.M.; Greene, J.A.; Lobczowski, N.G. Monitoring and depth of strategy use in computer-based learning environments for science and history. Brit. J. Educ. Psychol. 2018, 88, 63–79. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  80. Greene, J.A.; Bolick, C.M.; Jackson, W.P.; Caprino, A.M.; Oswald, C.; McVea, M. Domain specificity of self-regulated learning processing in science and history. Contemp. Educ. Psychol. 2015, 42, 111–128. [Google Scholar] [CrossRef] [Green Version]
  81. Dinsmore, D.L.; Alexander, P.A. A multidimensional investigation of deep-level and surface-level processing. J. Exp. Educ. 2016, 84, 213–244. [Google Scholar] [CrossRef]
  82. Bouchet, F.; Harley, J.M.; Azevedo, R. Can Adaptive Pedagogical Agents’ Prompting Strategies Improve Students’ Learning and Self-Regulation. In 13th International Conference on Intelligent Tutoring Systems; Micarelli, A., Stamper, J., Panourgia, K., Eds.; Springer Cham: Zagreb, Croatia, 2016; pp. 368–374. [Google Scholar] [CrossRef]
  83. Duffy, M.C.; Azevedo, R. Motivation matters: Interactions between achievement goals and agent scaffolding for self-regulated learning within an intelligent tutoring system. Comp. Human Behav. 2015, 52, 338–348. [Google Scholar] [CrossRef] [Green Version]
  84. Lallé, S.; Conati, C.; Azevedo, R. Prediction of student achievement goals and emotion valence during interaction with pedagogical agents. In Proceedings of the 17th International Conference on Autonomous Agents and MultiAgent Systems; Andre, E., Koenig, S., Eds.; International Foundation for Autonomous Agents and Multiagent Systems: Ritchland, WA, USA, 2018; pp. 1222–1231. [Google Scholar]
  85. Taub, M.; Mudrick, N.V.; Rajendran, R.; Dong, Y.; Biswas, G.; Azevedo, R. How Are Students’ Emotions Associated with the Accuracy of Their Note Taking and Summarizing During Learning with ITSs. In International Conference on Intelligent Tutoring Systems; Nkambou, R., Azevedo, R., Vassileva, J., Eds.; Springer Cham: Montreal, QC, Canada, 2018; pp. 233–242. [Google Scholar] [CrossRef]
  86. Carneiro, R.; Simao, A.M.V. Technology Enhanced Environments for Self-Regulated Learning in Teaching Practices. In Self-Regulated Learning in Technology Enhanced Learning Environments, 1st ed.; Brill Sense: Leiden, The Netherlands, 2011; pp. 75–101. ISBN 978-94-6091-654-0. [Google Scholar]
  87. Bellhäuser, H.; Lösch, T.; Winter, C.; Schmitz, B. Applying a web-based training to foster self-regulated learning—Effects of an intervention for large numbers of participants. Internet High. Educ. 2016, 31, 87–100. [Google Scholar] [CrossRef]
  88. Engelmann, K.; Bannert, M. Analyzing temporal data for understanding the learning process induced by metacognitive prompts. Learn. Instr. 2019, 101205. [Google Scholar] [CrossRef]
  89. Jansen, R.S.; van Leeuwen, A.; Janssen, J.; Conijn, R.; Kester, L. Supporting learners’ self-regulated learning in Massive Open Online Courses. Comp. Educ. 2020, 146, 103771. [Google Scholar] [CrossRef]
  90. Imbriale, W.J. Just Text Me: A Self-Regulated Learning Intervention for College Students. Ph.D. Thesis, Michigan State University, East Lansing, MI, USA, 2020. [Google Scholar]
  91. Munshi, A.; Biswas, G. Personalization in OELEs: Developing a Data-Driven Framework to Model and Scaffold SRL Processes. In Proceedings of the International Conference on Artificial Intelligence in Education, Montreal, QC, Canada; Isotani, S., Millán, E., Ogan, A., Hastings, P., McLaren, B., Luckin, R., Eds.; Springer: Cham, Switzerland, 2019; pp. 354–358. [Google Scholar] [CrossRef]
  92. Rosário, P.; Núñez, J.C.; Rodríguez, C.; Cerezo, R.; Fernández, E.; Tuero, E.; Högemann, J. Analysis of instructional programs in different academic levels for improving self-regulated learning (SRL) through written text. In Design Principles for Teaching Effective Writing, 1st ed.; Fidalgo, K., Harris, M., Braaksma, R., Eds.; Brill: Leiden, The Netherlands, 2017; pp. 201–231. [Google Scholar] [CrossRef]
  93. Su, J.M. A rule-based self-regulated learning assistance scheme to facilitate personalized learning with adaptive scaffoldings: A case study for learning computer software. Comp. Appl. Eng. Educ. 2020, 28, 536–555. [Google Scholar] [CrossRef]
  94. Steele, M.M. High school students with learning disabilities: Mathematics instruction, study skills, and high stakes tests. Am. Second. Educ. 2010, 21–27. [Google Scholar]
  95. Reaser, A.; Prevatt, F.; Petscher, Y.; Proctor, B. The learning and study strategies of college students with ADHD. Psychol. Schools 2007, 44, 627–638. [Google Scholar] [CrossRef]
  96. Chatzara, K.; Karagiannidis, C.; Stamatis, D. Cognitive support embedded in self-regulated e-learning systems for students with special learning needs. Educ. Inf. Technol. 2016, 21, 283–299. [Google Scholar] [CrossRef]
  97. Bannert, M.; Reimann, P. Supporting self-regulated hypermedia learning through prompts. Instr. Sci. 2012, 40, 193–211. [Google Scholar] [CrossRef]
  98. Moos, D.C.; Bonde, C. Flipping the classroom: Embedding self-regulated learning prompts in videos. Technol. Knowl. Learn. 2016, 21, 225–242. [Google Scholar] [CrossRef]
  99. Núñez, J.C.; Cerezo, R.; Bernardo, A.; Rosário, P.; Valle, A.; Fernández, E.; Suárez, N. Implementation of training programs in self-regulated learning strategies in Moodle format: Results of an experience in higher education. Psicothema 2011, 23, 274–281. [Google Scholar]
  100. Bannert, M.; Sonnenberg, C.; Mengelkamp, C.; Pieger, E. Short-and long-term effects of students’ self-directed metacognitive prompts on navigation behavior and learning performance. Comp. Human Behav. 2015, 52, 293–306. [Google Scholar] [CrossRef]
  101. Cromby, J.J.; Standen, P.J.; Brown, D.J. The potentials of virtual environments in the education and training of people with learning disabilities. J. Intellect. Disabil. Res. 1996, 40, 489–501. [Google Scholar] [CrossRef] [PubMed]
  102. Erickson, M.J.; Larwin, K.H. The Potential Impact of Online/Distance Education for Students with Disabilities in Higher Education. Int. J. Eval. Res. Educ. 2016, 5, 76–81. [Google Scholar]
Table 1. Description and examples of the variables studied.

| Strategy | Description | Example |
| --- | --- | --- |
| Reading | Visiting or reading a page of the hypermedia content. | Open a particular page for more than 15 s. |
| Re-reading | Re-reading or revisiting a page in the hypermedia environment. | Open page 43 twice. |
| Planning | Stating learning goals. | At the beginning of the learning session, once the student knows the general objective, the student sets two learning goals. |
| Prior knowledge activation | Searching memory for relevant prior knowledge, either before beginning the task or during task performance. | The student opens a page and, prior to reading, writes down everything he already knows about the topic on that page. |
| Note-taking | Writing down information about a particular page or section of the hypermedia content. | While studying the parts of the circulatory system, the student takes notes from this particular page. |
| Summarization | Verbally restating a synopsis of what was just read, inspected, or heard in the hypermedia environment. | After spending time reading the page about the role of the heart in the circulatory system, the user summarizes the reading. |
| Content evaluation | Stating that a just-seen text, diagram, or video is either relevant or irrelevant to the active learning goal the student is pursuing. | While reading a page about the parts of the circulatory system, the student states whether the current text is appropriate for their current subgoal (malfunctions and illnesses related to the circulatory system). |
| Coordination of informational resources | Coordinating multiple representations by consulting the diagram corresponding to the text the student is reading. | Spend time studying the heart and open the associated image. |
| Inferences | Enunciating a conclusion based on two or more pieces of information read in the learning environment. | After reading about illnesses of the circulatory system, the student concludes that a heart murmur can be fixed. |
| Judgement of learning | Showing, through the learning environment commands, that there is (or is not) an understanding of what was read or seen. | After spending some time on the page about the heartbeat, the student states whether they have learned that content yet. |
| Feeling of knowledge | Showing, through the learning environment commands, an awareness of having (or not having) read or learned something in the past and having some understanding of it. | The student opens the page about the heartbeat and, after a first reading, states whether he already knows that content. |
| Monitoring towards goals | Stating, through the learning environment commands, whether a previously set goal has been achieved. | After spending some time reading the pages related to a particular subgoal, the student assesses whether the current subgoal has been achieved. |
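The strategies in Table 1 are operationalized from trace-log events recorded by the CBLE. A minimal sketch of what such event-to-strategy coding could look like, assuming a hypothetical log format (the `kind`, `page`, and `duration_s` fields, event names, and the 15 s threshold for Reading are illustrative, not the authors' actual instrumentation):

```python
from collections import Counter

def classify_event(event, seen_pages):
    """Map one hypothetical trace-log event to a Table 1 strategy label (or None)."""
    kind = event["kind"]
    if kind == "page_view":
        page = event["page"]
        label = None
        # Table 1: a page view counts as Reading only beyond 15 s;
        # a qualifying view of an already-visited page is Re-reading.
        if event["duration_s"] > 15:
            label = "Re-reading" if page in seen_pages else "Reading"
        seen_pages.add(page)
        return label
    if kind == "set_goal":
        return "Planning"
    if kind == "take_notes":
        return "Note-taking"
    if kind == "summarize":
        return "Summarization"
    if kind == "open_diagram":
        return "Coordination of informational resources"
    return None  # event types not covered by this sketch

# Usage: count strategy occurrences over a (made-up) session log.
log = [
    {"kind": "page_view", "page": 43, "duration_s": 40},
    {"kind": "page_view", "page": 43, "duration_s": 20},
    {"kind": "set_goal"},
    {"kind": "take_notes"},
]
seen = set()
counts = Counter(lbl for e in log if (lbl := classify_event(e, seen)))
print(counts)
```

Frequency counts of this kind are what feed the per-group descriptives reported in Table 2.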
Table 2. Descriptive statistics for treatment combinations under the four continuous variables of self-regulated learning (SRL) strategies. SLD—students with learning disabilities; SNLD—students with no learning disabilities. Values are Mean (SD).

| Variable | SLD Control | SLD Experimental | SNLD Control | SNLD Experimental |
| --- | --- | --- | --- | --- |
| Self-initiated surface SRL strategies | 50.01 (61.25) | 48.80 (48.85) | 48.82 (34.04) | 29.84 (21.38) |
| Agent-initiated surface SRL strategies | 0.00 (0.00) | 1.40 (1.95) | 0.00 (0.00) | 3.57 (2.25) |
| Self-initiated deep SRL strategies | 23.00 (17.58) | 27.00 (9.35) | 34.38 (16.11) | 43.72 (19.89) |
| Agent-initiated deep SRL strategies | 1.25 (0.95) | 27.20 (13.36) | 1.28 (1.52) | 26.02 (8.10) |
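To give a sense of the magnitude of the group differences in Table 2, a standardized mean difference can be computed from the reported descriptives. This is illustrative only: group sizes are not given in the table, so equal n is assumed, and this is not an effect size reported by the authors.

```python
import math

def cohens_d(m1, sd1, m2, sd2):
    """Cohen's d with a pooled SD, assuming equal group sizes."""
    pooled = math.sqrt((sd1 ** 2 + sd2 ** 2) / 2)
    return (m2 - m1) / pooled

# SLD control vs. experimental, agent-initiated deep SRL strategies (Table 2).
d = cohens_d(1.25, 0.95, 27.20, 13.36)
print(round(d, 2))  # 2.74
```

A d of this size is consistent with the large treatment effects on agent-initiated strategies reported in Tables 4 and 5.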
Table 3. Descriptive statistics for four continuous variables of use of SRL strategies.

| Variable | N | Mean | SD | Skewness | Kurtosis |
| --- | --- | --- | --- | --- | --- |
| Self-initiated surface SRL strategies | 119 | 44.2521 | 30.5975 | 1.4341 | 2.1131 |
| Agent-initiated surface SRL strategies | 119 | 1.6807 | 2.3431 | 1.2773 | 0.8721 |
| Self-initiated deep SRL strategies | 119 | 37.9243 | 18.8630 | 0.9108 | 1.7237 |
| Agent-initiated deep SRL strategies | 119 | 13.5966 | 13.8633 | 0.4724 | −1.2947 |
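A sketch of how descriptives of the kind shown in Table 3 are typically computed, assuming the usual conventions: sample SD (n − 1 denominator) and moment-based skewness/kurtosis. Whether the table reports raw or excess kurtosis is not stated, so the sketch returns the excess form (normal distribution = 0); the data below are made up.

```python
import math

def describe(xs):
    """N, mean, sample SD, and moment-based skewness and excess kurtosis."""
    n = len(xs)
    mean = sum(xs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))  # sample SD
    m2 = sum((x - mean) ** 2 for x in xs) / n                   # central moments
    m3 = sum((x - mean) ** 3 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    skew = m3 / m2 ** 1.5          # g1 (biased moment estimator)
    kurt = m4 / m2 ** 2            # raw kurtosis (normal = 3)
    return {"N": n, "Mean": mean, "SD": sd,
            "Skewness": skew, "Excess kurtosis": kurt - 3}

stats = describe([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
print(round(stats["Mean"], 2), round(stats["SD"], 2))  # 5.0 2.14
```

The positive skewness and heavy tails of the strategy counts are one reason the robust modified Brown–Forsythe procedure is used in Tables 4 and 5 rather than a classical ANOVA.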
Table 4. Results of multivariate modified Brown–Forsythe (MBF) analysis for continuous variables of SRL strategies.

| Effect | dfN | dfD | F-Value | Pr > F | Wilks’s Λ | Pr > F | ESM |
| --- | --- | --- | --- | --- | --- | --- | --- |
| LD | 4 | 5.213 | 4.767 | 0.0551 | 0.214 | 0.0551 | 0.757 |
| Treatment | 4 | 5.213 | 11.461 | 0.0087 | 0.102 | 0.0087 | 0.884 |
| LD × Treatment | 4 | 5.213 | 3.524 | 0.0956 | 0.270 | 0.0956 | — |

Note: LD—learning difficulties; dfN—numerator degrees of freedom; dfD—denominator degrees of freedom; ESM—multivariate effect size.
Table 5. Results of MBF analysis for each of the continuous variables related to use of SRL strategies.

| Effect | dfN | dfD | F-Value | Wilks’s Λ | Pr > F | ESU |
| --- | --- | --- | --- | --- | --- | --- |
| *Self-initiated surface SRL strategies* | | | | | | |
| LD | 1 | 5.706 | 0.046 | 0.992 | 0.8367 | — |
| Treatment | 1 | 5.706 | 0.142 | 0.976 | 0.7191 | — |
| *Agent-initiated surface SRL strategies* | | | | | | |
| LD | 1 | 5.495 | 7.620 | 0.414 | 0.0348 | 0.567 |
| Treatment | 1 | 5.495 | 28.873 | 0.159 | 0.0022 | 0.833 |
| *Self-initiated deep SRL strategies* | | | | | | |
| LD | 1 | 5.560 | 7.371 | 0.429 | 0.0376 | 0.551 |
| Treatment | 1 | 5.560 | 1.663 | 0.770 | 0.2481 | — |
| *Agent-initiated deep SRL strategies* | | | | | | |
| LD | 1 | 4.334 | 0.035 | 0.996 | 0.8595 | — |
| Treatment | 1 | 4.334 | 69.00 | 0.059 | 0.0008 | 0.938 |

Note: dfN—numerator degrees of freedom; dfD—denominator degrees of freedom; ESU—univariate effect size.
Table 6. Results of the generalized linear model (GLM) analysis for the proportion data as the dependent variable.

| Effect | dfN | dfD | F | Pr > F |
| --- | --- | --- | --- | --- |
| LD | 1 | 4.412 | 0.19 | 0.6813 |
| Treatment | 1 | 4.412 | 0.88 | 0.3955 |
| LD × Treatment | 1 | 4.412 | 0.20 | 0.6869 |
Cerezo, R.; Esteban, M.; Vallejo, G.; Sanchez-Santillan, M.; Nuñez, J.C. Differential Efficacy of an Intelligent Tutoring System for University Students: A Case Study with Learning Disabilities. Sustainability 2020, 12, 9184. https://0-doi-org.brum.beds.ac.uk/10.3390/su12219184
