Review

Student Engagement with Technology-Enhanced Resources in Mathematics in Higher Education: A Review

by Caitríona Ní Shé 1, Eabhnat Ní Fhloinn 2,* and Ciarán Mac an Bhaird 3

1 Academic Practice, Trinity College Dublin, Dublin, Ireland
2 School of Mathematical Sciences, Dublin City University, Dublin, Ireland
3 Department of Mathematics and Statistics, Maynooth University, Maynooth, Ireland
* Author to whom correspondence should be addressed.
Submission received: 30 November 2022 / Revised: 26 January 2023 / Accepted: 31 January 2023 / Published: 3 February 2023
(This article belongs to the Special Issue Mathematics and Its Applications in Science and Engineering II)

Abstract

Neither the effectiveness of technology-enhanced resources in mathematics in higher education nor the extent of student engagement with such resources is well understood. In this review article, we investigate the existing literature in three interrelated areas: student engagement with technology in higher education and mathematics; what works and what does not in technology in education and in mathematics in higher education; and evaluating the use of technology in higher education and mathematics, including the use of frameworks and models. Over 300 research articles were identified for this purpose and the results are reported in this review. We found a dearth of studies in undergraduate mathematics education that specifically focus on student engagement with technology. In addition, there is no overarching framework that describes both the pedagogical aspects and the educational context of technology integration in mathematics.

1. Introduction

In higher education, multimedia information can be sourced effortlessly by students. However, despite the pervasiveness of digital technologies in modern society, both the level of student engagement with technology-enhanced resources and the effectiveness of those resources are far from clear [1,2].
Student engagement in higher education is known to be a predictor of successful retention and programme completion [3,4,5], and to be influenced by factors such as the provision of effective resources and supports [6,7,8,9]. It is therefore important to evaluate the effectiveness of any resources put in place in terms of student engagement.
To facilitate the effective use of technology, it is essential to determine which technology implementations work best and why [10]: that is, the “decisive factors” that advance the use of technology-enhanced resources [11]. One barrier to establishing these factors is the lack of frameworks that can be used to evaluate the use of technology-enhanced resources [12]. The importance of implementing appropriate pedagogical practices when using technology to support learning in mathematics education has long been established [13,14,15]. However, studies reporting on the use of technology do not necessarily examine the effects that particular pedagogical practices have on student engagement with the technology [11,16].
In order to investigate students’ engagement with technology-enhanced resources in mathematics in higher education, we sought to identify research which specifically considers how and in what ways students engage with technology, and the factors that influence student engagement. These considerations led to the identification of the following three main interrelated research areas (RAs) in the literature:
  • RA1: student engagement with technology in higher education (and mathematics).
  • RA2: technology in education and in mathematics in higher education: what works and what does not.
  • RA3: evaluating the use of technology in higher education (and mathematics) and the use of frameworks and models.
Figure 1 shows a conceptual mapping of the three RAs under investigation in this literature review to highlight the overlap between the areas and how they complement each other to provide an overall insight into student engagement with technology in undergraduate mathematics. It displays the number of articles in each RA that were considered in this literature review.
The main focus of this paper is the relevant literature that we identified in these three interrelated areas, with RA1, RA2, and RA3 described in Section 3, Section 4 and Section 5, respectively. Beforehand, in Section 2, we outline the approach used to conduct the literature review (for example, the review type, how papers were identified, and the inclusion and exclusion criteria used) and note some limitations. We close, in Section 6, with conclusions which reconsider our findings in the three research areas and the overall contributions of this work.

2. Materials and Methods

The role of a review article is multifaceted. In the first instance, it allows researchers to build an account of the research that has been carried out in the area. This account serves to delimit the research field, identify new areas of research, and support the originality and contribution of future work [17,18,19]. Second, it serves to inform researchers of the theoretical frameworks and research methods that are used in their particular field [17,18,19]. Finally, the review enables researchers to identify the important research, the seminal articles, and the influential researchers in the area [17,18,19].
Because of the multifaceted nature of the review article, it is important that the scope and objectives are well-defined. Effective reviews should adhere to a number of criteria such as those outlined in Cooper’s taxonomy of literature reviews (as cited in [19] (p. 3)), and Boote and Beile’s literature review scoring rubric [17]. The criteria to consider are: the focus and goal of the review, the basis for document inclusion and exclusion, the timeframe reviewed, the sources of the material, and the prospective audience [19]. It is also important to acknowledge the implications of the Hawthorne and novelty effects when gathering and reviewing research studies [20,21]. When identifying sources, we therefore sought out studies that attempt to minimise these effects, for example by using multi-method and multi-measurement research designs [20]. Similarly, publication bias can impact the availability of studies that record no, or detrimental, effects of education interventions [22]. Therefore, articles that reported such outcomes were specifically included in the study.
There are many different forms of review such as narrative, traditional, scoping, methodological, and systematic [23,24,25]. One particular type of narrative review, called a general literature review, is often used for the introduction to a dissertation [25]. This type of review provides the means to analyse relevant and significant aspects of prior research and to identify the gaps that require further research. The general literature review form was used for this paper and it involves “synthesizing primary studies and exploring heterogeneity descriptively, rather than statistically” [26] (p. 19). This process resulted in a body of over 300 articles. Details of the scoping of the literature search and the selection of articles are outlined below.
The initial selection criteria, as outlined in Table 1, and the search terms, outlined in Table 2, were agreed by the three authors in order to identify the literature required to address RA1, RA2, and RA3. This was an iterative process, whereby after an initial scoping by Author 1, all three authors considered the criteria and subsequently agreed upon the final set [23,24,25]. The initial searches were performed, and the inclusion/exclusion criteria applied, by Author 1. Subsequently, all three authors conferred on the final selection process. For example, all three authors scanned 10 of the same articles found in relation to RA1, looking for articles where the concept, definition, and measurement of student engagement were present, or that examined the effect of student engagement with technology. Subsequently, we conferred on our selections and, once agreed, Author 1 continued and completed the selection process, conferring with Author 2 and Author 3 when there was any doubt.
For RA1, there were 45 articles identified that investigated student engagement with technology (see Table 2), 14 of which related to mathematics learning. For RA2, while there is a significant body of research available on the use of technology in school mathematics, there is a lack of such studies focusing on mathematics in higher education [27,28]. Thus, a body of literature relating to both secondary and higher education was built up. In addition, seminal articles on technology use in higher education were consulted. This resulted in the review of 61 articles (see Table 2). For RA3, the review completed for RA2 was extended to include the terms shown in Table 2. This body of literature and those listed below in Section 4 were examined to address the objective of investigating how the effectiveness of resources has been evaluated.

3. Student Engagement with Technology in Higher Education (and Mathematics)

Over the last twenty years, higher education institutions have increasingly focused their attention on student engagement as an indicator of the quality of their educational offerings [5,29]. This is unsurprising as many studies have shown that student engagement influences student success [3,5,7,16,30,31]. In addition, the use of digital technologies has become more pervasive in society and in education [7,30,32]. Therefore, there is a growing interest in how the use of technology in higher education impacts on student engagement [7,30,33,34,35]. However, specific research into student engagement and technology use is sparse: Schindler et al. found no systematic reviews that considered the association between the two concepts [30]. Many researchers have stated that studies in student engagement are difficult to identify because the construct of engagement is so loosely defined [4,16,36,37]. Further, Trowler explained that studies investigating concepts such as student feedback and approaches to learning were in fact examining engagement, without having identified student engagement as a construct in their investigations [5] (p. 3).
As the main aim of this section is to examine student engagement with technology, studies that examine technology use (and explicitly refer to and define engagement) are reviewed. Studies into the use of technology in undergraduate mathematics education that do not reference engagement are examined in subsequent sections. The following research questions were formulated:
  • What is meant by student engagement with technology and why is it important?
  • In what way(s) has student engagement with technology been measured?
  • What are the factors of implementations that encourage/discourage student engagement with technology?
These questions were then used to examine the body of literature and are answered in the following three subsections.

3.1. What Is Meant by Student Engagement with Technology and Why Is It Important?

While many educational studies have reported on student engagement, there is no single definition of the term to be found in the literature. Despite this, there is general agreement in the education research literature that the current understanding of the concept of student engagement stems from Astin’s work on student development theory [38] and Fredricks et al.’s seminal paper on school engagement [5,8,30,37,39]. Fredricks et al. recognised that a focus on student engagement offered a remedy for the problem of poor academic motivation and success that was prevalent in schools in the USA [37] (p. 59). In their article, Fredricks et al. acknowledged the difficulty in synthesising research literature on student engagement:
Because there has been considerable research on how students behave, feel, and think, the attempt to conceptualize and examine portions of the literature under the label “engagement” is potentially problematic; it can result in a proliferation of constructs, definitions, and measures of concepts that differ slightly, thereby doing little to improve conceptual clarity [37] (p. 60).
Nonetheless, they found that the literature was focused on constructs which relate to one or other of three types of engagement: behavioural, emotional, and cognitive. Some researchers refer to emotional engagement as affective engagement, with reference to the psychological approach to emotions [8] (p. 761). Fredricks et al. collated and discussed the following definitions from the literature:
  • Behavioural engagement is generally defined in three ways: positive conduct (following rules and guidelines), involvement in learning tasks (effort and persistence), and participation in school-related activities.
  • Emotional engagement refers to students’ affective responses in the classroom such as being bored, sad, anxious, etc., but also students’ sense of belonging.
  • Cognitive engagement is derived from an investment in learning and self-regulation and being strategic when learning [37] (pp. 62–63).
There have been a number of suggestions for further dimensions of engagement, such as agentic and social engagement. Agentic engagement is described as students’ positive input into how their instruction advances [40]. Sinatra et al. described agentic engagement as students’ proactive involvement in their learning environment, whereas the other three engagement dimensions are reactions to the learning environment [31] (p. 3). The final dimension suggested, social engagement, takes into account the increasing role that peer and collaborative learning play in education [3].
In higher education, student engagement has been examined by a number of key authors, many of whom go beyond a definition in terms of dimensions and take a more holistic view that includes engagement’s antecedents and outcomes [5,8,9,29,41,42]. The view that student engagement can be defined in terms of the interaction of influencing factors which produce a number of outcomes has gained a consensus in the literature [5,8,16,36,37]. Reflecting on the National Survey of Student Engagement (NSSE), implemented in universities and colleges in Canada and the USA, Kuh defined engagement as:
‘The time and energy students devote to educationally sound activities inside and outside of the classroom, and the policies and practices that institutions use to induce students to take part in these activities.’ [29] (p. 25).
Similarly, in reviews of the literature on engagement and technology, authors have highlighted the lack of a definition of student engagement with technology [16,30,43,44]. In their review of the literature on student engagement in online environments [44], Yang et al. found that only 16 of the 40 studies contained a definition of engagement; these mainly referred to the Fredricks et al. definition [37]. Many of the studies that examine technology and engagement refer to the early work of O’Brien and Toms on analysing user engagement in the context of using a number of web applications [45]. Similar to the holistic view of educational engagement, they proposed that engagement is both a process and a product and that there are certain attributes of a system that influence a user’s engagement with that system [45]. This view is reflected in the definition of engagement, in the context of educational technology, provided by Bond et al.:
Student engagement is the energy and effort that students employ within their learning community, observable via any number of behavioural, cognitive or affective indicators across a continuum. It is shaped by a range of structural and internal influences, including the complex interplay of relationships, learning activities and the learning environment [36] (p. 3).
In this context, it is within the learning activities and environment that the technology with which students engage resides.
The use of engagement as a window into mathematical learning is also growing [46,47,48,49,50,51]. Many of the mathematics education research studies that can be classified as reporting on student engagement focus on cognitive engagement [51]. One of the early studies on engagement in a mathematical classroom defined engagement as ‘the deliberate task-specific thinking that a student undertakes while participating in a classroom activity’ [52] (p. 136). In their study on the effect recorded video lectures had on student engagement, Trenholm et al. used the Skilling et al. definition of engagement: ‘the extent to which students seek deep meaning and understanding as well as the cognitive strategies students use to self-regulate their learning’ [51] (p. 6). Pierce et al. drew the three dimensions of engagement together to focus their attention on cognition while exploring early teenagers’ engagement with a mathematical analysis tool [49]. They examined how ‘students feel about the subject (…affective engagement, AE) and how they behave in learning the subject (… behavioural engagement, BE)’ within a cognitive realm [49] (p. 292).
Students’ views on what constitutes engagement have also been investigated [53]. Students mainly reported engagement in behavioural terms, though a few students referred to the cognitive aspects of engagement. A strong theme that emerged was the association of engagement with the importance of applying theory into practice: ‘Engagement in learning is when you can take the theory and apply it in practice’ [53] (p. 1080).
Student engagement is important not least because it has been linked to academic success. Fredricks et al. claim that all three dimensions of engagement have been shown to impact on student success [37] (pp. 70–71). In an extensive review of published research on engagement, Trowler refers to a number of studies that found that students’ time and effort (or behavioural engagement) impact on their learning, and that ‘observed effects of engagement’ include cognitive development, student satisfaction, and influence on students’ grades [5] (pp. 33–35). Schindler et al. concluded that the use of technology can impact student engagement and emphasised the importance of the effective use of technologies [30]. Henderson et al. suggested that a focus on student engagement can help find which digital technologies work best for students [32]. Student patterns of engagement can be used to examine learning strategies ‘that can be used to inform teaching practice, support interventions, and course learning design’ [54] (p. 59). Bond et al. highlighted the importance of situating individual studies in an overall framework of engagement in order to be able to ‘integrate research findings into practice’ [36] (p. 21).
Within mathematics education research, there is also evidence to suggest that student engagement and the use of technology impacts on learning [3,31,51,52]. Studies on the use of specific technologies in mathematics education have highlighted benefits of student engagement with technologies such as mobile apps [46]; innovative digital tools, i.e., GeoGebra and Desmos [55]; tablets and screencasts [56]; flipped classroom [50]; and online environments [57]. In addition, engagement in mathematics and science has been shown to foster long term participation in STEM [3] (p. 5).
While there is a growing body of research available on the impact of technology on student engagement, there is a degree of uncertainty as to what is meant by student engagement with technology. Student engagement has been shown to be an important construct to measure as it impacts on student success. In the next subsection, student engagement measures used in research studies are examined, which will further illuminate the student engagement concept.

3.2. In What Ways Has Student Engagement with Technology Been Measured?

The complexity around establishing a definition of student engagement means that measuring engagement varies considerably from study to study [31,58]. According to Trowler, the USA and Australia traditionally report on engagement from a different perspective than the UK [5]. In the USA and Australia, research on engagement is often based on outcomes of large-scale student surveys, whereas in the UK, research is rooted in small-scale studies that examine the effects of the particular tools, techniques, and approaches used in teaching [5] (p. 3). These large-scale student surveys, such as the NSSE in the USA and Canada, are generally used to gauge a broad range of engagement indicators, consistent with the view on engagement held by many researchers: that the wider social and institutional interactions and experiences are important components of a holistic approach to engagement [4,5,8,29,36,53]. Indeed, in their seminal work on the characterisation of the dimensions of student engagement, Fredricks et al. referred to both engagement antecedents, such as community culture and educational context [37] (p. 73), and the outcomes of engagement, such as academic achievement [37] (p. 70).
As Trowler said, ‘studies tend to measure that which is measurable’ [5] (p. 17). Within the context of technological interventions, it is the impact on student engagement of the use of technology within the learning environment that is often being measured [36]. In order to understand what exactly is being measured, it is important to focus on how student engagement has been operationalised in research studies on engagement [16]. Henrie et al. and Schindler et al. [30,59], when analysing the literature they had reviewed, did so in terms of the behavioural, emotional, and cognitive indicators of engagement as defined by Fredricks et al. [37]. Likewise, Bond and Bedenlier drew up a table with engagement indicators for each of these dimensions in order to frame their model of student engagement with technology [7] (p. 3). Cognitive engagement indicators include aspects of students’ beliefs about, and attitudes to, learning; behavioural engagement indicators encompass measures such as the time and effort students spend on learning activities; and finally, emotional indicators consist of students’ perceptions of their happiness in relation to their learning and the support they receive towards learning [16] (p. 41). Both Henrie et al. and Bond and Bedenlier found that research studies focus mainly on the behavioural aspects of engagement, with only a few studies considering either affective (emotional) or cognitive engagement [7,59].
It has been suggested that the use of scales has been effective in measuring the emotional and cognitive effects of engagement that cannot be observed [16,60]. Henrie et al. found that over 60% of the articles they reviewed used a scale or questionnaire to elicit student or teacher perceptions of engagement [59]. However, in line with the difficulty of having a single definition of student engagement, there were 14 different named scales identified in this review [59]. Scales that attempt to measure the broad concept of engagement were found as well as scales that measured a single dimension of engagement [16] (p. 45). One such scale that focusses on the emotional (or affective) impact of technology in the context of secondary school children’s mathematical learning is the Mathematics and Technology Attitude Scale (MTAS) developed by Pierce et al. [49]. Likert scales draw on such indicators, as discussed in the previous paragraph, to help frame the items in the questionnaires [3,41,42,49]. For example, while investigating the use of innovative technologies in undergraduate mathematics, Thomas et al. used engagement subscales that measured: ‘attitude to maths ability; confidence with technology; attitude to instrumental genesis of technology (learning how to use it); attitude to learning mathematics with technology; and attitude to versatile use of technology’ [55] (p. 116).
Observational methods of estimating student engagement are also found in the literature and vary from notes taken by observers to log data, video and screen recordings of students’ use of the technology under investigation, the number of posts made to messaging boards, and the time on task [16,44,48,58,61]. The use of log data is generally facilitated through the technologies that students use and is often used to measure behavioural engagement indicators such as: the number of clicks on a resource; activity data relating to multiple choice questions; system features used; and the time spent on a task [6,16,44,48,51,62,63]. When using observational data, engagement is often operationalised in terms of verbal utterances such as ‘I am really into this’ [47] (p. 50), or communication of thinking through questions and explanations [52] (p. 136). In the Thomas et al. study on the use of a variety of technologies offered to students, observational notes were used to identify which technology was in use, how it was being used, and who within the group was using it [55]. The advantage of such measures is that they report on engagement as it is happening, rather than relying on self-report measures after the engagement has occurred [16]. Using computer-generated logs also guards against the risk that other, more obtrusive observational measures may themselves affect students’ actual engagement [58] (p. 441). However, one of the problems with the use of observational data is the lack of a clear connection between what is being observed and the inferences subsequently drawn about student engagement [30,31].
Other measures of engagement identified by Henrie et al. include interviews, open-ended surveys, academic performance, and the use of physical sensors [16] (p. 44). While many researchers argue that there is a direct link between engagement and academic performance, academic performance is most often used along with other measures, such as rating scales and interviews [46,51,62,64,65,66,67]. When examining students’ use of mobile applications for mathematics, Fabian et al. used pre- and post-tests, a 20-item usability scale, and interviews [46]. Interview data can be useful for inductive analysis, where the nature of student engagement is not predefined [16] (p. 46).
Some of the studies used clearly defined theoretical frameworks to investigate student engagement, such as the use of flow theory when considering gaming in education [41,44,45,58,62,64], and instrumental orchestration to examine students’ cognitive engagement with technology in first-year undergraduate mathematics modules [48,55]. In secondary mathematics education, Attard and Holmes focussed on the pedagogical practices of teachers, in terms of their relationships with students and technology and their repertoire of technological tasks, when defining a Framework for Engagement with Mathematics (FEM) [67]. These types of frameworks are considered in more detail later in this paper.
There are difficulties associated with measuring engagement, particularly given the lack of consistent definitions and indicators of engagement. Many educators use variables that are not necessarily true indicators of engagement, but perhaps influence engagement [30] (p. 5). In their paper on the challenges associated with measuring engagement in science, Sinatra et al. highlighted the following challenges: construct definition, grain size of measurement, individual and developmental differences of students, problems with using a single method, the challenge of observing without disturbing the engagement, and problems pinpointing the source of engagement [31] (p. 7). They concluded that ‘researchers should take care to ensure that construct definition drives their choice of measures rather than the selection of measurement determining how engagement is conceptualized in the research’ [31] (p. 7).
Small-scale investigations on student engagement tend to examine factors such as students’ and teachers’ rating of a particular intervention being investigated [5]. To date, a number of factors of technology implementations that impact on this engagement have been found. These are discussed in the next subsection.

3.3. What Are the Factors of Implementations That Encourage/Discourage Student Engagement with Technology?

There are a number of models of student engagement, discussed in the literature, that consider the factors that influence engagement in the overall context of education. One of the most cited is that of Kahu, which was more recently refined by Kahu and Nelson [4,8]. This model maps student engagement within a sociocultural context and contains three main elements: influencing factors; engagement dimensions and their indicators; and a number of short- and long-term outcomes. This model is reproduced in Figure 2 below [4] (p. 64).
This so-called triangle of engagement (influences, contexts, and outcomes) is often found in research on student engagement, though not always as explicitly as in this model [44]. This perspective is in line with the holistic view of engagement taken in the higher education sector [5,8,16,36]. In order to determine the influences and outcomes of technology on student engagement, Bond and Bedenlier drew on the work of Kahu and others to adapt the Bronfenbrenner and Ceci bioecological model (as cited in [7] (p. 4), [8]). In this model, factors affecting student engagement are considered at a number of levels: the macrosystem level contains factors such as the digitisation of education through national policies; at the exosystem level, institutional factors on the use of technology in education are considered; the impact of students’ social and economic background on engagement is considered at the mesosystem level; and finally, the microsystem level contains the more immediate influencing factors such as teachers, peers, and educational technologies [7]. Bond and Bedenlier identified a number of the microsystem-level influences, such as the individual students’ and teachers’ acceptance of, and skills in using, technology; the usability and design of the technology-enhanced activities within the curriculum; and the influence of factors such as technical support, the usability of the technology, and the assessment of the learning environment [7].
Many of the influencing factors outlined by Bond and Bedenlier [7] have already been identified in a number of the studies on the use of technology to support student engagement in higher education [30,34,44,53,63,68] and in mathematics education [33,46,50,55,57,69]. Table 3 outlines the factors that impact on student engagement as found in studies that are relevant to undergraduate mathematics.
The nature of self-paced learning, a focus on assessment, and teachers’ use of the technologies are identified as factors that contribute towards student engagement. This view is somewhat consistent with the general mathematics education literature [8,33]. The effective pedagogical use of technology, in the form of appropriate mathematical tasks, has been highlighted as a means to encourage cognitive engagement and develop mathematics learning [33,47,52,67]. Helme and Clarke identified the following influencing factors on primary school students’ cognitive engagement: the classroom environment, the individual, and the mathematical tasks [52].
In addition, Table 3 highlights the variety of measures and indicators used when considering student engagement with mathematics education technology.
While a number of factors that impact student engagement with technology within higher education mathematics have been identified, many of these are outcomes from small-scale studies that do not apply an overarching model of student engagement. In order to effectively use technology to support student engagement with mathematics in higher education, research studies need to be examined under a clearly defined lens of student engagement [7].

3.4. Discussion on Student Engagement

The importance of student engagement in higher education has been well researched and there are many models outlining the influencing factors on, and resultant outcomes of, engagement. Even though there is no single definition, and many studies do not necessarily provide a definition of engagement, the literature tends to focus on the three dimensions of engagement as defined by Fredricks et al.: cognitive, behavioural, and emotional [37]. While Bond et al. acknowledged that definitions may, by necessity, vary from one project to the next, they highlighted the importance of providing a definition [36]. Within the body of literature on student engagement with technology, a variety of methods are used to measure engagement, such as questionnaires or scales, observations, interviews, and logged data. Despite the fact that theoretical consideration has been given to the indicators of measuring engagement [7,16,37], there is often a lack of a clear connection between the measures being used in the studies and engagement indicators [30,31]. Additionally, studies often focus on only one of the three engagement dimensions: cognitive, behavioural, or emotional engagement. It has been shown that all three dimensions of student engagement are important as they each impact on students’ outcomes. It is important to identify these factors as student engagement is ‘malleable’; thus, targeted ‘interventions’ can be used to increase engagement and, hence, learning [3] (p. 5). By judiciously using technologies, lecturers can exercise some control over their students’ engagement [50].
While factors that encourage student engagement have been identified through the use of models [4,44], those factors that influence engagement with technology are less evident [7]. To address this issue, Bond and Bedenlier defined a model that proposed the influencing factors of technology on student engagement [7]. However, within mathematics education research, a limited number of small-scale studies were found that specifically investigated the intersection of engagement and technology: only five studies merited inclusion in Table 3. These studies identified factors that impact on engagement, such as the affordances of the technology, the pedagogy associated with the use of the tool, and the student’s goal in using the technology.
One of the limitations of this section of the review is that the focus on the intersection of engagement and technology in undergraduate mathematics education yielded few studies. As indicated by Trowler [5], there are many studies that investigate approaches to teaching and learning that are not flagged as engagement but may in fact measure some of the indicators of engagement. In the next section, a review of the literature on the use of technology in higher education mathematics is undertaken.

4. Technology in Education and in Mathematics in Higher Education: What Works and What Does Not

The use of technology in education, and in mathematics education, has been on the increase over the last few decades. This has been evidenced by the volume of literature available that examines how, and to what effect, technology has been used in higher education [32,70,71,72,73] and in mathematics education [27,33,55,74,75]. Some research suggests that student learning is positively impacted by technology use [32], and several studies have investigated the associated student experience [70]. However, there are many who argue that the benefits of technology as a teaching and learning resource within higher education have not been fully investigated or exploited [32,70,76,77,78]. In particular, the optimum approaches for technology implementations require further attention [32,70,77]. The effectiveness of using technology in both higher education and in mathematics education is under question [2,74,79,80]. A 2015 OECD report identified that an increase in the use of computers in mathematics in schools correlated negatively with student performance in mathematics [35]. While this has been echoed in other studies [14,33,81], there are counterclaims. Research reported by Ronau et al. claimed that the use of digital calculators and computer software does improve student understanding [81,82].
The following three research questions were used to examine this literature:
  • What is meant by technology-enhanced resources in undergraduate mathematics education?
  • What are the benefits of using technology-enhanced resources in first-year undergraduate mathematics modules?
  • What factors of the technology-enhanced resource implementations impacted on the associated benefits?
These questions are discussed in the following three subsections.

4.1. What Is Meant by Technology-Enhanced Resources in Undergraduate Mathematics Education?

The terms technology-enhanced resources and technology-enhanced learning are ill-defined in the literature. King et al. highlight that authors use different terminology to refer to educational technology, and thus it can be difficult to ensure that authors are discussing the same item [12]. In a review of the higher education literature relating to the use of technology for teaching and learning, Kirkwood and Price examined the types of activities that were considered ‘enhanced’ [83]. They found that technology was used in three ways: to mirror existing teaching, to add to current teaching practice, and to alter the student learning experience and/or teaching practices [83] (p. 11). These findings are similar to the benefits of technology-enhanced learning as outlined by the Higher Education Funding Council for England (HEFCE): efficiency, enhancement and transformation [84]. In Ireland, the National Forum for the Enhancement of Teaching and Learning in Higher Education (NF) conducted a survey of higher education teachers’ use of technology to support their teaching activities [85]. Participants in the survey rated classroom management activities, or efficiency, as the most important functions of technology [85]. For those who consider that the pedagogical potential of technology has not been fully leveraged in higher education, it is a cause for concern that the main perceived benefit of technology is to promote efficiencies [1,2,70,86]. These educational researchers call on higher education teachers to carefully consider how technology can be integrated into educational activities so that the student learning experience is altered for the better. One way to support the effective pedagogical use of technology is to put an emphasis on the instructional design processes when integrating technology resources [87,88,89]. Such instructional design principles incorporate many aspects of teaching and learning, including the need to identify learning objectives and the associated pedagogical practices that support students in achieving these objectives [88,90,91,92].
While the use of the term “technology-enhanced resource” is also ill-defined in mathematics education research literature, there has been considerable research on how technology resources influence learning in mathematics education [11,93,94,95]. In mathematics education research, a resource is defined as a tool that helps bring about mathematical understanding, as it allows interaction between mathematical objects and human thinking [95] (p. 2). This concept of a resource as a tool, where tools are often called instruments or artefacts, has long been discussed in the context of educational theories such as those put forward by Vygotsky and Leontiev [69,96]. The resultant work has been used in mathematics education research to develop theories on how these tools mediate learning, and thus enhance student understanding of mathematical concepts and enable new ways of working with and understanding mathematics [93,95,96,97,98,99]. In addition, some individual studies have focussed on how learning efficiencies, such as students working at their own time and pace, or students taking ownership of their learning, can be achieved using technology [51,69,94,100,101,102]. Finally, student satisfaction with using technology has been considered in terms of the use of technology to enhance the learning environment [102,103,104,105]. Within the literature on the use of technology in mathematics education, technology-enhanced resources can therefore be described as technology tools that are used to enhance, or better, the mathematical understanding, learning experience, and/or learning environment of students engaged in mathematics learning.
In the next subsection, the specific benefits of using technology in undergraduate mathematics, as found in the literature, are examined.

4.2. What Are the Benefits of Using Technology-Enhanced Resources in Undergraduate Mathematics Modules?

Bray and Tangney outline how technology use is seen by many educators as an avenue through which to tackle students’ mathematical understanding [74]. They continue by describing how the computational power, multiple visual representations, and diverse ways for students to engage with mathematics through technology are given as reasons for an increase in technology use [74] (p. 256). Educational researchers contend that the affordances of technology, defined as its prescribed, intended, or designed-for uses as well as its possible uses [106,107], need to be exploited for successful technology integration in education [108,109]. However, some educators argue that the term “affordance” should be used with caution, as it suggests that technology shapes learning without giving due respect to existing teaching and learning practices [78,109]. According to Conole and Dyke [108], technology affordances should include the prescribed, creative, and unintended educational activities facilitated by technology. In the context of this paper, technology affordances are taken to be the context-based pedagogical benefits that technology can bring to educational activities.
Many researchers in the field of mathematics discuss the uses and benefits of technology in terms of affordances [57,93,110,111,112,113]. There are two distinct affordances that technology can bring to mathematical tasks: ‘pragmatic and epistemic’ [114] (p. 249). Technology brings pragmatic efficiencies by increasing the speed and accuracy of computations, and epistemic value when it helps advance students’ understanding of mathematical concepts [114] (p. 248). These affordances have been evidenced in the literature on mathematics education technology in higher education [56,80,103,105,115]. In addition, many of these researchers have identified benefits that enhance students’ mathematical learning but do not necessarily fall under a pragmatic or an epistemic category, such as enhancing the student learning experience [56,103,105]. Table 4 contains a list of the benefits of using technology in mathematics education, categorised under the headings of pragmatic, epistemic and other, as found in mathematics education research studies. The studies included in Table 4 were selected based on their relevance to the context of this review. Three of the studies are literature reviews: two situated in higher education mathematics [14,105] and one in general mathematics education [116]. The technology under investigation and the context are also given in the table. Some studies examined multiple benefits. In order to simplify the table, we use the following acronyms: HE M (higher education mathematics), 1Y UM (1st year undergraduate mathematics), and 2Y EM (2nd year engineering mathematics).
Table 4 lists the benefits associated with using technology; however, many studies also reported negative aspects of technology integration. While the use of screencasts and e-lectures is liked by students, it was found to be associated with both rote and surface approaches to learning, with some evidence of a negative correlation with grades [51,105]. The use of computer-generated feedback is also under question, as it needs to be carefully designed and integrated into the learning process so that students are obliged to engage with the feedback [94,104]. Mathematical discourse is important for students when developing understanding in mathematics and has been found difficult to achieve in online learning environments [104,105]. Finally, Jaworski and Matthews found that any gain in conceptual understanding from using GeoGebra was hard to quantify [118] (p. 183).
In the next subsection, the literature is examined to determine the factors that impact on the benefits or otherwise of the technology implementations discussed.

4.3. What Factors of the Technology-Enhanced Resource Implementations Impacted on the Associated Benefits?

In addition to measuring the benefits or otherwise of using technology in mathematics education, a number of studies investigated factors that impact on successful technology integration. Thomas et al. attributed the positive impact on students’ mathematical understanding, and on their attitude to and satisfaction with the use of technology, to the significant pedagogical changes implemented as part of the study [55]. These pedagogical changes included: teachers designed relevant digital tasks; tools were privileged by the teachers; students were allowed to self-select tools; technology afforded communication between teachers and students; and the use of the digital tools was explicitly linked to the continuous assessment of the modules [55]. The term “teacher privileging” is used to capture the promotion and use of the digital tool, within a class setting, by the teacher, to guide and develop students’ successful use of the tool [55]. Other studies, such as those of Jaworski and Matthews and of Takači et al. [117,118], were clearly embedded in similar significant pedagogical change, though the former questioned whether increased conceptual understanding had actually occurred. Collaborative or peer learning was a specific pedagogical change identified as a factor in both the Thomas et al. and Takači et al. studies [55,117].
Factors of success also featured in the technology affordances. For example, technologies such as CAS can assist with the visualisation of mathematics, allow multiple representations of concepts and facilitate the automated completion of tasks [27,55,117]. Furthermore, technological tools, such as online quizzes, have the capacity to provide instantaneous feedback [104].
Students reported technical, usability, and access issues that prevented them from using certain technologies [28,56,104,113]. For example, ease of use was a factor that contributed to students selecting Desmos over GeoGebra in the Thomas et al. study [55]. While students often rated technology tools as novel, fun, or convenient, it was not always evident that these ratings translated into greater attendance, engagement, or grades [103,105,115,118,119].
Similar views are also expressed in a literature review on the use of CAS within higher education, in which Buteau et al. identified both pedagogical and technical challenges as barriers to successful CAS integration [27] (p. 61). In addition, students’ educational background impacted on their successful use of CAS (Varsavsky, 2012, as cited in [14]). Although located in secondary education, Drijvers’s study examining the factors that supported success is pertinent [11]. He found three such factors: the design of the digital technology and the associated tasks and activities; the role of the teacher in synthesising technology-related and other mathematics learning activities; and the educational context.

4.4. Discussion from the Literature on Technology-Enhanced Resource Use in Mathematics Education

There is appreciable discussion in the literature on what constitutes “enhanced” in terms of the use of technology in higher education and whether the benefits of using technology have been fully exploited [1,2,12,70,83,86]. It is also argued that the affordances, or context-based pedagogical benefits, of the technology need to be taken advantage of for successful technology integration [108,109,114]. These pedagogical benefits can be built into the technology resource integration through the use of effective instructional design processes [88,90,91,92]. In mathematics education, technology as a tool to mediate learning has been examined in some detail [11,93,94,95]. There are a limited number of studies that consider enhancement in terms of student satisfaction and self-regulated learning [51,69,100,102,103]. Benefits of using technology that were identified in the literature included: the epistemic benefits associated with mathematical understanding [55,80], the pragmatic advantages of outsourcing computational activities [28,102,117], and other student-centred benefits such as self-regulated learning [51,115,119]. While a number of factors such as the pedagogical changes implemented [55], and the affordances of the technology [27,117], were found to contribute to successful technology integrations, technical challenges and usability issues were identified as barriers [55,113]. In addition, some of the approaches to learning adopted by students as a result of technology integration do not appear to foster deep learning [51,57].
It is interesting to note that a number of these studies (see Table 4), which did not purport to examine student engagement with technology, considered engagement in terms of motivation or satisfaction [56,102,103]. However, it is not always clear whether the technology affordances or the change in pedagogical practices contributed to these benefits [81]. Perhaps, as Trowler suggests, there is a need to establish if some of the indicators of student engagement were examined in these studies [5]. To that end, the methods of evaluation used in the literature are examined in the next section.

5. Evaluating Technology Use in Higher Education (and Mathematics) and the Use of Frameworks and Models

Student engagement has been shown to positively influence student outcomes [7,37], though the intersection of technology and student engagement has not been adequately investigated [7,16]. Evidence exists of the benefits of using technology in small-scale studies (see Table 4). However, the use of technology at scale in undergraduate mathematics remains problematic, in part due to the lack of studies that have demonstrated the benefits technology can bring to this particular student cohort [55]. While factors that contribute to the benefits of using technology have been identified in the previous section, it is not clear how student engagement with technology impacts on the success or otherwise of the implementations. This finding is consistent with the broader literature, where it has been identified that the intersection of student engagement and technology is under-researched [7,16]. Added to this is the fact that studies use a variety of methodologies and frameworks to evaluate the integration of technology in education [11,12,81]. Furthermore, Coupland et al. have called for more empirical evidence on the benefits of using technology, as much of the current literature focusses on students’ and lecturers’ views [33]. They recognise that it is essential to investigate the affordances of technology in terms of student learning, retention and transfer of knowledge, rather than just descriptions and evaluations [33]. King et al. have pointed to the need for frameworks of evaluation that can be used to consistently and comparatively examine how technologies have been successfully integrated into education [12].
Thus, there are two issues to consider here. The first is whether student engagement indicators are used as measures of success in the mathematics education technology literature. To establish this, the studies listed in Table 4 are further explored to determine the methods of evaluation and the indicators used to measure success. The second is what frameworks or models are currently used in describing and evaluating technology integration in mathematics and higher education, which requires further examination of the literature.
The following three research questions were formulated and are discussed in the three subsections below:
  • How have the uses of technology-enhanced resources been measured?
  • What models or frameworks are available to classify and evaluate technology-enhanced resource implementations?
  • What features of technology integrations are described/classified within these models and frameworks?

5.1. How Have the Uses of Technology-Enhanced Resources Been Measured?

One of the aims of publishing research on the use of technology in mathematics education is to inform the mathematics education community about practices that have been proven effective so that they can be mirrored in similar contexts [120]. In order to ensure a proven intervention can be scaled, it is important to establish what indicators of success have been used and how they have been measured. In this section, studies that focus on the use of technology within undergraduate mathematics are investigated to establish what indicators of engagement, if any, have already been examined in the literature. With this in mind, the studies referenced in Table 4 are further explored to establish the indicators that were used to measure the benefits of the technologies.
The methodologies used, and their validity, varied from study to study. For example, Galligan et al. completed an exploratory study of the integration of technology in first-year undergraduate mathematics, with little detail on how the data was examined and analysed [56]. On the other hand, Jarvis et al. used a case study approach to examine the use of Sage within a mathematics course, and used a thematic approach to analysing interviews [80]. Most of the studies reported, or evidenced, the use of a mixed-methods approach, as can be seen in Table 5, where the different measures for the studies are listed.
For many of the studies, it is not always clear what indicators were used to measure success. While Trenholm et al. [51,94] used proven scales within their surveys, the development of the questions used in other studies’ surveys was not always evident [28,102,118], though in some cases there was a clear link to the literature reviewed [103,104,115]. When class observations were used, it was not necessarily clear how the data was interpreted in terms of success or otherwise [103,118].
A number of these measures may also be used to examine student engagement. For example, attendance data and recorded usage of the resources can be used to measure behavioural engagement indicators. In the Trenholm et al. study, for instance, recorded lecture usage was examined to draw inferences about students’ approaches to learning (or cognitive engagement) [51]. In contrast, the recorded lecture data in the Howard et al. study was used to determine students’ perceived value of self-regulated learning (or affective engagement) [119]. Class observations and student interviews may be analysed for indicators of engagement. For example, the King and Robinson study recorded students as saying the ARS technology was fun (associated with affective engagement) [103]; however, they did not examine the impact this had on student engagement. Due to the diversity of the inferences drawn from identically named measures, and the lack of connection between indicators and student engagement benefits, it is difficult to determine whether these studies can contribute to our knowledge of student engagement with technology.
Drijvers suggested that theoretical frameworks are required in order to understand the role of digital technology in mathematics education [11]. Such frameworks can support the evaluation and scaling of technology interventions. Few of the studies explicitly situated their research within theoretical frameworks, but those that did are listed in Table 6, along with the framework used.
In conclusion, it is evident that there is little consistency in the design of research studies on the use of technology in undergraduate mathematics. Hence, it may be difficult to compare the outcomes and come to an understanding of what exactly should be measured. Therefore, it may not always be clear if the technologies can be scaled to be used in different contexts [11]. One way to overcome this is to have frameworks of evaluation that can be used to compare and contrast technology evaluations.

5.2. What Models or Frameworks Are Available to Classify and Evaluate Technology-Enhanced Resource Implementations?

There are a number of issues with the evaluation of technology-enhanced resources within higher education. Amongst these are: the difficulties associated with evaluating this rapidly changing environment, the institutional requirement for cost-effective teaching enhancements, and the lack of appropriate evaluation models or frameworks [12,123]. The importance of frameworks suited for evaluation has been identified by a number of researchers in the field of higher education [12,78,86,109,124] and in mathematics education [11,14]. There are a number of elements of technology integration that need to be considered by these types of frameworks. Firstly, studies should incorporate the types of pedagogy or didactical practices that have been used to integrate technology [81]. Secondly, there needs to be a focus on the types of constructs being measured [124]. Thirdly, the context of the study needs to be taken into account, such as the level of education and student attributes [81,124]. Finally, the affordances of the technology being used need to be made explicit [57,93,110,112,113]. The essential outcome of any evaluation is to establish and explain what technology works under ‘which conditions, for whom and why’ [12].
A considerable number of models, frameworks, categorisations, and typologies were found in the literature on the evaluation or integration of technology in education. For simplicity, these will be generically referred to as frameworks in this section, although the term used by the authors will be adhered to when discussing specific frameworks. In this review of the educational technology literature, four loosely aligned groups of frameworks emerged:
  • Technology integration—these frameworks refer to how technology is integrated into teaching and learning.
  • Theoretical frameworks—these are used to examine how learning occurs using technology.
  • Technology affordances and types—these frameworks categorise different technologies according to functionality or affordances the technology supports.
  • User experience frameworks—these refer to how technology is examined from the perspective of the user (in this case, the student).
A list of the frameworks examined is contained in Table 7, along with a brief description and/or purpose and an article or website describing their use.
In addition to those listed in Table 7, more generalised frameworks were found that encompass a number of aspects of evaluation and integration [132,148,149,150]. For example, Pickering and Joynes proposed a holistic model of technology-enhanced learning (TEL) evaluation based on the Kirkpatrick model, one of the most cited models used in the evaluation of training [149]. This model focusses on learner satisfaction, learner gains, and learner and institutional impact, with a view to establishing a cost–benefit analysis [149] (p. 1244). In their literature review on how technology use is evaluated in education, Lai and Bower suggested that education researchers should use the classifications they developed to better focus the design of educational technology research [124]. First, researchers can reflect on which aspects of evaluation, and the associated constructs, they intend to investigate, and then select their methodology from the methods and instruments that have already been used in similar contexts [124] (p. 38). Second, they proposed that a generalised model for technology evaluation could be developed based on the themes and subconstructs they identified [124] (p. 38).
As can be seen from Table 7, the frameworks vary in which aspects of technology integration and evaluation are characterised. In the next subsection, the different features categorised by the most relevant frameworks will be considered in more detail.

5.3. What Features of Technology Integrations Are Described/Classified within These Models and Frameworks?

In this section, the frameworks in Table 7 will be examined in more detail in order to elicit which features of technology integration have been classified. Those obtained from mathematics education will be discussed first, followed by an examination of other relevant frameworks.

5.3.1. Mathematics Specific Frameworks

Pedagogical Opportunities

Pierce and Stacey examined the use of technology in mathematics education in terms of the pedagogical opportunities that can be supported by the affordances of mathematical analysis software (MAS) [15]. In their pedagogical map, they identified three levels where educational transformation can be enacted by the teacher: mathematical tasks; classroom dynamics and didactical contract; and the subject area, such as mathematical thinking or applications [15] (p. 6). The didactical contract is the set of implicit or explicit responsibilities and commitments that the teacher and student agree to use within the learning environment [151]. Geiger et al. used these three areas to classify the studies they examined in a critical synthesis of research on mathematics educational technology in Australasia [14]. While the pedagogical map was useful, they pointed to areas where it needed to be extended, such as the inclusion of other technology types. Drijvers referred to the benefit of the pedagogical map as a way to define the educational context and mathematical practices of a technology intervention, both of which are important in determining its success [11].

Didactical Functions

Drijvers defined pedagogical functionality in terms of didactical functions [11] (p. 136). In Drijvers’ model, there are three main didactical functions that are supported by technology: (1) do: the functionality related to doing mathematics, where work that could be done by hand is completed by the technology; (2) learn—practice skills: the functionality provided to practice skills; and (3) learn—concepts: the functionality that supports the development of conceptual understanding [11] (p. 136). Drijvers used this framework to position the pedagogical use of technology in the studies he subsequently examined.
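To make the ‘do’ function concrete, the following minimal Python sketch (our own illustration, not drawn from Drijvers [11]) uses the SymPy computer algebra library to carry out symbolic work that a student could otherwise complete by hand; the ‘learn’ functions, by contrast, would wrap such computation in practice tasks or conceptual explorations.

```python
# Minimal illustration (ours, not from Drijvers [11]) of the "do"
# didactical function: a computer algebra system performs symbolic
# work that could otherwise be done by hand.
import sympy as sp

x = sp.symbols('x')
f = x**3 - 3*x + 1

derivative = sp.diff(f, x)                      # "do": differentiate symbolically
stationary = sp.solve(sp.Eq(derivative, 0), x)  # "do": solve exactly

print(derivative)   # 3*x**2 - 3
print(stationary)   # [-1, 1]
```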

Instrumental Orchestration

Instrumental orchestration is a term used to describe how a teacher orchestrates the use of a digital tool. It stems from Artigue’s work on an instrumental approach to using digital tools in mathematics, where the technological and conceptual affordances of the tools are exploited to foster mathematical understanding [114]. This theoretical framework has been used in mathematics education research in order to investigate and compare students’ mathematics learning using different technologies and settings [55,97,135].

Didactic Tetrahedron

The Congress of the European Society for Research in Mathematics Education (CERME) group adopted a didactic tetrahedron, inspired by Tall (as cited in [95] (p. 1)), to examine the interactions between teachers, students, knowledge, and tools (resources and technology) [95] (p. 2) (see Figure 3). Cognitive processes are described by the interactions between the technology or resource, knowledge, and the learner (student). The learning theories enacted in the classroom are described by the teacher’s integration of the technology or resource in the classroom and the associated knowledge interactions.

Categories of Tools

Hoyles and Noss identified four categories of tools ‘that distinguish different ways that digital tools have the potential to shape mathematical cognition’ [143]. First are dynamic and graphical tools that allow students to explore mathematical representations from different perspectives. Second, outsourcing processing power allows a machine to take over processing that would previously have been conducted by the student. Third are tools that enable the creation of new mathematical representations and symbols. The final category is tools that allow connectivity and the ability to share mathematics within the community. This framework has since been modified and extended to include newly available digital tools, and it influenced Bray and Tangney’s work on classifying research studies on mathematics education technology [74] (p. 259).

Classification System of Research Studies

Bray and Tangney classified the current literature on mathematics education technology in order to provide an overview of the field and enable a comparative analysis of the interventions [74]. The studies were classified according to four components, described below:
  • Technology, which describes the type of technology in use. They used a refinement of the Hoyles and Noss (as cited in [74] (p. 261)) categorisation of tools (described above), which also took into account the types of technology use observed in the literature review. There were seven final classifications within the technology type.
  • Learning theory. Studies were classified according to whether they adopted a behaviourist, cognitive, constructivist, social constructivist, or constructionist teaching and learning approach.
  • Technology adoption. They used the SAMR model to describe how technology is integrated, because it pertains to the level of technology adoption specific to tasks and activities. This model will be discussed in more detail below.
  • Purpose. Each of the studies was classified based on the aim of the study: for example, to change students’ mathematical attitude, improve performance, or engender collaboration and discussion.
In their analysis of these studies, Bray and Tangney concluded that while tools are increasingly being used to enable visualisations, and to promote collaborative problem-solving, they are not yet transforming the student learning experience [74] (p. 270).

Formative Assessment in Science and Mathematics Education

The FaSMEd project team developed a theoretical framework to characterise the classroom use of the formative assessment technology tools they developed for post-primary education [126]. The FaSMEd framework consists of three interrelated dimensions developed from the relevant literature and the team’s educational experience. The three dimensions are:
  • Agents (student, peers, and teacher) that intervene in formative assessment processes in the classroom and that can activate formative assessment strategies.
  • Strategies for formative assessment activated by the agents, based on the work of Wiliam and Thompson [152].
  • Functionalities of technology within the formative assessment processes: sending and displaying; processing and analysing; and providing an interactive environment [126].

Mobile Apps Classifications

Handal et al. examined over a hundred mathematics educational apps while developing a framework for categorising mobile applications [138]. The apps were initially categorised into nine types based on their instructional roles and subsequently clustered into three broad classifications: explorative, productive, and instructive [138]. Explorative apps allow simulations and guided discovery; productive apps enable the student to construct content such as graphs; and instructive apps are generally focussed on drill and practice. These classifications are a modified form of the Goodwin pedagogical classification of tablet apps [138]. Handal et al. added the concept of media richness to describe the ability of the app to provide a ‘high level of problem solving and low prescription’ [138]. Each of the three classifications can have a lower, in-between, or higher level of media richness. For example, guided discovery-type apps which allow ‘exploration and experimentation within a pre-determined framework’ are explorative with a high level of media richness, thus allowing the student a high level of control over the task in hand and requiring a high cognitive investment [138].
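The two-dimensional structure of this scheme can be sketched in code. The encoding below is hypothetical (ours, not from Handal et al. [138]); only the category and richness labels follow the framework.

```python
# Hypothetical encoding (ours, not from [138]) of Handal et al.'s scheme:
# each app has an instructional classification and a media richness level.
from dataclasses import dataclass
from enum import Enum

class Classification(Enum):
    EXPLORATIVE = "explorative"   # simulations, guided discovery
    PRODUCTIVE = "productive"     # student constructs content, e.g., graphs
    INSTRUCTIVE = "instructive"   # drill and practice

class MediaRichness(Enum):
    LOWER = 1
    IN_BETWEEN = 2
    HIGHER = 3

@dataclass
class MathsApp:
    name: str                     # hypothetical app name
    classification: Classification
    richness: MediaRichness

# Example from the text: a guided-discovery app is explorative
# with a high level of media richness.
app = MathsApp("GuidedDiscoveryApp", Classification.EXPLORATIVE,
               MediaRichness.HIGHER)
```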

Framework for Engagement in Mathematics (FEM)

Attard and Holmes examined how exemplary mathematics teachers take advantage of the affordances of educational technology through the lens of the FEM [67]. According to Attard and Holmes, there are two main factors that encourage student engagement: pedagogical relationships and pedagogical repertoires [67]. They define pedagogical relationships as the educational relationships between students and teachers that support engagement, and pedagogical repertoires as the routine educational practices used by the teacher [67] (p. 2). This framework outlines a number of elements, such as determining students’ backgrounds and the use of student-centred technology, needed to achieve the pedagogical relationships and repertoires that encourage student engagement with the technology provided [67] (p. 3). These elements were based on the practices of exemplary teachers’ use of technology, and Attard and Holmes conclude that technology used in this way can engage students with mathematics [67] (p. 10).

5.3.2. General Frameworks of Relevance

A number of the other frameworks listed in Table 7 have been used to investigate technology integration in higher education and in mathematics education. The SAMR and TPACK models are described below because of the frequency with which they appear in the mathematics education technology literature. Due to their increasing relevance in educational technology, user experience models and the Universal Design for Learning framework are also described. Furthermore, the aspects of user experience that are traditionally seen in the context of software should be incorporated into the instructional design process to support the effective integration of technology into education [87,89,153,154].

Substitution Augmentation Modification Redefinition (SAMR) Model

The SAMR model is used to characterise how technology tools are adopted into existing education environments, either through the enhancement or transformation of teaching and learning processes or activities [125]. This model (Figure 4) depicts a hierarchical structure of two broad levels, each with two subcategories [74] (p. 260). The lowest level of integration is substitution, where technology replaces existing activities without functional change, followed by augmentation, where the technology tools are used to augment existing activities and make functional improvements. At the transformative level, tasks are either significantly modified through task redesign, or the technology allows the redefinition of tasks to enable activities that were previously unavailable.
Bray and Tangney classified mathematics education technology using the SAMR model and found that the majority of tool use fell under the augmentation level, suggesting that classroom practices are not yet exploiting the affordances of the technologies in ways that transform practice [74] (p. 269).
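As a purely illustrative reading of this hierarchy (ours, not Puentedura’s [125] or Bray and Tangney’s [74]), the four subcategories and their two broad levels can be encoded as an ordered scale:

```python
# Illustrative sketch (ours) of the SAMR hierarchy described above:
# two broad levels (enhancement, transformation), each with two subcategories.
from enum import IntEnum

class SAMR(IntEnum):
    SUBSTITUTION = 1   # tool substitutes an activity, no functional change
    AUGMENTATION = 2   # tool augments an activity, functional improvement
    MODIFICATION = 3   # task is significantly redesigned
    REDEFINITION = 4   # previously unavailable activities become possible

def broad_level(level: SAMR) -> str:
    """Map a SAMR subcategory to its broad level."""
    return "enhancement" if level <= SAMR.AUGMENTATION else "transformation"

print(broad_level(SAMR.AUGMENTATION))   # enhancement
print(broad_level(SAMR.REDEFINITION))   # transformation
```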

Technological Pedagogical Content Knowledge (TPACK)

Mishra and Koehler drew on their experiences working in higher education to develop a framework that captured the knowledge required by teachers to effectively integrate technology [130]. This framework, TPACK, has been widely used and/or referred to in research on the integration of technology in mathematics education [75,95,156,157]. The framework highlights the connections between the content, pedagogical, and technological knowledge required by teachers for the successful integration of technology in teaching. Mishra and Koehler used the framework in three ways: (1) to investigate teacher knowledge with a view to enhancing it, (2) to apply a pedagogical approach of learning by design to help teachers achieve TPACK, and (3) to guide research and analysis on the effectiveness of pedagogy associated with technology integration [130] (pp. 1019–1020). They conclude that the TPACK framework can help to describe, make inferences about, and inform the application of technology-enhanced education practices.

SAMR vs. TPACK

In a case study located in a school context, teachers were asked to reflect on their use of technology from both the SAMR model and TPACK framework perspectives [158]. They discussed TPACK in terms of technology integration throughout the year, whereas they reflected on individual activities when discussing SAMR [158] (p. 71). Hilton suggested that the SAMR model is more focussed on student-centred activities, whereas TPACK is more aligned with teacher-centred design [158] (p. 72). TPACK has become popular amongst researchers, whereas SAMR is more popular amongst practitioners [159] (p. 29).

User Experience Models

In the studies examined in Section 4, there is a lack of focus on the usability of the educational technology as experienced by the students. While many of the studies explored teacher and student views, only a few specifically reference a measure for usability [46,56,103]. It has long been recognised that the usability of educational software needs to be investigated in the context of its use, as opposed to the software as a standalone product [160,161]. Recent investigations by Slade and Downer reveal the importance of the user experience for students when using ePortfolios [162]. Many of the early usability techniques used checklists and rubrics, but these have proven problematic [161] (p. 471). One way to overcome problems with checklists is the use of heuristic evaluations: ‘Heuristic evaluation is done by experts (in this case, expert teachers) using a set of guidelines, known as “heuristics”’ [161] (p. 468). The notion of heuristic evaluation was first introduced by Molich and Nielsen [163], with the associated usability guidelines available on the Nielsen Norman Group website [164]. Squires and Preece combined these heuristics with the notion of the learnability of educational technology to produce a set of ‘learning with software’ heuristics [161] (p. 479). Reeves et al. also used the Nielsen guidelines to define a set of 15 heuristics for eLearning [160].
More recently, JISC, the UK digital education organisation, combined the notions of usability and user experience to map out the attributes of educational technology that influence a positive user experience [165]. This framework is based on Morville’s honeycomb (see Figure 5).

Universal Design for Learning (UDL)

Dimensions such as accessibility and findability have become increasingly important in education and are reflected in the Universal Design for Learning (UDL) framework as described by Meyer et al. [166], part of the Centre for Applied Special Technology (CAST) project in the USA [145]. The principles associated with UDL account for the ways in which different users access and use technology. For example, they include the provision of audio files to support learners who are visually impaired. The UDL guidelines [167] suggest there should be multiple means of:
  • Representation, to provide learners with various ways of acquiring information and knowledge;
  • Expression, to provide learners with alternatives for demonstrating what they know;
  • Engagement, to tap into learners’ interests, challenge them appropriately, and motivate them to learn.
There is limited research on how the use of the UDL framework has impacted on student engagement with technology. One such study, however, reported that the deliberate design of a first-year undergraduate science module using multiple means of representation, expression, and engagement resulted in a more positive experience for the teacher, despite an increased workload [168] (p. 137). In addition, students were positive about the increase in the control of their learning and in the sense of social presence achieved [168] (p. 138).

5.4. Discussion on Evaluations and Frameworks/Models in Higher Education

An examination of the research methods used in the mathematics educational technology literature has revealed that most studies used mixed methods. A variety of measuring instruments were used, such as scales, interviews, and class observations. Indicators of success included students’ grades, students’ and lecturers’ views on the resources, and the analysis of curriculum materials (see Table 5). While data with respect to behavioural engagement were gathered, they were not always analysed in terms of student engagement [51,103,119]. Thus, it is difficult to establish the factors of technology implementations that impact on student engagement. While the use of evaluation frameworks is recommended in order to allow for the scaling of implementations, limited use of such frameworks was found in the literature [11,12].
There are considerable challenges associated with evaluating technology integration [12,123]. The use of evaluation frameworks can help overcome these challenges [11,14]. Four main categories of framework were found in the literature on educational technology in higher education: technology integration; theoretical frameworks; technology affordances and types; and user experience frameworks. The focus on mathematical understanding in the literature on educational technology in mathematics education is also reflected in the number of frameworks that describe how mathematical learning is achieved using technology as a tool [95,114] and how the pedagogical affordances of technology can be leveraged [11,15,138,143]. Both Bray and Tangney’s classification system and the FaSMEd framework encompass a number of aspects of technology use, such as the type of technology, the learning theory used, and the level of technology integration [74,126]. None of the mathematical frameworks considered usability or user experience, which is increasingly recognised as a factor in student engagement [34,53], and has been identified as a factor in the success of technology integration in mathematics education [28,46,56,104,113]. While there were a number of frameworks that claimed to describe all aspects of educational technology in general [132,149,150], no holistic framework for technology integration was found in the mathematics education literature.

6. Conclusions and Contributions

The main focus of this work was to conduct a literature review which would provide insight into how and in what way students engage with technology and the factors that influence student engagement. This narrative literature review examined the literature on the use of technology to engage students with their learning in undergraduate mathematics modules. However, it only considered articles written in English and published within the time-frame of 2000–2020. In order to determine what was meant by student engagement, it was necessary to go beyond the scope of mathematics education and review literature in higher education. While the three authors agreed on the papers selected, based on their relevance to the subject area, other research teams may take a different approach. Additionally, only seminal relevant works in post-primary education were reviewed. This literature review was focused on addressing three research areas, which we have labelled RA1, RA2 and RA3, as indicated earlier. This meant that only aspects relating to these areas were examined within the articles. Even with the same selection of papers, other authors might choose to place a different focus on the various papers involved, as is the case in narrative reviews. In RA1, we identified that student engagement has been shown to be an important construct due to its impact on student success [3,31,51,52]. Additionally, related to all three RAs, we uncovered an increasing use of holistic frameworks to examine both the influencing factors and resultant outcomes of student engagement, within the students’ sociocultural context [4,7]. Another contribution of this work across the three RAs is the discovery that, while there is a growing body of research available on the impact of technology on student engagement, there is a degree of uncertainty as to what is meant by student engagement with technology and how it should be measured [16,30,36]. In addition, in RA1 and RA3, we discovered that there is a dearth of studies in undergraduate mathematics education that specifically focus on student engagement with technology. Although factors relating to the pedagogical integration of technology have been identified in the mathematics education literature [55,56,79], it is significant that there are few studies that examine technology use from the perspective of the student or student engagement. Those that do are mainly concerned with students’ goals [57,69] and do not necessarily consider the impact of the usability and design of the technology on student engagement, factors that have been highlighted in the general education literature [7]. To augment future investigations in these areas, and in order to effectively use technology to support student engagement within higher education, research studies need to be examined under a clearly defined lens of student engagement [7].
Across RA2 and RA3, we found few studies on education technology which define technology-enhanced resources; however, it is clear from the mathematics literature that such resources can be described as technology tools that are used to enhance, or improve, the mathematical understanding, learning experience and/or learning environment of students engaged in mathematics learning [69,97,98,102]. Specifically, in mathematics education, we found that considerable work has gone into examining the use of technology as a tool to enhance mathematical understanding [11,92,93,94,95]. Other, more pragmatic benefits have been explored to a lesser extent [28,102,117]. It is significant, across all three RAs, that, while indicators associated with student engagement were measured, it is not clear how pedagogical changes, rather than technology use, affected engagement. Only a limited number of studies considered student satisfaction with, and motivations to use, mathematics education technology [48,102,103,104,119]. In relation to all three RAs, this gap in research requires future attention. While satisfaction and motivation are clearly linked to engagement [37], it is not clear what factors impact on student engagement with mathematics education technology. Bond and Bedenlier identified a number of micro-layer influences on student engagement with technology [7], such as: the individual student’s and teacher’s acceptance of, and skills in using, technology; the design of the technology-enhanced activities within the curriculum; and the influence of factors such as technical support, the usability of the technology, and assessment on the learning environment. Thus, another contribution of our work is the identification that, in order to examine student engagement with technology in mathematics education, research needs to focus on establishing whether these or other factors influence student engagement with technology.
Factors that contribute to the success of technology integration in mathematics education were identified, such as the need for significant pedagogical change and the student’s technical skill in using technology [11,55]. The pedagogical challenges associated with integrating technology require careful consideration, and teachers need support to successfully use technology in mathematics education [27,94,130,169]. Our work has highlighted one way to overcome these challenges: having frameworks available to guide teachers with the integration and evaluation of technology [130,136,169]. In RA3, we considered frameworks found in the mathematics literature that describe various aspects of technology integration, such as those that describe the types of technology in use, how learning is mediated using technology, and how technology can be integrated into tasks or settings (see Table 7). However, we have concluded that there is no overarching framework that describes both the pedagogical aspects and the educational context of technology integration. These are all areas in which future work is needed.
In summary, in relation to the principal focus of our work, the first main conclusion is that there is no consistent definition of, or indeed measure for, student engagement with technology. These results have very important implications for future research in this area. Our second main conclusion is that considerable additional research is required, especially at undergraduate level in mathematics education. This research should, in the first instance, investigate the broad range of factors that impact on student engagement with technology, with an emphasis on exploring the student experience, hearing the student voice, and examining the impact of pedagogical changes. Our final main conclusion is that frameworks can play a crucial role in technology integration, but there is currently no framework that underpins the key facets of educational setting and pedagogy. The development and implementation of such a framework could have a significant impact on the future of technology use in mathematics education. Thus, this literature review opens the way for research into factors that impact student engagement with technology-enhanced resources and the development of an overarching framework for practitioners who are embedding such technologies into their teaching.

Author Contributions

Conceptualization, C.N.S., C.M.a.B. and E.N.F.; methodology, C.N.S., C.M.a.B. and E.N.F.; writing—original draft preparation, C.N.S.; writing—review and editing, C.M.a.B. and E.N.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Bayne, S. What’s the matter with “technology enhanced learning”? Learn. Media Technol. 2014, 40, 5–20. [Google Scholar] [CrossRef]
  2. Selwyn, N. Looking beyond learning: Notes towards the critical study of educational technology. J. Comput. Assist. Learn. 2010, 26, 65–73. [Google Scholar] [CrossRef]
  3. Fredricks, J.A.; Wang, M.-T.; Schall Linn, J.; Hofkens, T.L.; Sung, H.; Parr, A.; Allerton, J. Using qualitative methods to develop a survey measure of math and science engagement. Learn. Instr. 2016, 43, 5–15. [Google Scholar] [CrossRef]
  4. Kahu, E.R.; Nelson, K. Student engagement in the educational interface: Understanding the mechanisms of student success. High. Educ. Res. Dev. 2018, 37, 58–71. [Google Scholar] [CrossRef]
  5. Trowler, V. Student Engagement Literature Review; Higher Education Academy: York, UK, 2010; Available online: https://www.advance-he.ac.uk/knowledge-hub/student-engagement-literature-review (accessed on 15 November 2022).
  6. Beer, C.; Clark, K.; Jones, D. Indicators of engagement. In Proceedings of the Sydney 2010 Ascilite Conference, Sydney, Australia, 5–8 December 2010; pp. 75–86. [Google Scholar]
  7. Bond, M.; Bedenlier, S. Facilitating Student Engagement Through Educational Technology: Towards a Conceptual Framework. J. Interact. Media Educ. 2019, 1, 1–14. [Google Scholar] [CrossRef]
  8. Kahu, E.R. Framing student engagement in higher education. Stud. High. Educ. 2013, 38, 758–773. [Google Scholar] [CrossRef]
  9. Zepke, N.; Leach, L. Beyond hard outcomes: ‘Soft’ outcomes and engagement as student success. Teach. High. Educ. 2010, 15, 661–673. [Google Scholar] [CrossRef]
  10. Drijvers, P. Evidence for benefit? Reviewing empirical research on the use of digital tools in mathematics education. In Proceedings of the 13th International Congress on Mathematical Education, Hamburg, Germany, 24–31 July 2016. [Google Scholar]
  11. Drijvers, P. Digital Technology in Mathematics Education: Why It Works (Or Doesn’t). In Selected Regular Lectures from the 12th International Congress on Mathematical Education; Cho, S.J., Ed.; Springer: Cham, Switzerland, 2015; pp. 135–151. [Google Scholar] [CrossRef]
  12. King, M.; Dawson, R.; Batmaz, F.; Rothberg, S. The need for evidence innovation in educational technology evaluation. In Proceedings of the INSPIRE XIX: Global Issues in IT Education, Southampton, UK, 15 April 2014; pp. 9–23. [Google Scholar]
  13. Bray, A.; Tangney, B. Mathematics, technology interventions and pedagogy—Seeing the wood from the trees. In Proceedings of the 5th International Conference on Computer Supported Education, Aachen, Germany, 6–8 May 2013; pp. 57–63. [Google Scholar] [CrossRef]
  14. Geiger, V.; Calder, N.; Tan, H.; Loong, E.; Miller, J.; Larkin, K. Transformations of teaching and learning through digital technologies. In Research in Mathematics Education in Australasia 2012–2015; Makar, K., Dole, S., Visnovska, J., Goos, M., Bennison, A., Fry, K., Eds.; Springer: Singapore, 2016; pp. 255–280. [Google Scholar] [CrossRef]
  15. Pierce, R.; Stacey, K. Mapping Pedagogical Opportunities Provided by Mathematics Analysis Software. Int. J. Comput. Math. Learn. 2010, 15, 1–20. [Google Scholar] [CrossRef]
  16. Henrie, C.R.; Halverson, L.R.; Graham, C.R. Measuring student engagement in technology-mediated learning: A review. Comput. Educ. 2015, 90, 36–53. [Google Scholar] [CrossRef]
  17. Boote, D.N.; Beile, P. Scholars Before Researchers: On the Centrality of the Dissertation Literature Review in Research Preparation. Educ. Res. 2005, 34, 3–15. [Google Scholar] [CrossRef]
  18. Hart, C. Doing a Literature Review: Releasing the Social Science Research Imagination; Sage Publishing: London, UK, 1999. [Google Scholar]
  19. Randolph, J.J. A guide to writing the dissertation literature review. Prac. Assess. Res. Eval. 2009, 14, 13. [Google Scholar] [CrossRef]
  20. Franz, T. Hawthorne Effect. In The SAGE Encyclopaedia of Educational Research, Measurement, and Evaluation; Frey, B.B., Ed.; SAGE Publications, Inc.: Thousand Oaks, CA, USA, 2018; pp. 767–769. [Google Scholar] [CrossRef]
  21. Hochberg, K.; Kuhn, J.; Müller, A. Using smartphones as experimental tools—Effects on interest, curiosity, and learning in physics education. J. Sci. Educ. Technol. 2018, 27, 385–403. [Google Scholar] [CrossRef]
  22. Constantine, N.A. Publication Bias. In Encyclopedia of Epidemiology; Boslaugh, S., Ed.; SAGE Publications: Thousand Oaks, CA, USA, 2012; pp. 854–855. [Google Scholar] [CrossRef]
  23. Baker, J.D. The Purpose, Process, and Methods of Writing a Literature Review. AORN J. 2016, 103, 265–269. [Google Scholar] [CrossRef]
  24. Grant, M.J.; Booth, A. A typology of reviews: An analysis of 14 review types and associated methodologies. Health Inf. Libr. J. 2009, 26, 91–108. [Google Scholar] [CrossRef]
  25. Onwuegbuzie, A.J.; Frels, R. Seven Steps to a Comprehensive Literature Review: A Multimodal and Cultural Approach, 1st ed.; SAGE Publications: London, UK, 2016. [Google Scholar]
  26. Petticrew, M.; Roberts, H. Systematic Reviews in the Social Sciences: A Practical Guide; Blackwell Publishing: Hoboken, NJ, USA, 2006. [Google Scholar] [CrossRef]
  27. Buteau, C.; Marshall, N.; Jarvis, D.; Lavicza, Z. Integrating Computer Algebra Systems in Post-Secondary Mathematics Education: Preliminary Results of a Literature Review. Int. J. Technol. Math. Educ. 2010, 17, 57–68. [Google Scholar]
  28. Lavicza, Z. Integrating technology into mathematics teaching at the university level. ZDM—Int. J. Math. Educ. 2010, 42, 105–119. [Google Scholar] [CrossRef]
  29. Kuh, G.D. What We’re Learning About Student Engagement From NSSE: Benchmarks for Effective Educational Practices. Change Mag. High. Learn. 2003, 35, 24–32. [Google Scholar] [CrossRef]
  30. Schindler, L.A.; Burkholder, G.J.; Morad, O.A.; Marsh, C. Computer-based technology and student engagement: A critical review of the literature. Int. J. Educ. Technol. High. Educ. 2017, 14, 1–28. [Google Scholar] [CrossRef]
  31. Sinatra, G.M.; Heddy, B.C.; Lombardi, D. The Challenges of Defining and Measuring Student Engagement in Science. Educ. Psychol. 2015, 50, 1–13. [Google Scholar] [CrossRef]
  32. Henderson, M.; Selwyn, N.; Aston, R. What works and why? Student perceptions of “useful” digital technology in university teaching and learning. Stud. High. Educ. 2015, 42, 1567–1579. [Google Scholar] [CrossRef]
  33. Coupland, M.; Dunn, P.K.; Galligan, L.; Oates, G.; Trenholm, S. Tertiary mathematics education. In Research in Mathematics Education in Australasia 2012–2015; Makar, K., Dole, S., Visnovska, J., Goos, M., Bennison, A., Fry, K., Eds.; Springer: Singapore, 2016; pp. 187–211. [Google Scholar] [CrossRef]
  34. O’Flaherty, J.; Phillips, C. The use of flipped classrooms in higher education: A scoping review. Internet High. Educ. 2015, 25, 85–95. [Google Scholar] [CrossRef]
  35. OECD. Students, Computers and Learning: Making the Connection; OECD Publishing: Paris, France, 2015. [Google Scholar] [CrossRef]
  36. Bond, M.; Buntins, K.; Bedenlier, S.; Zawacki-Richter, O.; Kerres, M. Mapping research in student engagement and educational technology in higher education: A systematic evidence map. Int. J. Educ. Technol. High. Educ. 2020, 17, 1–30. [Google Scholar] [CrossRef]
  37. Fredricks, J.A.; Blumenfeld, P.C.; Paris, A.H. School Engagement: Potential of the Concept, State of the Evidence. Rev. Educ. Res. 2004, 74, 59–109. [Google Scholar] [CrossRef]
  38. Astin, A.W. Student involvement: A developmental theory for higher education. J. Coll. Stud. Dev. 1984, 25, 518–529. [Google Scholar]
  39. Coates, H. A model of online and general campus-based student engagement. Assess. Eval. High. Educ. 2007, 32, 121–141. [Google Scholar] [CrossRef]
  40. Reeve, J.; Tseng, C.-M. Agency as a fourth aspect of students’ engagement during learning activities. Contemp. Educ. Psychol. 2011, 36, 257–267. [Google Scholar] [CrossRef]
  41. O’Brien, H.L.; Toms, E.G. The development and evaluation of a survey to measure user engagement. J. Am. Soc. Inf. Sci. Technol. 2010, 61, 50–69. [Google Scholar] [CrossRef]
  42. Coates, H. The value of student engagement for higher education quality assurance. Qual. High. Educ. 2005, 11, 25–36. [Google Scholar] [CrossRef]
  43. Bedenlier, S.; Bond, M.; Buntins, K.; Zawacki-Richter, O.; Kerres, M. Learning by Doing? Reflections on Conducting a Systematic Review in the Field of Educational Technology. In Systematic Reviews in Educational Research; Zawacki-Richter, O., Kerres, M., Bedenlier, S., Bond, M., Buntins, K., Eds.; Springer: Wiesbaden, Germany, 2020; pp. 111–127. [Google Scholar] [CrossRef]
  44. Yang, D.; Lavonen, J.M.; Niemi, H. Online learning engagement: Critical factors and research evidence from literature. Themes Sci. Technol. Educ. 2018, 11, 1–22. [Google Scholar]
  45. O’Brien, H.L.; Toms, E.G. What is User Engagement? A Conceptual Framework for Defining User Engagement with Technology. J. Am. Soc. Inf. Sci. Technol. 2008, 59, 938–955. [Google Scholar] [CrossRef]
  46. Fabian, K.; Topping, K.J.; Barron, I.G. Using mobile technologies for mathematics: Effects on student attitudes and achievement. Educ. Technol. Res. Dev. 2018, 66, 1119–1139. [Google Scholar] [CrossRef]
  47. Lake, E.; Nardi, E. Looking for Goldin: Can adopting student engagement structures reveal engagement structures for teachers? In Proceedings of the Joint Meeting of PME 38 and the PME-NA 36, Vancouver, BC, Canada, 15–20 July 2014; Volume 4, pp. 49–56. [Google Scholar]
  48. Oates, G.; Sheryn, L.; Thomas, M.O.J. Technology-active student engagement in an undergraduate mathematics course. In Proceedings of the Joint Meeting of PME 38 and the PME-NA 36, Vancouver, BC, Canada, 15–20 July 2014; Volume 4, pp. 329–336. [Google Scholar]
  49. Pierce, R.; Stacey, K.; Barkatsas, A. A scale for monitoring students’ attitudes to learning mathematics with technology. Comput. Educ. 2007, 48, 285–300. [Google Scholar] [CrossRef]
  50. Steen-Utheim, A.T.; Foldnes, N. A qualitative investigation of student engagement in a flipped classroom. Teach. High. Educ. 2018, 23, 307–324. [Google Scholar] [CrossRef]
  51. Trenholm, S.; Hajek, B.; Robinson, C.L.; Chinnappan, M.; Albrecht, A.; Ashman, H. Investigating undergraduate mathematics learners’ cognitive engagement with recorded lecture videos. Int. J. Math. Educ. Sci. Technol. 2019, 50, 3–24. [Google Scholar] [CrossRef]
  52. Helme, S.; Clarke, D. Identifying cognitive engagement in the mathematics classroom. Math. Educ. Res. J. 2001, 13, 133–153. [Google Scholar] [CrossRef]
  53. Hong-Meng Tai, J.; Bellingham, R.; Lang, J.; Dawson, P. Student perspectives of engagement in learning in contemporary and digital contexts. High. Educ. Res. Dev. 2019, 38, 1075–1089. [Google Scholar] [CrossRef]
  54. Mirriahi, N.; Jovanovic, J.; Dawson, S.; Gašević, D.; Pardo, A. Identifying engagement patterns with video annotation activities: A case study in professional development. Australas. J. Educ. Technol. 2018, 34, 57–72. [Google Scholar] [CrossRef]
  55. Thomas, M.O.J.; Hong, Y.Y.; Oates, G. Innovative Uses of Digital Technology in Undergraduate Mathematics. In Innovation and Technology Enhancing Mathematics Education: Perspectives in the Digital Era; Faggiano, E., Ferrara, F., Montone, A., Eds.; Springer: Cham, Switzerland, 2017; pp. 109–136. [Google Scholar] [CrossRef]
  56. Galligan, L.; McDonald, C.; Hobohm, C. Conceptualising, implementing and evaluating the use of digital technologies to enhance mathematical understanding: Reflections on an innovation-development cycle. In Educational Developments, Practices and Effectiveness; Lock, J., Redmond, P., Danaher, P.A., Eds.; Palgrave Macmillan: London, UK, 2015; pp. 137–160. [Google Scholar]
  57. Kanwal, S. Exploring Affordances of an Online Environment: A Case-Study of Electronics Engineering Undergraduate Students’ Activity in Mathematics. Int. J. Res. Undergrad. Math. Educ. 2020, 6, 42–64. [Google Scholar] [CrossRef]
  58. Whitton, N.; Moseley, A. Deconstructing Engagement: Rethinking Involvement in Learning. Simul. Gaming 2014, 45, 433–449. [Google Scholar] [CrossRef]
  59. Henrie, C.R.; Bodily, R.; Manwaring, K.C.; Graham, C.R. Exploring intensive longitudinal measures of student engagement in blended learning. Int. Rev. Res. Open Distrib. Learn. 2015, 16, 131–155. [Google Scholar] [CrossRef]
  60. Fredricks, J.A.; McCloskey, W. The measurement of student engagement: A comparative analysis of various methods and student self-report instruments. In Handbook of Research on Student Engagement; Christenson, S.L., Reschly, A.L., Wylie, C., Eds.; Springer: Boston, MA, USA, 2012; pp. 763–782. [Google Scholar] [CrossRef]
  61. Bulger, M.E.; Mayer, R.E.; Almeroth, K.C.; Blau, S.D. Measuring Learner Engagement in Computer-Equipped College Classrooms. J. Educ. Mult. Hyper. 2008, 17, 129–143. [Google Scholar]
  62. Beatson, N.; Gabriel, C.-A.; Howell, A.; Scott, S.; Van Der Meer, J.; Wood, L.C. Just opt in: How choosing to engage with technology impacts business students’ academic performance. J. Account. Educ. 2019, 50, 100641. [Google Scholar] [CrossRef]
  63. Cruz-Benito, J.; Therón, R.; García-Peñalvo, F.J.; Pizarro Lucas, E. Discovering usage behaviors and engagement in an Educational Virtual World. Comput. Hum. Behav. 2015, 47, 18–25. [Google Scholar] [CrossRef]
  64. Al-Sakkaf, A.; Omar, M.; Ahmad, M. A systematic literature review of student engagement in software visualization: A theoretical perspective. Comput. Sci. Educ. 2019, 29, 283–309. [Google Scholar] [CrossRef]
  65. McMullen, S.; Oates, G.; Thomas, M.O.J. An integrated technology course at university: Orchestration and mediation. In Proceedings of the 39th Conference of the International Group for the Psychology of Mathematics Education, Hobart, Australia, 13–18 July 2015; Volume 1, pp. 249–257. [Google Scholar]
  66. Pardos, Z.A.; Baker, R.S.J.; San Pedro, M.; Gowda, S.M.; Gowda, S.M. Affective States and State Tests: Investigating How Affect and Engagement during the School Year Predict End-of-Year Learning Outcomes. J. Learn. Anal. 2014, 1, 107–128. [Google Scholar] [CrossRef]
  67. Attard, C.; Holmes, K. “It gives you that sense of hope”: An exploration of technology use to mediate student engagement with mathematics. Heliyon 2020, 6, e02945. [Google Scholar] [CrossRef]
  68. Lai, C.; Wang, Q.; Lei, J. What factors predict undergraduate students’ use of technology for learning? A case from Hong Kong. Comput. Educ. 2012, 59, 569–579. [Google Scholar] [CrossRef]
  69. Anastasakis, M.; Robinson, C.L.; Lerman, S. Links between students’ goals and their choice of educational resources in undergraduate mathematics. Teach. Math. Applic. 2017, 36, 67–80. [Google Scholar] [CrossRef]
  70. Conole, G.; Alevizou, P. A Literature Review of the Use of Web 2.0 Tools in Higher Education; Higher Education Academy: York, UK, 2010. [Google Scholar]
  71. Englund, C.; Olofsson, A.D.; Price, L. Teaching with technology in higher education: Understanding conceptual change and development in practice. High. Educ. Res. Dev. 2017, 36, 73–87. [Google Scholar] [CrossRef]
  72. Price, L.; Kirkwood, A. Enhancing Professional Learning and Teaching through Technology: A Synthesis of Evidence-Based Practice among Teachers in Higher Education; Higher Education Academy: York, UK, 2011; Available online: http://oro.open.ac.uk/30686/ (accessed on 15 November 2022).
  73. Selwyn, N. Editorial: In praise of pessimism-the need for negativity in educational technology. Br. J. Educ. Technol. 2011, 42, 713–718. [Google Scholar] [CrossRef]
  74. Bray, A.; Tangney, B. Technology usage in mathematics education research—A systematic review of recent trends. Comput. Educ. 2017, 114, 255–273. [Google Scholar] [CrossRef]
  75. Oates, G. Technology in Mathematics Education: A Stocktake & Crystal-Ball Gazing. In Proceedings of the Asian Technology Conference in Mathematics (ATCM) 2016: Teaching and Learning Mathematics, Science and Engineering through Technology, Pattaya, Thailand, 14–18 December 2016; pp. 1–17. Available online: https://atcm.mathandtech.org/EP2016/invited.html (accessed on 15 November 2022).
  76. Conole, G.; de Laat, M.; Dillon, T.; Darby, J. “Disruptive technologies”, “pedagogical innovation”: What’s new? Findings from an in-depth study of students’ use and perception of technology. Comput. Educ. 2008, 50, 511–524. [Google Scholar] [CrossRef]
  77. Oliver, M. Technological determinism in educational technology research: Some alternative ways of thinking about the relationship between learning and technology. J. Comput. Assist. Learn. 2011, 27, 373–384. [Google Scholar] [CrossRef]
  78. Selwyn, N. Sharpening the ‘ed-tech imagination’: Improving academic research in education and technology. In Proceedings of the Critical Perspectives of Learning with New Media, Melbourne, Australia, 23 March 2012; pp. 6–16. [Google Scholar]
  79. Drijvers, P. Head in the clouds, feet on the ground—A realistic view on using digital tools in mathematics education. In Vielfältige Zugänge zum Mathematikunterricht; Büchter, A., Glade, M., Herold-Blasius, R., Klinger, M., Schacht, F., Scherer, P., Eds.; Springer Spektrum: Wiesbaden, Germany, 2019; pp. 163–176. [Google Scholar] [CrossRef]
  80. Jarvis, D.; Buteau, C.; Doran, C.; Novoseltsev, A. Innovative CAS Technology Use in University Mathematics Teaching and Assessment: Findings from a Case Study in Alberta, Canada. J. Comput. Math. Sci. Teach. 2018, 37, 309–354. [Google Scholar]
  81. Drijvers, P. Empirical Evidence for Benefit? Reviewing Quantitative Research on the Use of Digital Tools in Mathematics Education. In Uses of Technology in Primary and Secondary Mathematics Education; Ball, L., Drijvers, P., Ladel, S., Siller, H.-S., Tabach, M., Vale, C., Eds.; Springer: Cham, Switzerland, 2018; pp. 161–175. [Google Scholar] [CrossRef]
  82. Ronau, R.N.; Rakes, C.R.; Bush, S.B.; Driskell, S.O.; Niess, M.L.; Pugalee, D.K. A Survey of Mathematics Education Technology Dissertation Scope and Quality: 1968–2009. Am. Educ. Res. J. 2014, 51, 974–1006. [Google Scholar] [CrossRef]
  83. Kirkwood, A.; Price, L. Technology-enhanced learning and teaching in higher education: What is “enhanced” and how do we know? A critical literature review. Learn. Media Technol. 2014, 39, 6–36. [Google Scholar] [CrossRef]
  84. Higher Education Funding Council for England. Enhancing Learning and Teaching through the Use of Technology. 2009. Available online: https://dera.ioe.ac.uk/140/1/09_12.pdf (accessed on 15 November 2022).
  85. National Forum for the Enhancement of Teaching and Learning in Higher Education. National Survey on the Use of Technology to Enhance Teaching and Learning in Higher Education 2014 National Forum. 2015. Available online: https://www.teachingandlearning.ie/publication/national-survey-on-the-use-of-technology-to-enhance-teaching-and-learning-in-higher-education-2014/ (accessed on 15 November 2022).
  86. Dimitriadis, Y.; Goodyear, P. Forward-oriented design for learning: Illustrating the approach. Res. Learn. Technol. 2013, 21, 1–13. [Google Scholar] [CrossRef]
  87. Conole, G. Designing for Learning in an Open World; Springer: New York, NY, USA, 2013. [Google Scholar]
  88. Goodyear, P. Teaching as design. HERDSA Rev. High. Educ. 2015, 2, 27–50. Available online: http://www.herdsa.org.au/wp-content/uploads/HERDSARHE2015v02p27.pdf (accessed on 15 November 2022).
  89. Laurillard, D. Teaching as a Design Science: Building Pedagogical Patterns for Learning and Technology; Routledge: London, UK, 2012. [Google Scholar]
  90. Allen, M.; Sites, R. Leaving ADDIE for SAM: An Agile Model for Developing the Best Learning, 1st ed.; American Society for Training and Development: Alexandria, VA, USA, 2012. [Google Scholar]
  91. Branch, R.M.; Kopcha, T.J. Instructional design models. In Handbook of Research on Educational Communications and Technology, 4th ed.; Spector, J., Merrill, M., Elen, J., Bishop, M., Eds.; Springer: New York, NY, USA, 2014; pp. 77–87. [Google Scholar] [CrossRef]
  92. Dousay, T. Instructional Design Models. In Foundations of Learning and Instructional Design Technology, 1st ed.; West, R.E., Ed.; Pressbooks: Montreal, QC, Canada, 2018; Available online: https://lidtfoundations.pressbooks.com (accessed on 15 November 2022).
  93. Monaghan, J.; Trouche, L.; Borwein, J.M. Tools and Mathematics Instruments for Learning; Springer: Cham, Switzerland, 2016. [Google Scholar]
  94. Trenholm, S.; Alcock, L.; Robinson, C.L. An investigation of assessment and feedback practices in fully asynchronous online undergraduate mathematics courses. Int. J. Math. Educ. Sci. Technol. 2015, 46, 1197–1221. [Google Scholar] [CrossRef]
  95. Trgalová, J.; Clark-Wilson, A.; Weigand, H.-G. Technology and resources in mathematics education. In Developing Research in Mathematics Education, 1st ed.; Dreyfus, T., Artigue, M., Potari, D., Prediger, S., Ruthven, K., Eds.; Routledge: London, UK, 2018; pp. 142–161. Available online: https://www.taylorfrancis.com/chapters/technology-resources-mathematics-education-jana-trgalová-alison-clark-wilson-hans-georg-weigand/e/10.4324/9781315113562-12 (accessed on 29 November 2022).
  96. Kurz, T.L.; Middleton, J.A.; Yanik, H.B. A Taxonomy of Software for Mathematics Instruction. Contemp. Iss. Technol. Teach. Educ. 2005, 5, 123–137. [Google Scholar]
  97. Jupri, A.; Drijvers, P.; Van den Heuvel-Panhuizen, M. An instrumentation theory view on students’ use of an Applet for Algebraic substitution. Int. J. Technol. Math. Educ. 2016, 23, 63–80. [Google Scholar] [CrossRef]
  98. Ratnayake, I.; Oates, G.; Thomas, M.O.J. Supporting Teachers Developing Mathematical Tasks with Digital Technology. In Proceedings of the 39th annual conference of the Mathematics Education Research Group of Australasia, Adelaide, Australia, 3–7 July 2016; pp. 543–551. [Google Scholar]
  99. Trouche, L.; Drijvers, P. Webbing and orchestration. Two interrelated views on digital tools in mathematics education. Teach. Math. Applic. 2014, 33, 193–209. [Google Scholar] [CrossRef]
  100. Loch, B.; Gill, O.; Croft, T. Complementing mathematics support with online MathsCasts. ANZIAM J. 2012, 53, C561–C575. [Google Scholar] [CrossRef]
  101. Robinson, M.; Loch, B.; Croft, T. Student Perceptions of Screencast Feedback on Mathematics Assessment. Int. J. Res. Undergrad. Math. Educ. 2015, 1, 363–385. [Google Scholar] [CrossRef]
  102. Triantafyllou, E.; Timcenko, O. Student perceptions on learning with online resources in a flipped mathematics classroom. In Proceedings of the Ninth Congress of the European Society for Research in Mathematics Education, Prague, Czech Republic, 4–8 February 2015; pp. 2573–2579. [Google Scholar]
  103. King, S.O.; Robinson, C.L. Formative Teaching: A Conversational Framework for Evaluating the Impact of Response Technology on Student Experience, Engagement and Achievement. In Proceedings of the 2009 39th IEEE Frontiers in Education Conference, San Antonio, TX, USA, 18–21 October 2009; pp. 1–6. [Google Scholar] [CrossRef]
  104. Lee, J. An Exploratory Study of Effective Online Learning: Assessing Satisfaction Levels of Graduate Students of Mathematics Education Associated with Human and Design Factors of an Online Course. Int. Rev. Res. Open Distrib. Learn. 2014, 15, 111–132. [Google Scholar] [CrossRef]
  105. Trenholm, S.; Alcock, L.; Robinson, C.L. Mathematics lecturing in the digital age. Int. J. Math. Educ. Sci. Technol. 2012, 43, 703–716. [Google Scholar] [CrossRef]
  106. Gibson, J. The theory of affordances. In Perceiving, Acting, and Knowing; Shaw, R., Bransford, J., Eds.; Laurence Erlbaum: Hillsdale, NJ, USA, 1977. [Google Scholar]
  107. Norman, D.A. The Design of Everyday Things; Basic Books: New York, NY, USA, 1988. [Google Scholar]
  108. Conole, G.; Dyke, M. Understanding and using technological affordances: A response to Boyle and Cook. Res. Learn. Technol. 2004, 12, 301–308. [Google Scholar] [CrossRef]
  109. Oliver, M. Learning technology: Theorising the tools we study. Br. J. Educ. Technol. 2013, 44, 31–43. [Google Scholar] [CrossRef]
  110. Ball, L.; Drijvers, P.; Ladel, S.; Siller, H.-S.; Tabach, M.; Vale, C. (Eds.) Uses of Technology in Primary and Secondary Mathematics Education; Springer: Cham, Switzerland, 2018. [Google Scholar] [CrossRef]
  111. Borwein, J.M. The experimental mathematician: The pleasure of discovery and the role of proof. Int. J. Comput. Math. Learn. 2005, 10, 75–108. [Google Scholar] [CrossRef]
  112. Drijvers, P.; Ball, L.; Barzel, B.; Heid, M.K.; Cao, Y.; Maschietto, M. Uses of Technology in Lower Secondary Mathematics Education; Springer: Cham, Switzerland, 2016. [Google Scholar] [CrossRef]
  113. Oates, G. Integrated Technology in Undergraduate Mathematics: Issues of Assessment. Electron. J. Math. Technol. 2010, 4, 162–174. [Google Scholar]
  114. Artigue, M. Learning mathematics in a CAS environment: The genesis of a reflection about instrumentation and the dialectics between technical and conceptual work. Int. J. Comput. Math. Learn. 2002, 7, 245–274. [Google Scholar] [CrossRef]
  115. Loch, B.; Jordan, C.R.; Lowe, T.W.; Mestel, B.D. Do screencasts help to revise prerequisite mathematics? An investigation of student performance and perception. Int. J. Math. Educ. Sci. Technol. 2014, 45, 256–268. [Google Scholar] [CrossRef]
  116. Rakes, C.R.; Valentine, J.C.; Mcgatha, M.B.; Ronau, R.N. Methods of Instructional Improvement in Algebra: A Systematic Review and Meta-Analysis. Rev. Educ. Res. 2010, 80, 372–400. [Google Scholar] [CrossRef]
  117. Takači, D.; Stankov, G.; Milanovic, I. Efficiency of learning environment using GeoGebra when calculus contents are learned in collaborative groups. Comput. Educ. 2015, 82, 421–431. [Google Scholar] [CrossRef]
  118. Jaworski, B.; Matthews, J. Developing teaching of mathematics to first year engineering students. Teach. Math. Applic. 2011, 30, 178–185. [Google Scholar] [CrossRef]
  119. Howard, E.; Meehan, M.; Parnell, A. Live lectures or online videos: Students’ resource choices in a first-year university mathematics module. Int. J. Math. Educ. Sci. Technol. 2018, 49, 530–553. [Google Scholar] [CrossRef]
  120. McKnight, C.; Magid, A.; Murphy, T.J.; McKnight, M. Mathematics Education Research: A Guide for the Research Mathematician; American Mathematical Society: Providence, RI, USA, 2000. [Google Scholar]
  121. King, S.O.; Robinson, C.L. ‘Pretty Lights’ and Maths! Increasing student engagement and enhancing learning through the use of electronic voting systems. Comput. Educ. 2009, 53, 189–199. [Google Scholar] [CrossRef]
  122. Thiel, T.; Peterman, S.; Brown, M. Addressing the Crisis in College Mathematics: Designing Courses for Student Success. Change Mag. High. Learn. 2008, 40, 44–49. [Google Scholar] [CrossRef]
  123. Brown, B.; Jacobsen, M.; Lambert, D. Learning technologies in higher education. In Proceedings of the IDEAS: Rising to Challenge Conference, University of Calgary, Calgary, AB, Canada, 9–10 May 2014; pp. 25–43. [Google Scholar]
  124. Lai, J.; Bower, M. How is the use of technology in education evaluated? A systematic review. Comput. Educ. 2019, 133, 27–42. [Google Scholar] [CrossRef]
  125. Puentedura, R.R. Transformation, Technology, and Education. Ruben R. Puentedura’s Blog. 2006. Available online: http://hippasus.com/resources/tte/ (accessed on 15 November 2022).
  126. FaSMEd: FaSMEd Framework. 2020. Available online: https://microsites.ncl.ac.uk/fasmedtoolkit/theory-for-fa/the-fasmed-framework/ (accessed on 15 November 2022).
  127. Buchanan, T.; Sainter, P.; Saunders, G. Factors affecting faculty use of learning technologies: Implications for models of technology adoption. J. Comput. High. Educ. 2013, 25, 1–11. [Google Scholar] [CrossRef]
  128. Nikou, S.A.; Economides, A.A. Mobile-based assessment: Investigating the factors that influence behavioral intention to use. Comput. Educ. 2017, 109, 56–73. [Google Scholar] [CrossRef]
  129. Zogheib, B.; Rabaa’i, A.; Zogheib, S.; Elsaheli, A. University student perceptions of technology use in mathematics learning. J. Inf. Tech. Educ. Res. 2015, 14, 417–438. [Google Scholar] [CrossRef]
  130. Mishra, P.; Koehler, M.J. Technological pedagogical content knowledge: A framework for teacher knowledge. Teach. Coll. Rec. 2006, 108, 1017–1054. [Google Scholar] [CrossRef]
  131. 3E Education: 3E Framework. Available online: https://3eeducation.org/3e-framework/ (accessed on 15 November 2022).
  132. Aparicio, M.; Bacao, F.; Oliveira, T. An e-Learning Theoretical Framework. Educ. Technol. Soc. 2016, 19, 292–307. [Google Scholar]
  133. Laurillard, D. Rethinking University Teaching: A Conversational Framework for the Effective Use of Learning Technologies, 2nd ed.; Routledge: London, UK, 2013. [Google Scholar]
  134. Venkatesh, V.; Thong, J.Y.L.; Xu, X. Unified theory of acceptance and use of technology: A synthesis and the road ahead. J. Assoc. Inf. Syst. 2016, 17, 328–376. [Google Scholar] [CrossRef]
  135. Kieran, C.; Drijvers, P. Digital Technology and Mathematics Education: Core Ideas and Key Dimensions of Michèle Artigue’s Theoretical Work on Digital Tools and Its Impact on Mathematics Education Research. In The Didactics of Mathematics: Approaches and Issues; Hodgson, B.R., Kuzniak, A., Lagrange, J.-B., Eds.; Springer: Cham, Switzerland, 2016; pp. 123–142. [Google Scholar] [CrossRef]
  136. Lopes, J.B.; Costa, C. Digital Resources in Science, Mathematics and Technology Teaching—How to convert Them into Tools to Learn. In Technology and Innovation in Learning, Teaching and Education. TECH-EDU 2018. Communications in Computer and Information Science; Tsitouridou, M., Diniz, J.A., Mikropoulos, T., Eds.; Springer Nature: Cham, Switzerland, 2019; Volume 993, pp. 243–255. [Google Scholar] [CrossRef]
  137. National Research Council. Adding it Up: Helping Children Learn Mathematics; Kilpatrick, J., Swafford, J., Findell, B., Eds.; The National Academies Press: Washington, DC, USA, 2001. [Google Scholar] [CrossRef]
  138. Handal, B.; El-Khoury, J.; Campbell, C.; Cavanagh, M. A framework for categorising mobile applications in mathematics education. In Proceedings of the Australian Conference on Science and Mathematics Education, Australian National University, Canberra, Australia, 19–21 September 2013; pp. 142–147. [Google Scholar]
  139. Bower, M. Deriving a typology of Web 2.0 learning technologies. Br. J. Educ. Technol. 2016, 47, 763–777. [Google Scholar] [CrossRef]
  140. Abderrahim, E.M.; Mohamed, E.; Azeddine, N. An evaluation model of digital educational resources. Int. J. Emerg. Technol. Learn. 2013, 8, 29–35. [Google Scholar] [CrossRef]
  141. Goodwin, K. Use of Tablet Technology in the Classroom. In New South Wales Curriculum and Learning Innovation Centre Report. 2012. Available online: https://cpb-ap-se2.wpmucdn.com/global2.vic.edu.au/dist/1/42368/files/2014/04/iPad_Evaluation_Sydney_Region_exec_sum-1pjdj70.pdf (accessed on 15 November 2022).
  142. Pechenkina, E. Developing a typology of mobile apps in higher education: A national case-study. Australas. J. Educ. Technol. 2017, 33, 134–146. [Google Scholar] [CrossRef]
  143. Hoyles, C.; Noss, R. The technological mediation of mathematics and its learning. Hum. Dev. 2009, 52, 129–147. [Google Scholar] [CrossRef]
  144. Intertwingled: Peter Morville’s User Experience Honeycomb. 2016. Available online: https://intertwingled.org/user-experience-honeycomb/ (accessed on 15 November 2022).
  145. CAST: The UDL Guidelines. 2018. Available online: http://udlguidelines.cast.org/ (accessed on 15 November 2022).
  146. Baldwin, S.J.; Ching, Y.H. An online course design checklist: Development and users’ perceptions. J. Comput. High. Educ. 2019, 31, 156–172. [Google Scholar] [CrossRef]
  147. Atkinson, S. Embodied and Embedded Theory in Practice: The Student-Owned Learning-Engagement (SOLE) Model. Int. Rev. Res. Open Distrib. Learn. 2011, 12, 1–18. [Google Scholar] [CrossRef]
  148. Kirkpatrick, D.; Kirkpatrick, J. Evaluating Training Programs: The Four Levels, 3rd ed.; Berrett-Koehler Publishers: Oakland, CA, USA, 2006. [Google Scholar]
  149. Pickering, J.D.; Joynes, V.C.T. A holistic model for evaluating the impact of individual technology-enhanced learning resources. Med. Teach. 2016, 38, 1242–1247. [Google Scholar] [CrossRef]
  150. Rodríguez, P.; Nussbaum, M.; Dombrovskaia, L. Evolutionary development: A model for the design, implementation, and evaluation of ICT for education programmes. J. Comput. Assist. Learn. 2012, 28, 81–98. [Google Scholar] [CrossRef]
  151. Gueudet, G.; Pepin, B. Didactic Contract at the Beginning of University: A Focus on Resources and their Use. Int. J. Res. Undergrad. Math. Educ. 2018, 4, 56–73. [Google Scholar] [CrossRef]
  152. Wiliam, D.; Thompson, M. Integrating Assessment with Learning: What Will It Take to Make It Work. In The Future of Assessment: Shaping Teaching and Learning, 1st ed.; Dwyer, C.A., Ed.; Routledge: New York, NY, USA, 2008; pp. 53–82. [Google Scholar] [CrossRef]
  153. Adnan, N.H.; Ritzhaupt, A. Software Engineering Design Principles Applied to Instructional Design: What can we Learn from our Sister Discipline? TechTrends 2018, 62, 77–94. [Google Scholar] [CrossRef]
  154. Svihla, V. Design Thinking and Agile Design. In Foundations of Learning and Instructional Design Technology, 1st ed.; West, R.E., Ed.; Pressbooks: Montreal, QC, Canada, 2018; Available online: https://lidtfoundations.pressbooks.com (accessed on 15 November 2022).
  155. Puentedura, R.R. SAMR and TPCK: Intro to Advanced Practice. Ruben R. Puentedura’s Blog. 2010. Available online: http://hippasus.com/resources/sweden2010/SAMR_TPCK_IntroToAdvancedPractice.pdf (accessed on 15 November 2022).
  156. Drijvers, P.; Monaghan, J.; Thomas, M.O.J.; Trouche, L. Use of Technology in Secondary Mathematics. In Final Report for the International Baccalaureate. 2014. Available online: https://hal.archives-ouvertes.fr/hal-01546747 (accessed on 15 November 2022).
  157. Handal, B.; Campbell, C.; Cavanagh, M.; Petocz, P.; Kelly, N. Integrating Technology, Pedagogy and Content in Mathematics Education. J. Comput. Math. Sci. Teach. 2012, 31, 387–413. [Google Scholar]
  158. Hilton, J.T. A Case Study of the Application of SAMR and TPACK for Reflection on Technology Integration into Two Social Studies Classrooms. Soc. Stud. 2016, 107, 68–73. [Google Scholar] [CrossRef]
  159. Kimmons, R.; Hall, C. How Useful are our Models? Pre-Service and Practicing Teacher Evaluations of Technology Integration Models. TechTrends 2018, 62, 29–36. [Google Scholar] [CrossRef]
  160. Reeves, T.C.; Benson, L.; Elliott, D.; Grant, M.; Holschuh, D.; Kim, B.; Kim, H.; Lauber, E.; Loh, S. Usability and Instructional Design Heuristics for E-Learning Evaluation. In Proceedings of the ED-MEDIA 2002 World Conference on Educational Multimedia, Hypermedia & Telecommunications, Denver, CO, USA, 24–29 June 2002. [Google Scholar]
  161. Squires, D.; Preece, J. Predicting quality in educational software: Evaluating for learning, usability and the synergy between them. Interact. Comput. 1999, 11, 467–483. [Google Scholar] [CrossRef]
  162. Slade, C.; Downer, T. Students’ conceptual understanding and attitudes towards technology and user experience before and after use of an ePortfolio. J. Comput. High. Educ. 2020, 32, 529–552. [Google Scholar] [CrossRef]
  163. Molich, R.; Nielsen, J. Improving a human-computer dialogue. Commun. ACM 1990, 33, 338–348. [Google Scholar] [CrossRef]
  164. NN/g Nielsen Norman Group: Jakob Nielsen’s 10 Heuristics for User Interface Design. 2020. Available online: http://www.nngroup.com/articles/ten-usability-heuristics/ (accessed on 15 November 2022).
  165. JISC Guide: Usability and User Experience. 2014. Available online: https://www.jisc.ac.uk/guides/usability-and-user-experience (accessed on 15 November 2022).
  166. Meyer, A.; Rose, D.H.; Gordon, D. Universal Design for Learning: Theory and Practice; CAST Professional Publishing: Wakefield, MA, USA, 2014. [Google Scholar]
  167. Rose, D.H.; Meyer, A. Teaching Every Student in the Digital Age: Universal Design for Learning; Association for Supervision and Curriculum Development (ASCD): Alexandria, VA, USA, 2002. [Google Scholar]
  168. Kumar, K.L.; Wideman, M. Accessible by design: Applying UDL principles in a first year undergraduate course. Can. J. High. Educ. 2014, 44, 125–147. [Google Scholar] [CrossRef]
  169. Drijvers, P.; Tacoma, S.; Besamusca, A.; Doorman, M.; Boon, P. Digital resources inviting changes in mid-adopting teachers’ practices and orchestrations. ZDM—Math. Educ. 2013, 45, 987–1001. [Google Scholar] [CrossRef]
Figure 1. Conceptual mapping of the three research areas (RAs) showing the number of articles included from each area and their overlap.
Figure 2. The refined conceptual framework of student engagement, redrawn from the original in Kahu and Nelson [4] (p. 64).
Figure 3. Didactical Tetrahedron redrawn from the original in Trgalová, Clark-Wilson, and Weigand [95] (p. 2).
Figure 4. SAMR model for technology integration in education, redrawn from the original in Puentedura [155] (slide 3).
Figure 5. Morville’s Honeycomb redrawn from the original available at https://www.jisc.ac.uk/guides/usability-and-user-experience (accessed on 29 November 2022).
Table 1. Inclusion and exclusion criteria.
Literature to include: 2001 onwards; peer reviewed; higher/post-primary education; published in English; full text available in library or online.
Literature to exclude: reports; grey literature; primary-school education.
Databases to include: Education Research Complete, British Education Index, ERIC, and Academic Search Complete (all available via EBSCO); Web of Science; Scopus.
Table 2. Literature search terms and number of articles for each of the three interconnecting research areas.
RA1 (45 articles after final scan): ‘student engagement’, ‘technology’, ‘technology use’, ‘digital tools’, ‘higher education’, ‘undergraduate education’, ‘mathematics’.
RA2 (61 articles after final scan): ‘mathematics educational technology’ or ‘mathematics technology tools’, ‘evaluations’, and ‘investigations’, and ‘undergraduate’ or ‘higher education’.
RA3 (88 articles after final scan): RA2 search terms plus ‘frameworks’, ‘models’, ‘categorisations’, ‘characterisations’, ‘typologies’, and ‘classifications’.
Duplicated papers: 40.
Seminal articles added (prior to the 2000–2020 year range): 6.
Table 3. Factors that influence engagement with educational technology in mathematics.
Trenholm et al. [51]. Engagement dimension and indicator measured: cognitive engagement (R-SPQ-2F scale measuring approach to learning). Pedagogical use of technology: optional use of live versus recorded lectures. Factor and/or impact: students used videos because of the self-paced nature of their availability; students with high use of videos were more inclined to take a surface approach to learning than others.
Steen-Utheim and Foldnes [50]. Engagement dimension and indicator measured: affective engagement (Kahu’s model of student engagement [8]). Pedagogical use of technology: flipped classroom approach in a first-year undergraduate mathematics course. Factor and/or impact: peer and lecturer relationships, and possibly class size, influenced positive engagement outcomes.
Kanwal [57]. Engagement dimension and indicator measured: behavioural engagement (Activity Theory). Pedagogical use of technology: automated system to support the solving of mathematical tasks, alongside a variety of technology resources including GeoGebra, MyMathLab, YouTube, and online calculators. Factor and/or impact: exam preparation encouraged engagement; using powerful automated calculators diverted students from engaging with the required mathematical operations.
Thomas et al. [55]. Engagement dimension and indicator measured: cognitive engagement (instrumental orchestration). Pedagogical use of technology: variety of innovative technologies and tasks, including Desmos, GeoGebra, and KakaoTalk. Factor and/or impact: engagement was ensured through sustained intensive use of technologies, teacher privileging of technology, ease of use, the ability to visualise mathematics, and integration in assessment.
Anastasakis et al. [69]. Engagement dimension and indicator measured: behavioural engagement (Activity Theory). Pedagogical use of technology: self-selected resources (both digital and non-digital) in second-year engineering mathematics. Factor and/or impact: a high mark in exams was the students’ goal when selecting and engaging with resources.
Table 4. Benefits of using technology in higher education. Context abbreviations: HE M = higher education mathematics; 1Y UM = first-year undergraduate mathematics; 2Y EM = second-year engineering mathematics.
Pragmatic benefits:
Calculations and graphing: Jarvis et al. [80] (Sage; HE M); Varsavsky (as cited in [14]) (computer algebra system (CAS); 1Y UM); Thomas et al. [55] (multiple technologies; 1Y UM).
Epistemic benefits:
Problem solving: Loch et al. [115] (screencasts; 1Y UM); Takači et al. [117] (computer-supported collaborative learning (CSCL); 1Y UM).
Mathematical understanding: Galligan et al. [56] (tablets; 1Y UM); Takači et al. [117] (CSCL; 1Y UM); Triantafyllou et al. [102] (multiple technologies; 1Y UM); Aventi (as cited in [14]) (GeoGebra; Year 9 maths, Australasia); Thomas et al. [55] (multiple technologies; 1Y UM); Buteau et al. [71] (CAS; HE M).
Rote learning (negative): Trenholm et al. [105] (e-lectures; HE M).
Visualisation: Jarvis et al. [80] (Sage; HE M); Lavicza [28] (CAS; HE M); Takači et al. [117] (GeoGebra; 1Y UM); Jaworski and Matthews [118] (GeoGebra; 1Y UM); Thomas et al. [55] (multiple technologies; 1Y UM).
Feedback: Trenholm et al. [94] (fully asynchronous online (FAO); HE M); King and Robinson [103] (audience response systems (ARS); HE M); Lee [104] (online quizzes; HE M).
Real-world problems: Jarvis et al. [80] (Sage; HE M); Lavicza [28] (CAS; HE M).
Conceptual and procedural understanding: Rakes et al. [116] (various strategies that included technology; mathematics education).
Other benefits:
Engagement (motivation): Loch et al. [115] (screencasts; 1Y UM); Galligan et al. [56] (tablets; 1Y UM); King and Robinson [103] (ARS; 2Y EM); Thomas et al. [55] (multiple technologies; 1Y UM); Buteau et al. [71] (CAS technologies; HE M).
Self-regulated, self-paced, and self-directed learning: Loch et al. [115] (screencasts; 1Y UM); Trenholm et al. [105] (recorded video lectures; HE M); Jarvis et al. [80] (Sage; HE M); Triantafyllou et al. [102] (Khan Academy and other online resources; 1Y UM); Buteau et al. [71] (CAS; HE M); Howard et al. [119] (recorded video lectures; 1Y UM); Kanwal [57] (online learning environment; 1Y UM).
Satisfaction: Trenholm et al. [105] (recorded video lectures; HE M); King and Robinson [103] (ARS; 2Y EM); Triantafyllou et al. [102] (Khan Academy and other online resources; 1Y UM); Lee [104] (online learning technologies; graduate students).
Classroom management: King and Robinson [103] (ARS; 2Y EM).
Assessment: Oates [113] (CAS; HE M).
Approaches to learning: Trenholm et al. [51] (recorded video lectures; 1Y UM); Howard et al. [119] (recorded video lectures; 1Y UM).
Table 5. Measures taken in the studies.
Student and/or teacher views of resources through the use of surveys, scales, or questionnaires: Jaworski and Matthews [118], King and Robinson [121], Lee [104], Lavicza [28], Loch et al. [115], Oates [113], Thiel et al. [122], Thomas et al. [55], Trenholm et al. [51,94], Triantafyllou et al. [102], Howard et al. [119].
Test, exam, or quiz results for improved students’ mathematical understanding: Jaworski and Matthews [118], King and Robinson [121], Loch et al. [115], Takači et al. [117], Howard et al. [119].
Recorded usage of resources: Loch et al. [115], Trenholm et al. [51], Howard et al. [119].
Attendance data: King and Robinson [121], Howard et al. [119].
Course artefacts and/or curriculum materials: Jarvis et al. [80], Lavicza [28], Thomas et al. [55].
Student and/or teacher interviews: Jarvis et al. [80], Jaworski and Matthews [118], King and Robinson [121], Lavicza [28].
Teacher practices, reflections, and/or blogs: Galligan et al. [56], Jaworski and Matthews [118], King and Robinson [121].
Class observations: Jaworski and Matthews [118], King and Robinson [121], Lavicza [28], Thomas et al. [55].
Task analysis: Takači et al. [117], Thomas et al. [55].
Scale to measure approach to learning (R-SPQ-2F): Trenholm et al. [51,94].
Case study: Drijvers [11].
Table 6. Theoretical frameworks used in the studies.
Community of inquiry (CoI) and documentational genesis: Jaworski and Matthews [118].
Computer-supported collaborative learning (CSCL): Takači et al. [117].
Laurillard conversational framework: King and Robinson [103].
Conceptual model of affective and cognitive effects of human and design factors: Piccoli et al. (as cited in Lee [104]).
Instrumental orchestration: Thomas et al. [55].
Taxonomy for integrated technology (the author’s own version from his Ph.D. thesis): Oates [113].
Table 7. Frameworks used in the integration and evaluation of technology.
Technology integration:
Substitution Augmentation Modification Redefinition (SAMR): model describing four levels of technology integration in tasks. Puentedura [125]; http://hippasus.com/resources/tte/
Formative Assessment in Science and Mathematics Education (FaSMEd) *: characterisation of aspects of the classroom integration of formative assessment technology tools. FaSMEd [126]; https://microsites.ncl.ac.uk/fasmedtoolkit/theory-for-fa/the-fasmed-framework/
Technology Acceptance Model (TAM): theorises the usage behaviour of technology. Buchanan et al. [127]; Nikou and Economides [128]; Zogheib et al. [129]; https://en.wikipedia.org/wiki/Technology_acceptance_model
Technological pedagogical content knowledge (TPACK) ***: framework that considers the intersection of teachers’ knowledge of technology, pedagogy, and content as key to successful technology integration. Mishra and Koehler [130].
Classification system * (Bray and Tangney **): classification system with four components: technology, learning theory, SAMR level, and purpose. Bray and Tangney [74].
3E (Enhance, Extend, Empower) Framework: guidance and examples on exploiting technology to enhance, extend, and empower teaching and learning. 3E Education [131]; https://3eeducation.org/3e-framework/
eLearning theoretical framework: eLearning systems theory framework that draws out the roles of people, technology, and services in learning provision. Aparicio et al. [132].
Laurillard Conversational Framework: describes the interactions and types of activities that occur between teachers and students for effective learning. King and Robinson [103]; Laurillard [133].
Unified theory of acceptance and use of technology (UTAUT): alternative to TAM; four key factors in accepting technology: performance expectancy, effort expectancy, social influence, and facilitating conditions. Venkatesh et al. [134].
4C (Connection, Communication, Collaboration, Creating) Framework: framework to organise technology use in higher education. Brown et al. [123].
Theoretical frameworks:
Instrumental orchestration *: converting digital tools into artefacts, connecting the technical skills and conceptual understanding required. Artigue [114]; Kieran and Drijvers [135]; Lopes and Costa [136]; Thomas et al. [55].
Didactic Tetrahedron *: examines digital tool use as interactions between (1) tools and knowledge and (2) tools, knowledge, and the learner, and as (3) the integration of tools in the curriculum or classroom. Trgalová et al. [95].
Mathematical Proficiency *: five strands of mathematical proficiency required to learn mathematics successfully. National Research Council [137].
Pedagogical Opportunities *: ten pedagogical opportunities grouped into three levels: the task that has been set, classroom interaction, and the mathematics topic. Pierce and Stacey [15].
Didactical Functions *: three didactical functions supported by technology: (1) do, (2) learn–practise skills, and (3) learn–concepts. Drijvers [11].
Technology affordances and types:
Mobile app categorisation * (Handal **): categorises the use of mobile apps in schools as productive, explorative, or instructive, based on instructional roles and media richness; builds on Goodwin’s classification (see below). Handal et al. [138].
Web 2.0 typology (Bower **): typology of Web 2.0 tools suitable for teaching and learning, including what they have been used for, pedagogical uses, and examples. Bower [139] (p. 772).
Evaluation grid for multimedia tools (Abderrahim, Mohamed, and Azeddine **): checklist to ascertain the pedagogical, didactical, and technical quality of multimedia tools; derived from tools used in secondary education in Morocco. Abderrahim et al. [140].
Classification of mobile apps (Goodwin **): precursor to Handal’s categorisation, concerned with users’ level of control over tasks and activities in school-based apps: instructive, manipulative, and constructive. Goodwin [141] (p. 26).
Typology of mobile apps (Pechenkina **): typology of mobile apps used in higher education institutions in Australia, ordered by most-used type: organiser, navigator, and instructive. Pechenkina [142] (pp. 139–140).
Categories of digital tools * (Hoyles and Noss **): four categories of tools: (1) dynamic and graphical tools, (2) tools that outsource processing power, (3) new representational infrastructures, and (4) the implications of high-bandwidth connectivity for the nature of mathematical activity. Hoyles and Noss [143].
Experimental mathematician * (Borwein **): uses and affordances of a computer in mathematics, with a focus on proofs. Borwein [111].
User experience:
User Experience Honeycomb: seven attributes of technology deemed desirable for enhancing students’ experience of using technology. Morville [144].
Universal Design for Learning (UDL): framework used to provide a fully inclusive learning environment for all students; three main elements (Engagement, Representation, and Action and Expression), with multiple means considered for achieving each. Center for Applied Special Technology (CAST) [145].
Online Course Design Learning Checklist (OCDLC): before, during, and after checklist, with 3, 6, and 10 items, respectively, for online courses in higher education. Baldwin and Ching [146].
Student-Owned Learning-Engagement (SOLE) model: theoretical framework on eLearning systems with three dimensions: users, technology, and services. Atkinson [147].
Framework for Engagement in Mathematics (FEM) *: three aspects: pedagogical relationships (between students and teachers), pedagogical repertoires (teachers’ day-to-day teaching practices), and student engagement (factors supporting engagement). Attard and Holmes [67].
* Framework designed specifically for mathematics education studies. ** Where a framework does not have an associated distinguishable name, the author name(s) are included with the framework name. *** Originally called TPCK by Mishra and Koehler but now commonly referred to as TPACK [130].