Advanced Sensors Technology in Education

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Intelligent Sensors".

Deadline for manuscript submissions: closed (20 June 2019) | Viewed by 69549

Special Issue Editors


Dr. Rubén González Crespo
Guest Editor
Department of Engineering, School of Engineering and Technology, Universidad Internacional de La Rioja (UNIR), Logroño, Spain
Interests: soft computing; accessibility; artificial intelligence; learning analytics

Prof. Dr. Daniel Burgos
Guest Editor
Research Institute for Innovation & Technology in Education (UNIR iTED), Universidad Internacional de La Rioja (UNIR), 26006 Logroño, La Rioja, Spain
Interests: adaptive and informal eLearning; educational technology; learning analytics; open education; open science; educational games; serious games; gamification; eLearning specifications

Special Issue Information

Dear Colleagues,

One of the most well-known requirements in educational settings is the need to know what happens during a course, lesson plan or full academic programme. This is true for any type of education, but particularly for open education, which has multiple dimensions of openness. On the one hand, educators (i.e., teachers, professors, tutors, etc.) and practitioners of open education need to reshape the course plan according to the actual features of the learners (e.g., learning styles, motivation, performance, etc.), and they therefore require real-time analytical information to supervise, assess, adapt and offer feedback to the learners. On the other hand, open education offers specific opportunities through online learning using open educational resources (OER). The online environments and platforms provide huge amounts of data on all activities (in effect, a vast spreadsheet of events: big data).

More importantly, open education, with open teaching and learning, is now commonly shaped by a learner-centred approach that encourages learners to be the drivers of their own learning. That is, learners require awareness to self-assess their progress throughout the course and to make decisions regarding their next steps.

All kinds of sensors that support these tasks are welcome, to improve the quality of this new era of the online education paradigm. Sensors that track biometrics and enable augmented reality activities, actuators, hardware such as wearables, and diverse software applications will all improve the expected results.

Dr. Ruben Gonzalez Crespo
Prof. Dr. Daniel Burgos
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • technology-enhanced learning
  • augmented reality
  • learning analytics
  • virtual sensors

Published Papers (15 papers)

Editorial

5 pages, 202 KiB  
Editorial
Advanced Sensors Technology in Education
by Rubén González Crespo and Daniel Burgos
Sensors 2019, 19(19), 4155; https://0-doi-org.brum.beds.ac.uk/10.3390/s19194155 - 25 Sep 2019
Cited by 5 | Viewed by 2873
Abstract
The topic presented will show how different kinds of sensors can help to improve our skills in learning environments. When we open the mind and let it take control to be creative, we can imagine how a martial art could be improved with registered sensors, how a person might dance with machines to improve their technique, or how you might improve your soccer kick for a penalty shoot-out. The use of sensors seems easy to imagine in these examples, but their use is not limited to these types of learning environments. Using depth cameras to detect patterns in oral presentations, improving assessment in agile practices through low-cost sensors with multimodal learning analytics, or using computing devices as sensors to measure their impact on primary and secondary students' performance are the focus of this study as well. We hope readers will find original ideas that allow them to improve and advance in their own research. Full article
(This article belongs to the Special Issue Advanced Sensors Technology in Education)

Research

18 pages, 13865 KiB  
Article
Intelligent Framework for Learning Physics with Aikido (Martial Art) and Registered Sensors
by Alberto Corbi, Olga C. Santos and Daniel Burgos
Sensors 2019, 19(17), 3681; https://0-doi-org.brum.beds.ac.uk/10.3390/s19173681 - 24 Aug 2019
Cited by 14 | Viewed by 4957
Abstract
Physics is considered a tough academic subject by learners. To leverage engagement in the learning of this STEM area, teachers try to come up with creative ideas about the design of their classroom lessons. Sports-related activities can foster intuitive knowledge about physics (gravity, speed, acceleration, etc.). In this context, martial arts also provide a novel way of visualizing these ideas when performing the predefined motions needed to master the associated techniques. The recent availability of cheap monitoring hardware (accelerometers, cameras, etc.) allows easy tracking of the aforementioned movements, which in the case of aikido usually involve genuine circular motions. In this paper, we begin by reporting a user study among high-school students showing that the physics concept of moment of inertia can be understood by watching live exhibitions of specific aikido techniques. Based on these findings, we later present Phy + Aik, a tool for educators that enables the production of innovative visual educational material consisting of high-quality videos (and live demonstrations) synchronized/tagged with the inertial data collected by sensors and visual tracking devices. We think that a similar approach, where sensors are automatically registered within an intelligent framework, can be explored to teach other difficult-to-learn STEM concepts. Full article
(This article belongs to the Special Issue Advanced Sensors Technology in Education)
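As a purely illustrative sketch (not code from the paper), the snippet below shows how the rotational quantities behind the moment-of-inertia idea could be estimated from a tracked, roughly circular aikido motion. The 100 Hz sampling rate, the synthetic marker trajectory and the two-segment point-mass model are assumptions made for the example.

```python
import numpy as np

def angular_speed(xy, fs):
    """Estimate angular speed (rad/s) of a tracked point moving on a
    roughly circular path around its centroid, sampled at fs Hz."""
    centre = xy.mean(axis=0)
    angles = np.unwrap(np.arctan2(xy[:, 1] - centre[1], xy[:, 0] - centre[0]))
    return np.gradient(angles, 1.0 / fs)

def moment_of_inertia(masses, radii):
    """Moment of inertia of point masses (kg) at distances (m) from the axis."""
    return float(np.sum(np.asarray(masses) * np.asarray(radii) ** 2))

# Hypothetical data: a marker tracked at 100 Hz while the practitioner pivots.
fs = 100.0
t = np.arange(0, 2, 1 / fs)
xy = np.c_[0.8 * np.cos(2 * np.pi * 0.5 * t), 0.8 * np.sin(2 * np.pi * 0.5 * t)]

omega = angular_speed(xy, fs)                    # ~pi rad/s for this example
I = moment_of_inertia([4.0, 3.0], [0.3, 0.8])    # two simplified body segments
L = I * omega.mean()                             # angular momentum L = I * omega
print(f"mean omega = {omega.mean():.2f} rad/s, I = {I:.2f} kg*m^2, L = {L:.2f}")
```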

14 pages, 339 KiB  
Article
Dancing Salsa with Machines—Filling the Gap of Dancing Learning Solutions
by Gianluca Romano, Jan Schneider and Hendrik Drachsler
Sensors 2019, 19(17), 3661; https://0-doi-org.brum.beds.ac.uk/10.3390/s19173661 - 23 Aug 2019
Cited by 18 | Viewed by 6168
Abstract
Dancing is an activity that positively enhances people's mood; it consists of feeling the music and expressing it in rhythmic movements of the body. Learning how to dance can be challenging because it requires proper coordination and an understanding of rhythm and beat. In this paper, we present the first implementation of the Dancing Coach (DC), a generic system designed to support the practice of dancing steps, which in its current state supports the practice of basic salsa dancing steps. However, the DC has been designed to allow the addition of more dance styles. We also present the first user evaluation of the DC, which consists of user tests with 25 participants. Results from the user tests show that participants stated they had learned the basic salsa steps, how to move to the beat, and body coordination in a fun way. Results also point out some directions for improving future versions of the DC. Full article
(This article belongs to the Special Issue Advanced Sensors Technology in Education)
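For illustration only (the Dancing Coach's internals are not reproduced here), one simple way to score detected steps against the music's beat grid could look like the following; the tempo, tolerance and step timestamps are invented for the sketch.

```python
import numpy as np

def beat_grid(bpm, n_beats, offset=0.0):
    """Timestamps (s) of the first n_beats beats for a track at a given tempo."""
    return offset + np.arange(n_beats) * 60.0 / bpm

def on_beat_ratio(step_times, beats, tol=0.15):
    """Fraction of detected steps that land within `tol` seconds of a beat."""
    steps = np.asarray(step_times)
    nearest = np.min(np.abs(steps[:, None] - beats[None, :]), axis=1)
    return float(np.mean(nearest <= tol))

# Hypothetical salsa track at 180 BPM and steps detected from skeleton data.
beats = beat_grid(bpm=180, n_beats=32)
steps = [0.02, 0.35, 0.63, 1.01, 1.36, 1.70, 2.05]   # seconds, made-up values
print(f"on-beat steps: {on_beat_ratio(steps, beats):.0%}")
```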

13 pages, 846 KiB  
Article
Physical and Tactical Demands of the Goalkeeper in Football in Different Small-Sided Games
by Daniel Jara, Enrique Ortega, Miguel-Ángel Gómez-Ruano, Matthias Weigelt, Brittany Nikolic and Pilar Sainz de Baranda
Sensors 2019, 19(16), 3605; https://0-doi-org.brum.beds.ac.uk/10.3390/s19163605 - 19 Aug 2019
Cited by 19 | Viewed by 5729
Abstract
Background: Several studies have examined the differences between the different small-sided game (SSG) formats. However, only one study has analysed how the different variables that define SSGs can modify the goalkeeper's behaviour. The aim of the present study was to analyse how the modification of the pitch size in SSGs affects the physical demands of the goalkeepers. Methods: Three professional male football goalkeepers participated in this study. Three different SSGs were analysed (62 m × 44 m for a large pitch; 50 m × 35 m for a medium pitch; and 32 m × 23 m for a small pitch). Positional data for each goalkeeper were gathered using an 18.18 Hz global positioning system. The data gathered were used to compute the players' spatial exploration index, standard ellipse area and prediction ellipse area. The distance covered, the distance covered at different intensities and accelerations/decelerations were used to assess the players' physical performance. Results and Conclusions: There were differences between small and large SSGs in relation to the distances covered at different intensities and pitch exploration. Intensities were lower when the pitch size was larger. In addition, the pitch exploration variables increased along with the increase in pitch size. Full article
(This article belongs to the Special Issue Advanced Sensors Technology in Education)
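The metrics named in the abstract can be illustrated with a short sketch; this is not the authors' code, and the speed-zone bounds, the 95% prediction-ellipse formulation and the synthetic 18 Hz trace are assumptions.

```python
import numpy as np

CHI2_95_2DF = 5.991  # chi-square quantile for a 95% ellipse in 2-D

def distance_covered(xy, fs):
    """Total distance (m) and per-sample speeds (m/s) from positions at fs Hz."""
    steps = np.linalg.norm(np.diff(xy, axis=0), axis=1)
    return steps.sum(), steps * fs

def distance_by_zone(speeds, fs, zones=((0, 2), (2, 4), (4, np.inf))):
    """Distance covered inside each speed zone (zone bounds in m/s are assumptions)."""
    return [float(np.sum(speeds[(speeds >= lo) & (speeds < hi)]) / fs)
            for lo, hi in zones]

def prediction_ellipse_area(xy, level=CHI2_95_2DF):
    """Area (m^2) of the 95% prediction ellipse of the occupied positions."""
    cov = np.cov(xy, rowvar=False)
    return float(np.pi * level * np.sqrt(np.linalg.det(cov)))

# Hypothetical 18 Hz positional trace of a goalkeeper (metres), one minute long.
fs = 18.0
rng = np.random.default_rng(0)
xy = np.cumsum(rng.normal(0, 0.1, size=(int(fs) * 60, 2)), axis=0)

total, speeds = distance_covered(xy, fs)
print(f"distance: {total:.0f} m, zones: {distance_by_zone(speeds, fs)}")
print(f"95% prediction ellipse area: {prediction_ellipse_area(xy):.1f} m^2")
```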

20 pages, 5085 KiB  
Article
Using Depth Cameras to Detect Patterns in Oral Presentations: A Case Study Comparing Two Generations of Computer Engineering Students
by Felipe Roque, Cristian Cechinel, Tiago O. Weber, Robson Lemos, Rodolfo Villarroel, Diego Miranda and Roberto Munoz
Sensors 2019, 19(16), 3493; https://0-doi-org.brum.beds.ac.uk/10.3390/s19163493 - 09 Aug 2019
Cited by 11 | Viewed by 3504
Abstract
Speaking and presenting in public are critical skills for academic and professional development. These skills are in demand across society, and their development and evaluation are a challenge faced by higher education institutions. There are challenges in evaluating them objectively, as well as in generating valuable information for professors and appropriate feedback for students. In this paper, in order to understand and detect patterns in students' oral presentations, we collected data from 222 first-year Computer Engineering (CE) students at three different times, over two different years (2017 and 2018). For each presentation, using a system we developed and a Microsoft Kinect, we detected 12 features related to body posture and oral speaking. These features were used as input for the clustering and statistical analyses, which identified three different clusters in the presentations of both years, with stronger patterns in the presentations of the year 2017. A Wilcoxon rank-sum test allowed us to evaluate the evolution of the presentation attributes over each year and pointed out a convergence in terms of a reduction in the number of features that were statistically different between presentations given at the same point in the course. The results can further help to give students automatic feedback on their posture and speech throughout their presentations and may serve as baseline information for future comparisons with presentations from students of other undergraduate courses. Full article
(This article belongs to the Special Issue Advanced Sensors Technology in Education)
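As an illustration of the kind of analysis described (clustering of per-presentation features followed by Wilcoxon rank-sum tests), a minimal sketch with stand-in random data might look as follows; the feature matrices, group sizes and the specific comparison are assumptions, not the study's data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler
from scipy.stats import ranksums

# Hypothetical matrices: one row per presentation, one column per posture/speech
# feature (the paper extracts 12 Kinect-derived features; these are random stand-ins).
rng = np.random.default_rng(1)
feats_2017 = rng.normal(size=(120, 12))
feats_2018 = rng.normal(size=(102, 12))

# Cluster each cohort's presentations into three groups, as in the analysis described.
for year, feats in (("2017", feats_2017), ("2018", feats_2018)):
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
        StandardScaler().fit_transform(feats))
    print(year, np.bincount(labels))

# Wilcoxon rank-sum test per feature; the 2017 vs. 2018 split here is only a
# stand-in for the within-year comparisons performed in the study.
for j in range(feats_2017.shape[1]):
    stat, p = ranksums(feats_2017[:, j], feats_2018[:, j])
    if p < 0.05:
        print(f"feature {j}: differs between groups (p={p:.3f})")
```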

17 pages, 2117 KiB  
Article
Beyond Reality—Extending a Presentation Trainer with an Immersive VR Module
by Jan Schneider, Gianluca Romano and Hendrik Drachsler
Sensors 2019, 19(16), 3457; https://0-doi-org.brum.beds.ac.uk/10.3390/s19163457 - 07 Aug 2019
Cited by 12 | Viewed by 4340
Abstract
The development of multimodal sensor-based applications designed to support learners with the improvement of their skills is expensive, since most of these applications are tailor-made and built from scratch. In this paper, we show how the Presentation Trainer (PT), a multimodal sensor-based application designed to support the development of public speaking skills, can be modularly extended with a Virtual Reality real-time feedback module (VR module), which makes the use of the PT more immersive and comprehensive. The described study consists of a formative evaluation and has two main objectives. Firstly, a technical objective concerns the feasibility of extending the PT with an immersive VR module. Secondly, a user experience objective focuses on the level of satisfaction when interacting with the VR-extended PT. To study these objectives, we conducted user tests with 20 participants. Results from our tests show the feasibility of modularly extending existing multimodal sensor-based applications, and in terms of learning and user experience, the results indicate a positive attitude of the participants towards using the application (PT + VR module). Full article
(This article belongs to the Special Issue Advanced Sensors Technology in Education)
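The modular extension described can be illustrated with a small plugin-style sketch in which feedback output channels (on-screen, VR) implement a common interface; the class and event names are hypothetical and do not reflect the Presentation Trainer's actual code.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class FeedbackEvent:
    """One detected public-speaking issue (names are illustrative only)."""
    kind: str        # e.g. "crossed_arms", "low_volume"
    confidence: float

class FeedbackModule(ABC):
    """Output channel that renders feedback; the real PT interfaces differ."""
    @abstractmethod
    def render(self, event: FeedbackEvent) -> None: ...

class OnScreenModule(FeedbackModule):
    def render(self, event: FeedbackEvent) -> None:
        print(f"[screen] {event.kind} ({event.confidence:.0%})")

class VRModule(FeedbackModule):
    def render(self, event: FeedbackEvent) -> None:
        # In a real system this would update the immersive scene instead.
        print(f"[vr headset] cue for {event.kind}")

def broadcast(event: FeedbackEvent, modules: list[FeedbackModule]) -> None:
    """The trainer core stays unchanged; output modules are added or removed freely."""
    for m in modules:
        m.render(event)

broadcast(FeedbackEvent("crossed_arms", 0.92), [OnScreenModule(), VRModule()])
```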

25 pages, 14841 KiB  
Article
Introducing Low-Cost Sensors into the Classroom Settings: Improving the Assessment in Agile Practices with Multimodal Learning Analytics
by Hector Cornide-Reyes, René Noël, Fabián Riquelme, Matías Gajardo, Cristian Cechinel, Roberto Mac Lean, Carlos Becerra, Rodolfo Villarroel and Roberto Munoz
Sensors 2019, 19(15), 3291; https://0-doi-org.brum.beds.ac.uk/10.3390/s19153291 - 26 Jul 2019
Cited by 22 | Viewed by 4427
Abstract
Currently, the improvement of core skills appears as one of the most significant educational challenges of this century. However, assessing the development of such skills is still a challenge in real classroom environments. In this context, Multimodal Learning Analytics techniques appear as an attractive alternative to complement the development and evaluation of core skills. This article presents an exploratory study that analyzes the collaboration and communication of students in a Software Engineering course, who performed a learning activity simulating Scrum with Lego® bricks. Data from the Scrum process were captured, and multidirectional microphones were used in the retrospective ceremonies. Social network analysis techniques were applied, and a correlational analysis was carried out with all the registered information. The results allowed us to detect important relationships between the characteristics of the collaborative and non-collaborative groups and their productivity, effort, and predominant personality styles. From all the above, we can conclude that Multimodal Learning Analytics techniques offer considerable potential to support the process of skills development in students. Full article
(This article belongs to the Special Issue Advanced Sensors Technology in Education)
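A minimal sketch of the social-network-analysis step might look like the following, assuming speaker turns have already been diarised from the multidirectional microphone; the speaker names and the choice of centrality measure are illustrative only.

```python
import networkx as nx
from collections import Counter

# Hypothetical speaking turns reconstructed from one retrospective ceremony
# (speaker IDs in order of talking); the real diarisation pipeline is not shown.
turns = ["ana", "luis", "ana", "sofia", "luis", "ana", "sofia", "sofia", "luis"]

# Build a directed interaction graph: an edge u -> v means v spoke right after u,
# weighted by how often that hand-over happened.
G = nx.DiGraph()
for u, v in zip(turns, turns[1:]):
    if u != v:
        w = G[u][v]["weight"] + 1 if G.has_edge(u, v) else 1
        G.add_edge(u, v, weight=w)

print("degree centrality:", nx.degree_centrality(G))
print("turns per speaker:", Counter(turns))
```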

21 pages, 3238 KiB  
Article
Evaluation of the Create@School Game-Based Learning–Teaching Approach
by Eugenio Gaeta, María Eugenia Beltrán-Jaunsaras, Gloria Cea, Bernadette Spieler, Andrew Burton, Rebeca Isabel García-Betances, María Fernanda Cabrera-Umpiérrez, David Brown, Helen Boulton and María T. Arredondo Waldmeyer
Sensors 2019, 19(15), 3251; https://0-doi-org.brum.beds.ac.uk/10.3390/s19153251 - 24 Jul 2019
Cited by 17 | Viewed by 4621
Abstract
The constructivist approach is interested in creating knowledge through active engagement and encourages students to build their knowledge from their experiences in the world. Learning through digital game making is a constructivist approach that allows students to learn by developing their own games, enhancing problem-solving skills and fostering creativity. In this context, two tools, the Create@School App and the Project Management Dashboard (PMD), were developed to enable students from different countries to adapt their learning material by programming and designing games for their academic subjects, thereby integrating the game mechanics, dynamics, and aesthetics into the academic curriculum. This paper focuses on presenting the validation context as well as the evaluation of these tools. The Hassenzahl model and the AttrakDiff survey were used for measuring users' experience and satisfaction and for understanding emotional responses, thus providing information that enables testing of the acceptability and usability of the developed apps. After two years of using code-making apps (i.e., Create@School and its predecessor, Pocket Code), the pupils processed knowledge from their academic subjects spontaneously as game-based embedded knowledge. The students demonstrated creativity, a practical approach, and enthusiasm for making games focused on academic content that led them to learn while using mobile devices, sensors, images, and contextual information. This approach was widely accepted by students and teachers as part of their everyday class routines. Full article
(This article belongs to the Special Issue Advanced Sensors Technology in Education)

17 pages, 7123 KiB  
Article
Can You Ink While You Blink? Assessing Mental Effort in a Sensor-Based Calligraphy Trainer
by Bibeg Hang Limbu, Halszka Jarodzka, Roland Klemke and Marcus Specht
Sensors 2019, 19(14), 3244; https://0-doi-org.brum.beds.ac.uk/10.3390/s19143244 - 23 Jul 2019
Cited by 14 | Viewed by 4566
Abstract
Sensors can monitor physical attributes and record multimodal data in order to provide feedback. The calligraphy trainer application exploits these affordances in the context of handwriting learning. It records an expert's handwriting performance to compute an expert model and then uses that model to provide guidance and feedback to learners. However, new learners can be overwhelmed by the feedback, as learning handwriting is a tedious task. This paper presents a pilot study conducted with the calligraphy trainer to evaluate the mental effort induced by the various types of feedback provided by the application. Ten participants, five in the control group and five in the treatment group, all Ph.D. students in the technology-enhanced learning domain, took part in the study. The participants used the application to learn three characters from the Devanagari script. The results show higher mental effort in the treatment group when all types of feedback were provided simultaneously; the mental effort for individual types of feedback was similar to that of the control group. In conclusion, the feedback provided by the calligraphy trainer does not impose high mental effort, and therefore the design considerations of the calligraphy trainer can be insightful for designers of multimodal feedback. Full article
(This article belongs to the Special Issue Advanced Sensors Technology in Education)
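As a hedged illustration of comparing a learner's stroke against an expert model (the trainer's real model and feedback logic are not described here), one could resample both trajectories and measure their point-wise deviation; all coordinates below are invented.

```python
import numpy as np

def resample(stroke, n):
    """Resample a 2-D pen trajectory to n points by arc-length interpolation."""
    stroke = np.asarray(stroke, dtype=float)
    seg = np.linalg.norm(np.diff(stroke, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    target = np.linspace(0, s[-1], n)
    return np.c_[np.interp(target, s, stroke[:, 0]),
                 np.interp(target, s, stroke[:, 1])]

def deviation_from_expert(learner, expert, n=64):
    """RMS point-wise distance between learner and expert strokes (same units)."""
    a, b = resample(learner, n), resample(expert, n)
    return float(np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1))))

# Hypothetical strokes for one Devanagari character segment (tablet coordinates).
expert = [(0, 0), (1, 0.2), (2, 0.8), (3, 1.8)]
learner = [(0, 0.1), (0.9, 0.4), (2.1, 1.0), (3.2, 1.7)]
print(f"deviation: {deviation_from_expert(learner, expert):.2f}")
```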

16 pages, 545 KiB  
Article
Use of Computing Devices as Sensors to Measure Their Impact on Primary and Secondary Students’ Performance
by Francisco Luis Fernández-Soriano, Belén López, Raquel Martínez-España, Andrés Muñoz and Magdalena Cantabella
Sensors 2019, 19(14), 3226; https://0-doi-org.brum.beds.ac.uk/10.3390/s19143226 - 22 Jul 2019
Cited by 4 | Viewed by 3814
Abstract
The constant innovation in new technologies and the increased use of computing devices in different areas of society have contributed to a digital transformation in almost every sector. This digital transformation has also reached the world of education, making it possible for members of the educational community to adopt Learning Management Systems (LMS), in which the digital contents replacing traditional textbooks are exploited and managed. This article aims to study the relationship between the type of computing device from which students access the LMS and how it affects their performance. To achieve this, the LMS accesses of students in a school covering stages from elementary to bachelor's degree were monitored, with the different computing devices acting as sensors to gather data such as the type of device and operating system used by the students. The main conclusion is that students who access the LMS significantly improve their performance, and that the type of device and the operating system influence the number of subjects passed. Moreover, a predictive model has been generated to predict the number of subjects passed according to these factors, showing promising results. Full article
(This article belongs to the Special Issue Advanced Sensors Technology in Education)
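A sketch of the kind of predictive model mentioned (predicting the number of subjects passed from device and operating-system features) is shown below; the data frame, feature encoding and model family are assumptions for illustration, not the study's actual pipeline.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Hypothetical access records aggregated per student; the real dataset and the
# model used in the paper are not reproduced here.
df = pd.DataFrame({
    "device":   ["laptop", "tablet", "phone", "laptop", "desktop", "phone"] * 20,
    "os":       ["windows", "android", "ios", "macos", "windows", "android"] * 20,
    "accesses": [35, 12, 8, 40, 25, 5] * 20,
    "passed":   [9, 7, 6, 10, 8, 5] * 20,   # number of subjects passed (target)
})

X = pd.get_dummies(df[["device", "os"]]).join(df["accesses"])
y = df["passed"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"R^2 on held-out students: {model.score(X_te, y_te):.2f}")
```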

20 pages, 2255 KiB  
Article
Detecting Mistakes in CPR Training with Multimodal Data and Neural Networks
by Daniele Di Mitri, Jan Schneider, Marcus Specht and Hendrik Drachsler
Sensors 2019, 19(14), 3099; https://0-doi-org.brum.beds.ac.uk/10.3390/s19143099 - 13 Jul 2019
Cited by 20 | Viewed by 5939
Abstract
This study investigated to what extent multimodal data can be used to detect mistakes during Cardiopulmonary Resuscitation (CPR) training. We complemented the Laerdal QCPR ResusciAnne manikin with the Multimodal Tutor for CPR, a multi-sensor system consisting of a Microsoft Kinect for tracking body position and a Myo armband for collecting electromyogram information. We collected multimodal data from 11 medical students, each of them performing two sessions of two-minute chest compressions (CCs). We gathered in total 5254 CCs that were all labelled according to five performance indicators corresponding to common CPR training mistakes. Three of the five indicators, CC rate, CC depth and CC release, were assessed automatically by the ResusciAnne manikin. The remaining two, related to arm and body position, were annotated manually by the research team. We trained five neural networks, one for classifying each of the five indicators. The results of the experiment show that multimodal data can provide accurate mistake detection compared to the ResusciAnne manikin baseline. We also show that the Multimodal Tutor for CPR can detect additional CPR training mistakes, such as the correct use of arms and body weight, which so far have been identified only by human instructors. Finally, to inform future implementations of the Multimodal Tutor for CPR, we administered a questionnaire to collect feedback on valuable aspects of CPR training. Full article
(This article belongs to the Special Issue Advanced Sensors Technology in Education)
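To make the classification setup concrete, the sketch below trains a small neural network on stand-in per-compression feature windows for a single binary indicator (the paper trains one network per indicator on Kinect and Myo data); the feature dimensionality and labels are synthetic.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical per-compression feature windows (e.g., summary statistics of the
# Kinect joints and Myo EMG channels); sizes and labels are made up for the sketch.
rng = np.random.default_rng(2)
X = rng.normal(size=(5254, 24))          # one row per chest compression
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.5, 5254) > 0).astype(int)  # e.g., depth ok / not ok

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
```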

15 pages, 900 KiB  
Article
A Visual Dashboard to Track Learning Analytics for Educational Cloud Computing
by Diana M. Naranjo, José R. Prieto, Germán Moltó and Amanda Calatrava
Sensors 2019, 19(13), 2952; https://0-doi-org.brum.beds.ac.uk/10.3390/s19132952 - 04 Jul 2019
Cited by 22 | Viewed by 5314
Abstract
Cloud providers such as Amazon Web Services (AWS) stand out as useful platforms to teach distributed computing concepts as well as the development of Cloud-native scalable application architectures on real-world infrastructures. Instructors can benefit from high-level tools to track the progress of students during their learning paths on the Cloud, and this information can be disclosed via educational dashboards for students to understand their progress through the practical activities. To this end, this paper introduces CloudTrail-Tracker, an open-source platform to obtain enhanced usage analytics from a shared AWS account. The tool provides the instructor with a visual dashboard that depicts the aggregated usage of resources by all the students during a certain time frame and the specific use of AWS by a specific student. To facilitate the self-regulation of students, the dashboard also depicts the percentage of progress for each lab session and the actions still pending for each student. The dashboard has been integrated in four Cloud subjects that use different learning methodologies (from face-to-face to online learning), and the students positively highlight the usefulness of the tool for Cloud instruction in AWS. This automated procurement of evidence of student activity on the Cloud results in close to real-time learning analytics, useful both for semi-automated assessment and for students' self-awareness of their own training progress. Full article
(This article belongs to the Special Issue Advanced Sensors Technology in Education)
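As an illustrative sketch only (CloudTrail-Tracker's own implementation is open source and differs), per-student progress in a shared AWS account could be approximated by scanning a downloaded CloudTrail log for the API actions expected in a lab session; the required-action list and file name are assumptions.

```python
import json
from collections import defaultdict

# Actions each student is expected to perform in one lab session (assumed list;
# CloudTrail-Tracker defines its own per-session checklists).
REQUIRED = {"RunInstances", "CreateBucket", "PutObject", "TerminateInstances"}

def progress_per_student(log_path):
    """Read a downloaded CloudTrail log file and return, per IAM user name,
    the fraction of required actions that appear in the shared-account trail."""
    with open(log_path) as fh:
        records = json.load(fh).get("Records", [])
    seen = defaultdict(set)
    for rec in records:
        user = rec.get("userIdentity", {}).get("userName", "unknown")
        seen[user].add(rec.get("eventName"))
    return {user: len(actions & REQUIRED) / len(REQUIRED)
            for user, actions in seen.items()}

# Usage (hypothetical file): print(progress_per_student("cloudtrail-2019-06-01.json"))
```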

22 pages, 8529 KiB  
Article
Time Orientation Technologies in Special Education
by Miguel Angel Guillomía, Jorge Luis Falcó, José Ignacio Artigas and Mercedes García-Camino
Sensors 2019, 19(11), 2571; https://0-doi-org.brum.beds.ac.uk/10.3390/s19112571 - 06 Jun 2019
Cited by 3 | Viewed by 3546
Abstract
A device to train children in time orientation has been designed, developed and evaluated. It is framed within a long-term cooperation between a university and a special education school. It uses a cognitively accessible time display: the time left in the day is represented by a row of luminous elements that are initially all on, and the passing of time is represented by sequentially and gradually turning off one luminous element every 15 min. The agenda is displayed by relating time to tasks with standard pictograms for further accessibility. Notifications of upcoming tasks, both for management support and for anticipating changes, use visual and auditory information. The agenda can be described in the Alternative and Augmentative Communication pictogram language already used by the children, supporting individual and class activities on the agenda. Validation has been performed with 16 children in 12 classrooms of four special education schools. The evaluation methodology compares prior and posterior assessments, which are based on the International Classification of Functioning, Disability and Health (ICF) from the World Health Organization (WHO), together with observation registers. Results show consistent improvement in performance related to time orientation. Full article
(This article belongs to the Special Issue Advanced Sensors Technology in Education)
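The 15-minute luminous-element display can be illustrated with a few lines of code; the school-day start and end times are assumptions, since the abstract does not specify how the device is configured.

```python
from datetime import datetime, time

SLOT_MINUTES = 15          # one luminous element per 15-minute slot
DAY_START = time(9, 0)     # assumed school-day bounds; the device's real
DAY_END = time(17, 0)      # configuration is not described in the abstract

def leds_remaining(now: datetime) -> tuple[int, int]:
    """Return (elements still on, total elements) for the current moment."""
    start = now.replace(hour=DAY_START.hour, minute=DAY_START.minute,
                        second=0, microsecond=0)
    end = now.replace(hour=DAY_END.hour, minute=DAY_END.minute,
                      second=0, microsecond=0)
    total = int((end - start).total_seconds() // 60) // SLOT_MINUTES
    elapsed = max(0, int((now - start).total_seconds() // 60)) // SLOT_MINUTES
    return max(0, total - elapsed), total

on, total = leds_remaining(datetime(2019, 6, 6, 11, 40))
print(f"{on} of {total} elements still lit")   # 11:40 -> 22 of 32 still lit
```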

15 pages, 2380 KiB  
Article
Touch-Typing Detection Using Eyewear: Toward Realizing a New Interaction for Typing Applications
by Tatsuhito Hasegawa and Tatsuya Hatakenaka
Sensors 2019, 19(9), 2022; https://0-doi-org.brum.beds.ac.uk/10.3390/s19092022 - 30 Apr 2019
Cited by 4 | Viewed by 4680
Abstract
Typing skills are important in the digital information society of this generation. As a method to improve typing speed, in this study we focused on the training of touch typing, which enables typing a key without looking at the keyboard. To support touch-typing training, it is effective to apply a penalty when a learner looks at the keyboard; however, to realize this penalty method, the computer needs to be able to recognize whether the learner looked at the keyboard. We therefore proposed a method that detects a learner's eye gaze, namely, using eyewear to detect whether the learner looked at the keyboard, and evaluated the detection accuracy of our proposed method. We also examined the necessity of our system by analyzing the relationship between a learner's eye gaze and touch-typing skills. Full article
(This article belongs to the Special Issue Advanced Sensors Technology in Education)
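A hedged sketch of how "looking at the keyboard" might be detected from an eyewear signal is shown below, using a simple threshold on a downward pitch trace with a minimum duration; the sensor channel, threshold and sampling rate are assumptions, not the authors' method.

```python
import numpy as np

def keyboard_glances(pitch_deg, fs, threshold=-20.0, min_len=0.2):
    """Count episodes where the head/eye pitch signal from the eyewear stays
    below `threshold` degrees (looking down) for at least `min_len` seconds."""
    below = np.asarray(pitch_deg) < threshold
    episodes, run = 0, 0
    for flag in below:
        run = run + 1 if flag else 0
        if run == int(min_len * fs):     # count each episode exactly once
            episodes += 1
    return episodes

# Hypothetical 50 Hz pitch trace: mostly looking at the screen, two long glances down.
fs = 50
trace = np.zeros(10 * fs)
trace[100:140] = -30.0    # 0.8 s glance at the keyboard
trace[300:325] = -30.0    # 0.5 s glance
print(keyboard_glances(trace, fs))   # -> 2
```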

14 pages, 2720 KiB  
Article
Data-Driven Interaction Review of an Ed-Tech Application
by Alejandro Baldominos and David Quintana
Sensors 2019, 19(8), 1910; https://0-doi-org.brum.beds.ac.uk/10.3390/s19081910 - 22 Apr 2019
Cited by 11 | Viewed by 2798
Abstract
Smile and Learn is an Ed-Tech company that runs a smart library with more than 100 applications, games and interactive stories, aimed at children aged two to 10 and their families. The platform gathers thousands of data points from the interaction with the system in order to subsequently offer reports and recommendations. Given the complexity of navigating all the content, the library implements a recommender system. The purpose of this paper is to evaluate two aspects of such a system focused on children: the influence of the order of recommendations on user exploratory behavior, and the impact of the choice of recommendation algorithm on engagement. The assessment, based on data collected between 15 October 2018 and 1 December 2018, required the analysis of the number of clicks performed on the recommendations depending on their ordering, and an A/B/C test in which two standard recommendation algorithms were compared with a random recommendation that served as a baseline. The results suggest a direct connection between the order of the recommendations and the interest raised, and the superiority of recommendations based on popularity over the other alternatives. Full article
(This article belongs to the Special Issue Advanced Sensors Technology in Education)
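The two questions evaluated (effect of recommendation order on clicks, and comparison of algorithms in an A/B/C test) can be illustrated with a short analysis sketch on made-up interaction logs; the column names, click pattern and chi-square comparison are assumptions for the example.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical interaction log: which algorithm served the recommendation,
# at which position it was shown, and whether the child clicked it.
log = pd.DataFrame({
    "algorithm": ["popularity", "collaborative", "random"] * 400,
    "position":  [1, 2, 3, 4] * 300,
    "clicked":   [1, 0, 0, 0, 1, 1, 0, 0] * 150,
})

# Click-through rate by on-screen position (does ordering drive exploration?).
print(log.groupby("position")["clicked"].mean())

# A/B/C comparison: do click counts differ across the three algorithms?
table = pd.crosstab(log["algorithm"], log["clicked"])
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.1f}, p={p:.4f}")
```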
