Progress in Human Computer Interaction

A special issue of Applied Sciences (ISSN 2076-3417). This special issue belongs to the section "Computing and Artificial Intelligence".

Deadline for manuscript submissions: closed (31 January 2024)

Special Issue Editor


Dr. Ashley Colley
Guest Editor
Lapland User Experience Design Research Group (LUX), Department of Industrial Design, University of Lapland, Rovaniemi, Finland
Interests: information science; human–computer interaction; user experience; rapid prototyping; human-centered computing; ubiquitous computing; interaction design; multimodal interaction; user interfaces; wearable computing; UI design

Special Issue Information

Dear Colleagues,

Humans and computers have had a complex relationship throughout their shared history, and our interaction with these systems has often been fraught with difficulty. As computers and mechatronic systems become more sophisticated, our interactions, the problems they raise, and the solutions to those problems grow in complexity. Myriad technical, personal, and societal difficulties have spawned an equally wide range of solutions and areas of research, which in turn often produce entirely novel approaches and even new fields of research.

The main aim of this Special Issue is to attract high-quality submissions that highlight emerging methods of identifying the nature of human–computer interaction (HCI), quantitatively and qualitatively evaluating the relative risks and merits of these interactions, and studying possible solutions. The issue welcomes both human-centered approaches that consider human factors and values, and technology-driven contributions addressing recent developments in advanced processing techniques (recognition of gestures, speech, emotions, activities, etc.) and interaction techniques (adaptive user interfaces, affective user interfaces, multimodal interaction, tangible and gestural interaction, interaction in virtual and augmented reality environments, etc.). Position papers and state-of-the-art reviews are especially welcome.

Topics of interest include, but are not limited to:

  • Assistive technologies
  • Augmentative technologies
  • Robotics: companion robots, workplace robots, healthcare robots, and soft robots
  • Wearables: active and passive orthotics, prosthetics, exoskeletons, and wearable sensors
  • Gesture recognition and virtual reality
  • Social issues in human–computer interactions

Dr. Ashley Colley
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website; once registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Applied Sciences is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • human behavior understanding (HBU)
  • artificial intelligence
  • deep learning
  • wearable computing
  • ubiquitous computing
  • ambient computing
  • brain computer/machine interface
  • data mining and statistical analysis
  • cognitive modeling
  • social information processing

Published Papers (6 papers)


Research


27 pages, 1567 KiB  
Article
Usability Testing of Mobile Applications: A Methodological Framework
by Paweł Weichbroth
Appl. Sci. 2024, 14(5), 1792; https://doi.org/10.3390/app14051792 - 22 Feb 2024
Abstract
Less than five percent of all mobile applications released throughout 2023 became successful. The success of a new mobile application depends on a variety of factors, ranging from business understanding and customer value to the perceived quality of use. In this sense, the topic of usability testing of mobile applications is relevant from the point of view of user satisfaction and acceptance. However, the current knowledge seems to be fragmented, scattered across many papers and reports, and sometimes poorly documented. This paper attempts to fill this gap by reviewing the literature relevant to the research topic and developing a unified view of the current state of knowledge. In particular, a methodological framework is outlined and discussed, including the discourse on settings for laboratory and field studies, data collection techniques, experimental designs for mobile usability testing, and a generic research framework. The paper thereby contributes to both the theory and practice of human–computer interaction by providing methodological foundations for usability testing of mobile applications, paving the way for further studies in this area. Moreover, it sheds light on methodological foundations, key concepts, challenges, and issues, equipping readers with a comprehensive knowledge base from which to navigate and advance the field of mobile usability.
(This article belongs to the Special Issue Progress in Human Computer Interaction)

27 pages, 13876 KiB  
Article
Improving Usability in Mobile Apps for Residential Energy Management: A Hybrid Approach Using Fuzzy Logic
by Ivonne Nuñez, Elia Esther Cano, Edmanuel Cruz, Dimas Concepción, Nila Navarro and Carlos Rovetto
Appl. Sci. 2024, 14(5), 1751; https://doi.org/10.3390/app14051751 - 21 Feb 2024
Abstract
This paper presents a study that evaluates the usability and user experience of a mobile application interface for residential energy management, adopting a hybrid approach that integrates quantitative and qualitative methods within a user-centered design framework. For the evaluation, metrics and tools such as the User Experience Questionnaire Short (UEQ-S) and the System Usability Scale (SUS) were used, together with a fuzzy logic model to interpret and contrast the data obtained through these metrics, allowing a more accurate assessment of usability and user experience that reflects the variability and trends in the responses. Three evaluated aspects stand out: satisfaction with the interface, ease of use, and efficiency; these are fundamental to understanding how users perceive the system. The results indicate a high likelihood that users would recommend the system and a high overall quality of user experience. This study contributes significantly to mobile application usability, especially in residential energy management, offering valuable insights for designing more intuitive and effective user interfaces on mobile devices.
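For readers unfamiliar with the SUS instrument used in this study, its standard scoring rule (per Brooke's original questionnaire) maps ten 1-5 Likert responses to a 0-100 score; the sketch below shows that calculation only, not the paper's fuzzy logic model:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The summed contributions are scaled by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten responses on a 1-5 scale")
    total = sum(r - 1 if i % 2 == 0 else 5 - r  # index 0 is item 1 (odd)
                for i, r in enumerate(responses))
    return total * 2.5

# Fully positive answers (5 on odd items, 1 on even items) give the maximum:
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

A score around 68 is conventionally treated as average; how the paper maps such scores into its fuzzy membership functions is described in the article itself.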

30 pages, 5673 KiB  
Article
Research on Equipment and Algorithm of a Multimodal Perception Gameplay Virtual and Real Fusion Intelligent Experiment
by Lurong Yang, Jie Yuan and Zhiquan Feng
Appl. Sci. 2022, 12(23), 12184; https://doi.org/10.3390/app122312184 - 28 Nov 2022
Cited by 1
Abstract
Chemistry experiments are an important part of chemistry learning, and the development and application of virtual experiments have greatly enriched experimental teaching. However, existing virtual experiments suffer from problems such as low human–computer interaction efficiency, a poor sense of reality and operation for the user, and a boring experimental process. This paper therefore designs a multimodal-perception gameplay virtual-and-real fusion intelligent laboratory (GVRFL). GVRFL completes chemical experiments interactively through virtual-and-real fusion methods, which greatly improves the user's sense of reality and operation. The method proposes a multimodal intention active-understanding algorithm to improve human–computer interaction efficiency and user experience, and a novel game-based virtual–real fusion intelligent experimental mode that adds gameplay to the process of virtual–real fusion experiments. The experimental results show that the method improves human–computer interaction efficiency and reduces the user's operating load. At the same time, the interaction between real experimental equipment and the virtual experimental scene greatly improves the user's sense of reality and operation, and the introduction of game elements stimulates students' interest in and enthusiasm for learning.

22 pages, 10295 KiB  
Article
GrowBot: An Educational Robotic System for Growing Food
by Henrik Hautop Lund, Martin Exner, Nikolai Eskild Jensen, Massimiliano Leggieri, Malene Outzen, Gitte Ravn-Haren, Malte von Sehested, Andreas Væring and Rikke Andersen
Appl. Sci. 2022, 12(11), 5539; https://doi.org/10.3390/app12115539 - 30 May 2022
Cited by 3
Abstract
We present the GrowBot as an educational robotic system that facilitates hands-on experimentation with the control of environmental conditions for food plant growth. The GrowBot is a tabletop-sized greenhouse automated with sensors and actuators to become a robotic system for the control of plant growth. It includes sensors for humidity, CO2, temperature, water level, and RGB camera images, and actuators to control the growing conditions, including full-spectrum, IR, and UV lights, a nutrient pump, water pump, air pump, air-exchange pump, and fan. Inspired by educational robotics, we developed user-friendly graphical programming of the GrowBots via several means: a touch display, a micro:bit, and a remote web-server interface. This allows school pupils to easily program the GrowBots to different growth conditions for the plants in terms of temperature, humidity, day-light cycle, wavelength of LED light, nutrient rate, etc. The system also allows the user to monitor the environmental conditions, such as CO2 levels for understanding photosynthesis, on both the touch display and the remote web interface. An experiment with nine GrowBots shows that the different parameters can be controlled, that this controls the growth of the food plants, and that an environmental condition with blue light results in taller and larger plants than one with red light. Further, pilot experimentation in school settings indicates that the comprehensive system design method results in a deployable system that can become well adopted in the educational domain.
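The abstract describes a classic sense-then-actuate loop: sensor readings drive the lights, pumps, and fan toward pupil-programmed setpoints. The sketch below is purely illustrative; the class, field, and threshold names are our assumptions, not the actual GrowBot API:

```python
# Hypothetical sketch of the kind of closed-loop climate control described
# above: one control step maps sensor readings to on/off actuator commands
# (simple bang-bang control with a small deadband).
from dataclasses import dataclass

@dataclass
class GrowConditions:
    target_temp_c: float = 24.0    # desired air temperature, degrees C
    target_humidity: float = 60.0  # desired relative humidity, %
    light_hours: int = 16          # length of the programmed day-light cycle

def control_step(temp_c, humidity, hour, cond):
    """Map one set of sensor readings to actuator commands."""
    return {
        "fan_on": temp_c > cond.target_temp_c + 1.0,          # vent excess heat
        "water_pump_on": humidity < cond.target_humidity - 5.0,  # re-humidify
        "lights_on": hour < cond.light_hours,                 # day/night cycle
    }

cmds = control_step(temp_c=26.5, humidity=48.0, hour=3, cond=GrowConditions())
print(cmds)  # {'fan_on': True, 'water_pump_on': True, 'lights_on': True}
```

In the real system the equivalent logic is exposed through the graphical programming interfaces (touch display, micro:bit, web server) rather than written as code by the pupils.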

22 pages, 4497 KiB  
Article
Arm Posture Changes and Influences on Hand Controller Interaction Evaluation in Virtual Reality
by Xiaolong Lou, Qinping Zhao, Yan Shi and Preben Hansen
Appl. Sci. 2022, 12(5), 2585; https://doi.org/10.3390/app12052585 - 02 Mar 2022
Cited by 4
Abstract
In virtual reality (VR) applications, hand-controller interaction is largely limited by the biomechanical structure of the arm and its kinematic features. Earlier research revealed that different arm postures generate distinct arm fatigue levels in mid-air operational tasks; however, how they impact interaction performance, e.g., the accuracy of target grasp and manipulation, has been less investigated. To fill this gap, we conducted an empirical experiment in which thirty participants completed a series of target acquisition tasks in a specifically designed VR application. The results show that (1) a bent arm posture resulted in higher interaction accuracy than a stretched arm posture; (2) a downward arm posture interacted more accurately than an upraised arm posture; (3) since the two arms are bilaterally symmetric, either arm interacted more accurately on its own side than on the opposite side; and (4) the user-preferred or dominant arm interacted more persistently than the non-dominant one, though the two arms showed little difference in interaction accuracy. Implications and suggestions are discussed for designing more efficient and user-satisfying interactive spaces in VR.

Review


12 pages, 973 KiB  
Review
Inspiring Real-Time Evaluation and Optimization of Human–Robot Interaction with Psychological Findings from Human–Human Interaction
by Huashuo Liu, Fei Wang and Dan Zhang
Appl. Sci. 2023, 13(2), 676; https://doi.org/10.3390/app13020676 - 04 Jan 2023
Cited by 4
Abstract
The increasingly central role of robotic agents in daily life requires effective human–robot interaction (HRI). For roboticists to optimize interaction design, it is crucial to understand the potential effects of robotic agents on human performance. Yet a systematic specification of the contributing factors is lacking, and objective measures of HRI performance are still limited. In these regards, findings from research on human–human interaction can provide valuable insights. In this review, we break down the complex effects of robotic agents on interacting humans into basic building blocks based on human–human interaction findings, i.e., the potential effects of physical presence, motor actions, and task co-representation in HRI. For each effect, we advise on future directions regarding its implications. Furthermore, we propose that the neural correlates of these effects could support real-time evaluation and optimization of HRI with electroencephalography (EEG)-based brain–computer interfaces (BCIs).
