Multimodal Technol. Interact., Volume 8, Issue 6 (June 2024) – 4 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.
17 pages, 1663 KiB  
Article
What the Mind Can Comprehend from a Single Touch
by Patrick Coe, Grigori Evreinov, Mounia Ziat and Roope Raisamo
Multimodal Technol. Interact. 2024, 8(6), 45; https://0-doi-org.brum.beds.ac.uk/10.3390/mti8060045 - 28 May 2024
Abstract
This paper investigates the versatility of force feedback (FF) technology in enhancing user interfaces across a spectrum of applications. We delve into the human finger pad’s sensitivity to FF stimuli, which is critical to the development of intuitive and responsive controls in sectors such as medicine, where precision is paramount, and entertainment, where immersive experiences are sought. We present a case study in the automotive domain, where FF technology was implemented to simulate mechanical button presses, reducing JND FF levels of 0.04 N to 0.054 N to JND levels of 0.254 to 0.298 when using a linear force feedback scale, and levels of 0.028 N to 0.033 N to JND levels of 0.074 to 0.164 when using a logarithmic force scale. The results demonstrate the technology’s efficacy and potential for widespread adoption in various industries, underscoring its significance in the evolution of haptic feedback systems.
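As a rough sanity check (a sketch assuming the textbook Weber-fraction definition and a hypothetical reference force of about 0.15 N, neither of which is stated in the abstract), dimensionless JND levels of this order follow from dividing the force threshold by the reference intensity:

    \[ k = \frac{\Delta F}{F_{\mathrm{ref}}}, \qquad \frac{0.04\ \mathrm{N}}{0.15\ \mathrm{N}} \approx 0.27 \]

which lands inside the reported linear-scale range of 0.254 to 0.298.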

17 pages, 3964 KiB  
Article
A Wearable Bidirectional Human–Machine Interface: Merging Motion Capture and Vibrotactile Feedback in a Wireless Bracelet
by Julian Kindel, Daniel Andreas, Zhongshi Hou, Anany Dwivedi and Philipp Beckerle
Multimodal Technol. Interact. 2024, 8(6), 44; https://0-doi-org.brum.beds.ac.uk/10.3390/mti8060044 - 23 May 2024
Abstract
Humans interact with the environment through a variety of senses. Touch in particular contributes to a sense of presence, enhances perceptual experiences, and establishes causal relations between events. Many human–machine interfaces only allow for one-way communication, which does not do justice to the complexity of the interaction. To address this, we developed a bidirectional human–machine interface featuring a bracelet equipped with linear resonant actuators, controlled via a Robot Operating System (ROS) program, to simulate haptic feedback. Further, the wireless interface includes a motion sensor and a sensor to quantify the tightness of the bracelet. In our functional experiments, four healthy participants in their twenties and thirties estimated vibrations at three and five intensity levels, respectively. The participants achieved an average accuracy of 88% when estimating three vibration intensity levels. While the estimation accuracy for five intensity levels was only 67%, the results indicated good performance in perceiving relative vibration changes, with an accuracy of 82%. The proposed haptic feedback bracelet will facilitate research investigating the benefits of bidirectional human–machine interfaces and the perception of vibrotactile feedback in general, closing the gap for a versatile device that provides high-density user feedback in combination with sensors for intent detection.
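The abstract describes a ROS-controlled actuator loop with motion sensing for intent detection. A minimal sketch of such a loop, assuming ROS 1 with rospy; the topic names, message types, and intensity mapping below are illustrative assumptions, not the authors' implementation:

    # Minimal ROS 1 sketch of a bidirectional bracelet interface.
    # Topics, message types, and the intensity map are assumptions.
    import rospy
    from std_msgs.msg import UInt8, Float32MultiArray

    INTENSITY_LEVELS = {0: 0, 1: 85, 2: 170, 3: 255}  # PWM-style duty values (assumed)

    def on_motion(msg: Float32MultiArray) -> None:
        # Placeholder intent detection: map motion magnitude to a vibration level.
        magnitude = sum(abs(v) for v in msg.data)
        level = min(3, int(magnitude))  # clamp to the assumed level range
        vib_pub.publish(UInt8(INTENSITY_LEVELS[level]))

    rospy.init_node("bracelet_interface")
    vib_pub = rospy.Publisher("/bracelet/vibration_cmd", UInt8, queue_size=1)
    rospy.Subscriber("/bracelet/motion", Float32MultiArray, on_motion)
    rospy.spin()

Sending a single duty-cycle byte per command keeps the wireless payload small, which is one plausible design choice for a battery-powered bracelet.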

19 pages, 3849 KiB  
Article
Exploring the Role of User Experience and Interface Design Communication in Augmented Reality for Education
by Matina Kiourexidou, Andreas Kanavos, Maria Klouvidaki and Nikos Antonopoulos
Multimodal Technol. Interact. 2024, 8(6), 43; https://0-doi-org.brum.beds.ac.uk/10.3390/mti8060043 - 22 May 2024
Abstract
Augmented Reality (AR) enhances learning by integrating interactive and immersive elements that bring content to life, thus increasing motivation and improving retention. AR also supports personalized learning, allowing learners to interact with content at their own pace and according to their preferred learning styles. This adaptability not only promotes self-directed learning but also empowers learners to take charge of their educational journey. Effective interface design is crucial for these AR applications, requiring careful integration of user interactions and visual cues to blend AR elements seamlessly with reality. This paper explores the impact of AR on user experience within educational settings, examining engagement, motivation, and learning outcomes to determine how AR can enhance the educational experience. Additionally, it addresses design considerations and challenges in developing AR user interfaces, drawing on current research and best practices to propose effective and adaptable solutions for educational AR applications. As AR technology evolves, its potential to transform educational experiences continues to grow, promising significant advancements in how users interact with, personalize, and immerse themselves in learning content.

14 pages, 2937 KiB  
Article
Recall of Odorous Objects in Virtual Reality
by Jussi Rantala, Katri Salminen, Poika Isokoski, Ville Nieminen, Markus Karjalainen, Jari Väliaho, Philipp Müller, Anton Kontunen, Pasi Kallio and Veikko Surakka
Multimodal Technol. Interact. 2024, 8(6), 42; https://0-doi-org.brum.beds.ac.uk/10.3390/mti8060042 - 21 May 2024
Abstract
The aim of this study was to investigate how the congruence of odors and visual objects in virtual reality (VR) affects later memory recall of the objects. Participants (N = 30) interacted with 12 objects in VR. The interaction was varied by odor congruency (i.e., the odor matched the object’s visual appearance, the odor did not match it, or the object had no odor), odor quality (i.e., an authentic or a synthetic odor), and interaction type (i.e., participants could look at and manipulate objects or could only look at them). After interacting with the 12 objects, incidental memory performance was measured with a free recall task. In addition, the participants rated the pleasantness and arousal of the interaction with each object. The results showed that the participants remembered significantly more objects with congruent odors than objects with incongruent odors or odorless objects. Furthermore, interaction with congruent objects was rated as significantly more pleasant and relaxed than interaction with incongruent objects. Odor quality and interaction type had no significant effects on recall or emotional ratings. These results can be utilized in the development of multisensory VR applications.
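The abstract implies a 3 × 2 × 2 factor crossing (congruency × quality × interaction), which yields exactly 12 cells for the 12 objects. A sketch of one possible counterbalanced assignment; the object names and the scheme are hypothetical, and odor quality is naturally vacuous for odorless objects:

    # Sketch of a condition assignment for the 12-object recall task.
    # The factor crossing follows the abstract; everything else is assumed.
    import itertools, random

    objects = [f"object_{i}" for i in range(12)]           # hypothetical stimuli
    congruency = ["congruent", "incongruent", "odorless"]  # odor-visual match
    quality = ["authentic", "synthetic"]                   # odor quality
    interaction = ["look_and_manipulate", "look_only"]     # interaction type

    # Cycle through the 3 x 2 x 2 = 12 condition cells, one per object.
    cells = list(itertools.product(congruency, quality, interaction))
    random.shuffle(objects)
    assignment = dict(zip(objects, cells))
    for obj, cond in sorted(assignment.items()):
        print(obj, cond)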
