Multimodal Technol. Interact., Volume 8, Issue 7 (July 2024) – 2 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.
14 pages, 680 KiB  
Review
Data Governance in Multimodal Behavioral Research
by Zhehan Jiang, Zhengzhou Zhu and Shucheng Pan
Multimodal Technol. Interact. 2024, 8(7), 55; https://0-doi-org.brum.beds.ac.uk/10.3390/mti8070055 - 25 Jun 2024
Viewed by 129
Abstract
In the digital era, multimodal behavioral research has emerged as a pivotal discipline, integrating diverse data sources to comprehensively understand human behavior. This paper defines and distinguishes data governance from mere data management within this context, highlighting its centrality in assuring data quality, ethical handling, and participant protection. Through a meticulous review of the literature and empirical experience, we identify key implementation strategies and elucidate the benefits and risks of data governance frameworks in multimodal research. A demonstrative case study illustrates the practical applications and challenges, revealing enhanced data reliability and research integrity as tangible outcomes. Our findings underscore the critical need for robust data governance, pointing to future advancements in the field, including the development of adaptive governance frameworks, innovative big data analytics solutions, and user-friendly tools. These enhancements are poised to amplify the utility of multimodal data, propelling behavioral science forward.
26 pages, 2054 KiB  
Article
Emotion-Aware In-Car Feedback: A Comparative Study
by Kevin Fred Mwaita, Rahul Bhaumik, Aftab Ahmed, Adwait Sharma, Antonella De Angeli and Michael Haller
Multimodal Technol. Interact. 2024, 8(7), 54; https://0-doi-org.brum.beds.ac.uk/10.3390/mti8070054 - 25 Jun 2024
Viewed by 122
Abstract
We investigate personalised feedback mechanisms to help drivers regulate their emotions, aiming to improve road safety. We systematically evaluate driver-preferred feedback modalities and their impact on emotional states. Using unobtrusive vision-based emotion detection and self-labeling, we captured the emotional states and feedback preferences of 21 participants in a simulated driving environment. Results show that in-car feedback systems effectively influence drivers’ emotional states, with participants reporting positive experiences and varying preferences based on their emotions. We also developed a machine learning classification system using facial marker data to demonstrate the feasibility of our approach for classifying emotional states. Our contributions include design guidelines for tailored feedback systems, a systematic analysis of user reactions across three feedback channels with variations, an emotion classification system, and a dataset with labeled face landmark annotations for future research.
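The listing gives no implementation details for the emotion classifier mentioned in the abstract. Purely as an illustrative sketch, and not the authors' code, classifying emotional states from flattened facial-landmark features might look like the following, assuming scikit-learn, a 68-point landmark layout (136 coordinate features), and placeholder data in place of the paper's labeled dataset:

```python
# Hypothetical sketch (not from the paper): emotion classification
# from facial landmark features using scikit-learn.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Assume each sample is a flattened vector of (x, y) landmark coordinates
# (68 landmarks -> 136 features) paired with a self-reported emotion label.
# Random data stands in for the paper's labeled face landmark annotations.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 136))      # 500 samples, 136 landmark features
y = rng.integers(0, 3, size=500)     # 3 illustrative emotion classes

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```

A real pipeline would instead load the labeled face landmark annotations released with the paper and select the model and features accordingly.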