
Multi-Agent-Based Human–Computer Interaction

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Intelligent Sensors".

Deadline for manuscript submissions: closed (15 January 2022) | Viewed by 4972

Special Issue Editor

Special Issue Information

Dear Colleagues,

Human–computer interaction (HCI) is concerned with the joint performance of tasks by humans and machines; the structure of communication between human and machine; human capabilities to use machines; algorithms and programming of the interface itself; engineering concerns that arise in designing and building interfaces; the process of specification, design, and implementation of interfaces; and design trade-offs.

Multi-agent systems (MAS) can be regarded as assemblages of self-contained problem-solving systems capable of autonomous, reactive, pro-active, and social behavior. They apply several complex methods to adapt to the new behaviors needed to achieve their intended objectives. MAS entails agents (mostly software programs that characterize actors), environments, and interactions. Other concepts key to MAS are coordination and control; reasoning and planning; and learning and adaptation.
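The agent/environment/interaction triad described above can be illustrated with a minimal, hypothetical sketch (the class names, the temperature-regulation task, and the turn-taking coordination scheme are all invented for illustration and are not from any particular MAS framework): two reactive agents repeatedly perceive a shared environment and act to drive it toward a target.

```python
class Environment:
    """Shared environment: here, a single temperature the agents regulate."""
    def __init__(self, temperature=30.0):
        self.temperature = temperature


class Agent:
    """Minimal autonomous agent with a perceive-decide-act cycle."""
    def __init__(self, name, target):
        self.name = name
        self.target = target

    def perceive(self, env):
        # Sense the shared environment.
        return env.temperature

    def decide(self, reading):
        # Purely reactive rule: cool if above target, heat if below.
        if reading > self.target:
            return -1.0
        if reading < self.target:
            return 1.0
        return 0.0

    def act(self, env, action):
        # Acting modifies the shared environment, which is how the
        # agents indirectly interact with one another.
        env.temperature += action


def run(env, agents, steps=20):
    # Simple coordination scheme: agents take turns acting on the
    # shared environment for a fixed number of rounds.
    for _ in range(steps):
        for agent in agents:
            agent.act(env, agent.decide(agent.perceive(env)))
    return env.temperature


env = Environment(temperature=30.0)
agents = [Agent("cooler", target=22.0), Agent("heater", target=22.0)]
final = run(env, agents)  # converges to the shared target of 22.0
```

Coordination here is implicit (turn-taking on a shared resource); real MAS add explicit communication, negotiation, and learning on top of this loop.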

This Special Issue (SI) on “Multi-Agent-Based Human–Computer Interaction” will integrate MAS and HCI where the science and technology of sensors is at the core of the proposal. The papers submitted to this SI must highlight multi-agent-based technologies and solutions centered around the use of intelligent sensors and sensor networks in current and emerging applications in human–computer interaction.

Topics of interest include (but are not limited to) the use of sensors for the following:

  • Coordination and control in multi-agent-based human–computer interaction
  • Reasoning and planning in multi-agent-based human–computer interaction
  • Learning and adaptation in multi-agent-based human–computer interaction
  • Multi-agent-based human–robot interaction
  • Human–agent interaction
  • Multi-agent-based multimodal interaction
  • Multi-agent-based social interaction
  • Multi-agent-based virtual and augmented human–computer interaction
Prof. Dr. Antonio Fernández-Caballero
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for the submission of manuscripts are available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • multi-agent systems
  • human–computer interaction
  • human–robot interaction
  • virtual and augmented reality

Published Papers (1 paper)


Research

18 pages, 5354 KiB  
Article
Improving Human–Robot Interaction by Enhancing NAO Robot Awareness of Human Facial Expression
by Chiara Filippini, David Perpetuini, Daniela Cardone and Arcangelo Merla
Sensors 2021, 21(19), 6438; https://0-doi-org.brum.beds.ac.uk/10.3390/s21196438 - 27 Sep 2021
Cited by 18 | Viewed by 3968
Abstract
An intriguing challenge in the human–robot interaction field is the prospect of endowing robots with emotional intelligence to make the interaction more genuine, intuitive, and natural. A crucial aspect in achieving this goal is the robot’s capability to infer and interpret human emotions. Thanks to its design and open programming platform, the NAO humanoid robot is one of the most widely used agents for human interaction. As with person-to-person communication, facial expressions are the privileged channel for recognizing the interlocutor’s emotional expressions. Although NAO is equipped with a facial expression recognition module, specific use cases may require additional features and affective computing capabilities that are not currently available. This study proposes a highly accurate convolutional-neural-network-based facial expression recognition model that is able to further enhance the NAO robot’s awareness of human facial expressions and provide the robot with an interlocutor’s arousal level detection capability. Indeed, the model tested during human–robot interactions was 91% and 90% accurate in recognizing happy and sad facial expressions, respectively; 75% accurate in recognizing surprised and scared expressions; and less accurate in recognizing neutral and angry expressions. Finally, the model was successfully integrated into the NAO SDK, thus allowing for high-performing facial expression classification with an inference time of 0.34 ± 0.04 s.
(This article belongs to the Special Issue Multi-Agent-Based Human-Computer Interaction)
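As a hedged illustration only (the paper’s actual model and API are not reproduced here; the six class labels come from the abstract, while the expression-to-arousal mapping is an invented assumption loosely inspired by circumplex-style affect models, not something the paper specifies), the final interpretation step of such a pipeline — turning the classifier’s softmax output into an expression label plus a coarse arousal level — might look like:

```python
# Class labels taken from the abstract of Filippini et al.
EXPRESSIONS = ["happy", "sad", "surprised", "scared", "neutral", "angry"]

# Assumed coarse expression-to-arousal mapping (an illustration,
# NOT the arousal-detection method used in the paper).
HIGH_AROUSAL = {"happy", "surprised", "scared", "angry"}


def interpret(probabilities):
    """Map softmax probabilities to (expression, confidence, arousal)."""
    if len(probabilities) != len(EXPRESSIONS):
        raise ValueError("expected one probability per expression class")
    # Pick the most probable class.
    best = max(range(len(probabilities)), key=lambda i: probabilities[i])
    expression = EXPRESSIONS[best]
    arousal = "high" if expression in HIGH_AROUSAL else "low"
    return expression, probabilities[best], arousal


# Example: a distribution sharply peaked on "happy".
label, conf, arousal = interpret([0.91, 0.02, 0.03, 0.01, 0.02, 0.01])
```

In a deployment, the probabilities would come from the CNN’s inference on a face crop from the robot’s camera; the interpretation step itself is cheap, so the reported 0.34 s inference time would be dominated by the network forward pass.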