Article

A Pattern Approach to Comprehensible and Pleasant Human–Robot Interaction

Fraunhofer Institute for Industrial Engineering IAO, Nobelstr. 12, 70569 Stuttgart, Germany
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2021, 5(9), 49; https://0-doi-org.brum.beds.ac.uk/10.3390/mti5090049
Submission received: 3 June 2021 / Revised: 6 August 2021 / Accepted: 17 August 2021 / Published: 27 August 2021
(This article belongs to the Special Issue User Experience in Human-Robot Interaction)

Abstract

HRI designers are faced with the task of creating robots that are easy and pleasant to use for the users. The growing body of research in human–robot interaction (HRI) is still mainly focused on technical aspects of the interaction. It lacks defined guidelines that describe how behavioral expressions for social robots need to be designed to promote high usability and positive user experience. To achieve this goal, we propose to apply the concept of design patterns to HRI. We present a design process that provides step-by-step guidance and methods for HRI designers to generate high quality behavioral patterns for social robots that can be used for different robots and use cases. To document the resulting patterns, we developed a documentation format that provides a clear, standardized structure to note down all relevant aspects of a pattern so that others can understand its design recommendations and apply them to their own robot and use cases. In the present paper, we demonstrate our pattern approach based on an example and describe how we arrived at a pattern language of 40 behavioral patterns that forms the basis for future social robot design and related research activities.

1. Introduction

Robots can be used to provide assistance in different areas of our daily lives such as domestic environments [1], teaching [2] or elder care [3]. Robot manufacturers and developers are constantly working on improving the skills of collaborative robots in order to ensure a smooth interaction between a robot and a user. Thus, robots are now capable of actions such as accompanying humans and handing over objects. However, when designing successful human–robot interaction (HRI), it is not only necessary to develop the required technical skills but also to ensure a high usability and positive user experience. People will only accept and frequently use a robot if the interaction is experienced as intuitive, pleasant and meaningful for the user. Moreover, previous research has indicated that people are more willing to interact with a robot that shows some social behavior [4]. A social robot can be characterized as a robot that “interacts and communicates with humans by following the behavioral norms expected by the people with whom the robot is intended to interact.” [5]. Consequently, the interactive behavior of a social robot has to fulfill complex requirements.
In human–human interaction, the way we perceive our social interaction partner largely depends on our previous interaction experiences and expectations. The human brain is trained to recognize and interpret behavioral patterns of other humans. This internalized mechanism of pattern expression and recognition works instantaneously and effortlessly: We automatically know what the other person wants to express and automatically show certain verbal and non-verbal behavior ourselves in order to achieve a particular communication goal. Interaction with intelligent technology and autonomous systems such as robots works similarly: By using a system, we learn how it behaves in response to our input and gradually adjust our expectations towards the system. This learning process can, however, only take place if the system behaves in a consistent way, i.e., it shows recurring behavioral patterns that are easy to understand and interpret by the user. This concept of behavioral patterns should ideally be applied to different robot appearances and use cases, thus ensuring a seamless interaction between different robots and tasks. To achieve this goal, a shared behavioral language for social robots needs to be developed. Shared design languages are well-known in the area of graphical user interfaces where design knowledge and best practices are often formalized and documented in such a way that they can easily be reused by others. This is, however, not the case for HRI, where design knowledge is often fragmented. Many insights are obtained by conducting user studies that focus on either a specific robotic platform or appearance (e.g., [6,7,8]), a particular communication modality (e.g., [9,10]) or a specific use case (e.g., [11,12]). Each research group uses their findings to develop their own behavioral expressions for a particular robot and specific interaction situation, thereby producing a broad variety of behavioral expressions for social robots. While this is not a problem for the HRI research community per se, it can have negative effects on the overall interaction experience of the users who have to learn the interaction “language” of each robot individually.
In the present paper, we describe our approach to applying the concept of behavioral patterns to HRI with the goal to create a methodology and structured format to document reusable design suggestions for recurring communication goals, i.e., behaviors of robots that humans encounter frequently and in different interaction scenarios. Our pattern approach includes a defined design process. This process guides HRI researchers and designers through the different steps of pattern creation: identifying recurring interaction situations, gathering existing insights from HRI research and design best practices, transferring them into reusable behavioral expressions for social robots, and iteratively testing and improving these patterns. The generated behavioral expressions are documented as behavioral patterns. Each pattern includes design recommendations that are applicable for different types of robots and various interaction scenarios. Thereby, the pattern approach provides a structured method to consolidate existing design knowledge in HRI, which—in the long run—reduces the workload for HRI designers and enhances the user experience (UX).

2. Related Work

2.1. Human–Robot Interaction Design

As for any other interactive technology, the goal of HRI design should be to create robots that offer a high usability and promote a positive UX. This means that, for one, the robot’s behavior supports the user in achieving their goals in an effective, efficient and satisfactory way [13]. We hereafter refer to behavioral expressions of social robots that offer a high usability as comprehensible. The concept of usability is extended by the concept of UX, which comprises a “person’s perceptions and responses resulting from the use and/or anticipated use of a product, system or service” [13]. Interaction design aims to promote a positive evaluative feeling during the interaction [14]. In this work, behavioral expressions for social robots that provide a positive UX are referred to as pleasant.
A high usability and positive UX can positively influence the frequency of use, product bonding, error tolerance and overall acceptance [15]. Designing for usability and UX can hence be considered crucial in HRI design. To this end, four different approaches can be applied:
  • Robot-centered HRI bases design decisions on the assumption that the robot is an agent with its own goals. Behavior design aims to equip the robot for “surviving in the environment” or “fulfilling internal needs” (e.g., emotions) [16].
  • Robot cognition-centered HRI thinks of the robot as an intelligent system that can make decisions and solve problems. The behavior of the robot is equated with the behavior of the software system. Behavior design thus evolves around questions of machine learning, problem solving, and cognitive robot architectures [16].
  • Design-centered HRI refers to the robot as a product that was created to provide the user with a certain experience. To this end, form, modality, social norms, autonomy and interactivity are defined as relevant characteristics of the robot [5].
  • Human-centered HRI puts people’s reactions and interpretations of the robot’s appearance and behavior in the center of attention. It focuses on the goal of designing robot behavior that is acceptable and comfortable for the human interaction partner [16].
The robot-centered and robot cognition-centered approaches are primarily technology-driven, focusing on developing and improving technical skills of the robot in order to facilitate HRI. While these technical skills are important to ensure high usability, a technology-centered perspective does not directly contribute to UX, as the user is not part of the design process. The design-centered view stresses the relevance of the user. However, it is aimed more at classifying robot behaviors than at developing design recommendations for the behavior of social robots. When working on the design of behavioral expressions for social robots, one should therefore adopt the human-centered approach. In the context of robot behavior, the design process should start with questions such as: Which impression should the robot make on the user? Which information should it convey with its behavior?

2.2. Behavioral Expressions for Social Robots

A robot typically consists of a software system and a physical body. When designing robots, it is hence essential to consider their appearance and non-verbal behavior. Therefore, the question of how to design expressive robot behavior is an important part of HRI research. In this connection, expressive behavior is understood as a configuration of the robot’s actuators to convey information or visualize an internal state using body parts of the robot. Each behavioral expression serves a certain communication goal. Table 1 provides an overview of the actuators and communication modalities that are generally available for the behavioral design of social robots.
The research field of HRI is very diverse: There are a lot of different robot forms and robotic platforms, various actuators and communication modalities and many different use cases for social robots. Thus, researchers usually set a narrow scope for their work, focusing on a specific robot, one or a few particular modalities, or a predefined use case. As a result, the behavioral expressions that they design cannot easily be transferred to other scenarios of use. The following examples illustrate the heterogeneity of existing research:
  • Robot-specific expressions: Social robots can have different appearances—humanoid (human-like), animoid (animal-like) and abstract robots (not inspired by the form of any living organism). Studies in HRI often concentrate on a particular robot, which can be self-built or commercially available. Examples of humanoid robots are SoftBank’s Nao (SoftBank Robotics), Honda’s Asimo (American Honda Motor Co. Inc.) or Furhat (Furhat Robotics). Prominent examples of animoid robots in HRI research are the robots iCat [17], Aibo (Sony Corporation), Paro (Paro Robotics) or Kismet [4]. Abstract robots are hardly ever the object of investigation in social robot studies.
  • Modality-specific expressions: Reviewing the body of research, we found that scientific studies on the behavioral design for social robots tend to focus on selected communication modalities (instead of using the full range available as displayed in Table 1). There is, for example, extensive research about gaze behavior of humanoid robots [9,18,19]. Furthermore, gestures, sound and light are often researched in combinations, especially in connection with emotional expressions for humanoid robots [7,20,21]. Other modalities that have received increased interest are proxemics (physical distance between human and robot) [10,22] and speech.
  • Use Case-specific expressions: Social robots can be applied in different settings and assume various roles. Common application areas for social robots in HRI research are playing games [7,23], teaching [11,24] or treating autistic children [25,26]. Simple collaboration tasks are also often used to investigate the interaction between humans and robots [27,28].
To our knowledge, no approach exists that integrates all three levels of behavioral design.

2.3. Design Patterns

As mentioned above, social interactions between humans are basically composed of behavioral patterns. The concept of patterns as building blocks for interactions also applies to human–technology interaction (HTI). As in human–human interaction, we develop expectations towards a technical system by observing and remembering consistent behaviors that it shows. In different HTI domains, design patterns have been proposed as a method to record these behaviors in a structured way. The generation of design patterns in HTI is based on an analysis of existing products (e.g., software applications, websites or interactive technology), during which proven solutions to common design problems are identified. By noting down these solutions as patterns, they are documented in a structured way and made available as building blocks for future design solutions.
The concept of design patterns was first mentioned by Alexander [29], who was looking for a way to assemble, document and structure architectural design knowledge. He established the following definition of design patterns: “Each pattern describes a problem which occurs over and over again in our environment, and then describes the core of the solution to that problem, in such a way that you can use this solution a million times over, without ever doing it the same way twice. […]” [29] p. x. Alexander did not claim that each pattern describes the ideal solution for the design problem at hand but rather proposed to start documenting existing design knowledge and iteratively improve it: “The patterns are very much alive and evolving—in fact […] each pattern may be looked on as a hypothesis […]. In this sense, each pattern represents our current best guess as to what arrangement of the physical environment will work to solve the problem presented.” [29] p. xv. Alexander’s patterns were mainly described as a text, structured in different paragraphs (pattern name, validity ranking, context, problem description and solution), and accompanied by a photo and diagram of the design solution. He also defined dependencies between different patterns, thus forming a pattern language.
The pattern approach has been successfully transferred to different application areas in the field of HTI, such as software design [30], interface design [31] and interactive exhibits [32]. Borchers also proposed a pattern language for the entire HTI community to share successful design solutions among professionals and to provide a common language for HTI design to anyone involved in the design, development, evaluation, or use of interactive systems [32].
Pattern approaches offer a methodology to document design knowledge in a structured and explicit way, which contributes to the generation of reusable design knowledge. At the same time, this documentation enables a knowledge exchange between different disciplines. Still, it appears to be challenging to apply the process of pattern generation in HTI to HRI. Based on Alexander’s initial pattern approach, patterns for software or interface design are generated by comparing and analyzing existing applications (such as software programs or websites) and extracting design solutions that have been proven successful. Social robots are still an emerging technology, and there are no existing proven solutions that can be analyzed. On the contrary, in this context, patterns can only be understood as a way of documenting design guidelines for future realization of social robots. Nevertheless, the concept of design patterns is not new to the HRI community. Some propositions have been made to apply the pattern approach described above to HRI.
Kahn et al. [8] have proposed eight design patterns for HRI, using the following methods: The patterns were developed based on the analysis of human social interactions and child–robot interactions, including insights from experiments, philosophical considerations and the authors’ own observations. They were then iteratively improved through laboratory studies with children and the humanoid robot Robovie. Each pattern describes recurring interaction situations (e.g., “initial introduction”, “in motion together”, “reciprocal turn-taking in game context”). The patterns are noted down as text and are generic, so that they can be applied to different robots. However, the pattern description does not provide any hints about how to design the expressive behavior of the robot. It contains a short description of the pattern, which is illustrated by an example dialog between user and robot encountered during the laboratory studies. The use of other modalities than speech is not made explicit for most of the patterns.
Peltason and Wrede [33,34] make use of interaction patterns in their Pamini (pattern-based mixed-initiative) framework for dialogue modeling. They propose 28 generic dialogue patterns, i.e., abstract descriptions of recurring dialogue structures in human–robot communication (e.g., “making a proposal”, “requesting information”). Other modalities are not included. The patterns are documented as transducers using state charts, which support the use of the patterns in software engineering. As this work primarily makes use of patterns for the sake of modeling, detailed specifications in the form of design recommendations are not provided.
Sauppé and Mutlu [35] examined how interaction design patterns can be used for exploring and prototyping HRI. The authors understand design patterns as recurring verbal structures that can be used as building blocks to define the interaction flow between human and robot on a conceptual level. Based on a formative observation study of human–human conversations, they derived seven patterns that can be used to prototype interaction flows with the Nao robot in the Interaction Blocks authoring environment. A qualitative evaluation study showed that these patterns can facilitate HRI prototyping processes. Recommendations for the concrete verbal and non-verbal behavior of the robot are not provided.
While this existing research highlights the potential of patterns in HRI design, it mainly focuses on verbal aspects of interaction design. Our research extends the previous work by demonstrating how the concept of patterns can be used to aggregate and document design knowledge about non-verbal aspects of HRI and create reusable behavioral expressions for social robots. To this end, we propose a pattern approach to HRI design consisting of three components:
  • A structured, user-centered design process that provides step-by-step guidance for how to create non-verbal behavioral expressions that are perceived as comprehensible and pleasant by the users,
  • A format and conventions for documenting the created patterns and connecting them to form a shared HRI design language, and
  • A pattern language for social robots that demonstrates how the design process and documentation format can be used in practice and contains 40 patterns with concrete recommendations for the design of non-verbal behaviors for robots that offer companionship and entertainment.
In Section 3, we describe the different steps of the design process in detail. We present methods and materials used to support each step and illustrate their application based on an example use case. The formal notation format that can be used to document the resulting pattern ideas is shown and explained in Section 4. Finally, in Section 5, we present the pattern language for social robots that was created using the design process and notation format. The pattern approach was first introduced in our previous work [36]. The present paper presents a revised and more detailed version of the initial approach, including an example use case and user study.

3. How to Create Behavioral Design Patterns for Human–Robot Interaction

The proposed pattern design process is based on the traditional human-centered design process (HCD) [13]. The HCD proposes an iterative design of interactive systems and typically contains four phases: The Analysis phase is aimed at understanding the context of use including the users, tasks and characteristics of the environment. In the Interpretation phase, user requirements are specified based on the insights of the Analysis phase. Next, in the Design phase, prototypes and design solutions are developed to meet these user requirements. Finally, in the Evaluation phase, the developed solutions are tested against the user requirements. Our pattern design process follows similar principles but adjusts the different steps to meet the domain-specific requirements of HRI by guiding the design team through the process of generating reusable behavioral expressions for social robots in a bottom-up way. As described above, other pattern approaches in HTI follow a top-down process and derive patterns from the pool of available design solutions. Our design approach builds upon the methods proposed by Kahn et al. [8], who suggest using a bottom-up process that builds on empirical research in HRI, philosophical considerations and conventions in human–human interaction. We extend this idea of a bottom-up pattern generation and include instructions for the design team on how to gather relevant research results, best practices and insights and transfer them into a particular behavioral expression. The process consists of three phases with three steps each (Figure 1), which will be described in detail in the next sections.

3.1. Example Use Case

To illustrate the different steps of the design process, we use the example of a quiz game. In this use case, the robot acts as a quiz master who challenges the user with questions. The goal of the quiz game is to entertain the user, while at the same time providing an opportunity for brain training. To serve this goal, the robot not only poses the quiz questions but also acts as a kind of coach who motivates the user to continuously engage in brain training.

3.2. Phase 1: Analysis

The goal of the Analysis phase is to identify those interaction situations for which it makes sense to develop reusable design solutions in the form of patterns. For the relevant interaction situations, communication goals are defined that describe what exactly the robot is supposed to express in the given situation. For our example use case, we started with the different phases of a quiz game: In the preparation part, the interaction is initiated. The robot explains the rules and, after receiving a confirmation of the user, starts the game. What follows is a number of consecutive quiz rounds. During each round, the robot asks a number of questions. The user provides their answer and receives feedback from the robot whether it is correct. In addition, the robot motivates the user to keep up their good performance. The use case is terminated with the closing part during which the game is summarized and, with the consent of the user, ended by the robot. The three phases of the quiz game have been chosen as an example to demonstrate the design process. Of course, the three steps of the Analysis can be carried out with a different number and more diverse compilation of use cases to identify communication goals that are generalizable across various contexts.

3.2.1. Decomposing Use Cases

As a first step, the method of Essential Use Cases (EUC) [37] is used to analyze the different parts of the quiz game and identify recurring interaction steps of the robot. EUCs break the interaction course down into user actions and robot actions and note them down in an abstract way that is meaningful but does not propose a concrete design solution. The three parts of the quiz game were analyzed using EUCs and noted down as flow charts (Figure 2). The crucial part for generating the robot behavior patterns is the robot actions. Thus, for demonstration purposes, actions of the user are kept to a minimum in the presented flow charts.

3.2.2. Identifying Recurring Robot Interaction Steps

The flow charts provide a very detailed overview of each robot action as a combination of smaller sub-actions. When comparing the robot actions and sub-actions depicted in the three flow charts, it becomes obvious that most of them occur repeatedly throughout the course of the quiz game. These recurring interaction steps of the robot are marked and noted down in a list (Table 2). They are the starting point for the pattern generation.

3.2.3. Specifying Communication Goals

The list of recurring interaction steps is further specified by associating each robot action with a communication goal (Table 2). The communication goal describes which message the robot behavior should communicate to the user. To make it more vivid and emphasize the effect on the user, the communication goal should be phrased from the perspective of the robot (e.g., “I am currently processing the information you provided.”).
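For illustration, the output of this step can also be kept in a simple machine-readable form, for example as a mapping from recurring robot actions to communication goals. The following sketch is purely illustrative and not part of the method itself; the action names and goal phrasings are taken from the quiz game example described in this section.

```python
# Illustrative sketch (not prescribed by the pattern approach): recording the
# recurring robot actions of the quiz game together with their communication
# goals, phrased from the perspective of the robot as recommended above.
communication_goals = {
    "listen to user": "I am listening to what you are currently saying.",
    "direct attention to user": "My full attention is on you.",
    "record user input": "I am recording spoken information from you.",
}

for action, goal in communication_goals.items():
    print(f"{action}: {goal}")
```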

3.3. Phase 2: Creation

The Creation phase provides guidance on how to generate a concrete behavioral pattern for a communication goal. The pattern generation is based on existing design knowledge, which is assembled and documented in a structured and inspiring way, and then translated into specific design recommendations for the robot behavior in an ideation workshop. The three steps of the Creation phase need to be carried out for each communication goal individually. For demonstration purposes, we illustrate the three steps based on the example communication goals of the robot action “listen to user” and its sub-actions:
  • “I am listening to what you are currently saying.” (action: listen to user);
  • “My full attention is on you.” (sub-action: direct attention to user);
  • “I am recording spoken information from you.” (sub-action: record user input).

3.3.1. Collecting Relevant Insights

Our pattern approach suggests a bottom-up pattern generation. This means that the patterns are not generated through pattern mining from existing HRI design solutions. Insights into how to let a robot express a certain communication goal can be drawn from research findings in psychology, social science, HTI, HRI and other relevant domains, combined with best practices from daily interactions with humans, animals or technology. As a first step towards creating a pattern, desktop research has to be conducted to acquire all the relevant findings and best practices. The desktop research is based on the six steps of a literature review by Galvan and Galvan [38].
The results of the desktop research are meant to stimulate the step of idea generation for the patterns. For this purpose, they need to be documented and communicated in a way that is intuitive and graspable for the whole design team. To this end, we developed a workshop tool called Insight Board. An Insight Board displays the relevant insights from the desktop research for one communication goal as a collection of expressive pictures, sketches and text. Insight Boards are very similar to Mood Boards [39] or Vision Boards [40], which are frequently used in design processes as tools for visual communication with the goal of conveying vivid pictures of relevant pieces of information, allowing the design team to obtain an intuitive feeling about the design task. Insight Boards differ from other visual inspiration boards insofar as they do not just provide a collection of loosely related inspirations but document “hard facts” gathered through the desktop research in an intuitive, visual way.
Figure 3 presents the Insight Board for the communication goal “I am listening to what you are currently saying”. The Insight Board shows examples for how the communication goal is addressed in human–human interaction (purple area), human–animal interaction (blue area) and human–technology interaction (grey area). It combines insights for the sub-actions “recording user input” (upper areas) and “direct attention to user” (lower areas).

3.3.2. Ideating Design Solutions

The Insight Boards are used to support the ideation of design solutions for a communication goal. To this end, an ideation workshop is conducted with two to four members of the design team. The ideation workshop makes use of Design Thinking methods (Story-Share and Capture [41] and brainstorming), takes approximately 30 min and consists of four parts:
  • Presentation of Insight Board: The team member who conducted the desktop research presents the Insight Board, the others follow the presentation and note down aspects that they find most relevant and inspiring through active listening. They also write down additional ideas that come to their mind.
  • Sharing and Clustering Insights: The aspects that were noted down are discussed by the design team and clustered.
  • Ideation: Based on the clusters, the design team brainstorms ideas for mapping the gathered insights to the behavior of robots in order to express the communication goal at hand.
  • Sharing and Clustering Ideas: The resulting ideas are, again, shared within the design team and organized in idea clusters. The idea clusters form the basis for the next step, the pattern specification.
For the three example communication goals, ideation workshops were conducted by two members of the design team based on the Insight Boards. Figure 4 presents the idea clusters for the robot action “listen to user” produced by the brainstorming. The idea clusters combine results from the ideation workshops for the sub-actions “record user input” and “direct attention to user”.
To sum up, the main insight drawn from the ideation workshop was that the robot actions “record user input” and “direct attention to user” refer to two different ways of expressing attention towards the user:
  • The communication goal “I am recording spoken information from you.” addresses the auditory attention expressed by the robot towards verbal user input, or—speaking in more technical terms—the voice recording by the robot’s microphone. In the course of the ideation workshop, the designers came to the conclusion that, depending on the capabilities and appearance of the robot, this auditory attention could be expressed through a dedicated dynamic body expression (turning the ears towards the user) or through a specific light signal that augments the robot’s ear or microphone.
  • The communication goal “My full attention is on you.” refers to visual attentiveness towards the user, as shown in social interaction by facing another person and maintaining eye contact. Similarly, to achieve this communication goal, the robot’s front and eyes can be turned towards the user. For robots without a torso or eyes, the communication goal could be expressed by pointing the camera towards the user.
  • The superior communication goal “I am listening to what you are currently saying.” combines design ideas for the other two communication goals into one behavioral expression.

3.3.3. Specifying Multimodal Behavioral Patterns

The results of the ideation workshop form the basis for the creation of concrete behavioral patterns for social robots. The pattern development is supported by the Modality Card Deck [42]. We developed this card deck to guide the design team through the task of designing multimodal patterns for the robot behavior to express the communication goal. The cards can be used to refine the ideas produced by the ideation workshop by selecting and elaborating on appropriate communication modalities as well as to document the design solutions in a precise way.
The Modality Card Deck contains cards for ten modality categories, which were derived from the body of research presented in Section 2.2. For each category, the card deck includes four different cards (for examples, see Figure 5):
  • Decision cards contain the name and a short description of the modality. Modalities are color-coded and labeled with an expressive icon so that they are easy to distinguish from one another. As a first step, the design team puts all ten decision cards on the table and chooses the categories that are required to realize the design ideas produced in the ideation workshop. For the selected categories, the three cards described below are placed on the table and their instructions are followed. All other cards can be put aside.
  • Investigation cards provide one or two questions that further specify the design space for a category, thus directing the attention of the designers towards specific aspects that are important when using this communication modality.
  • Parameter Cards provide a list of parameters that need to be specified in order to create precise and implementable behavioral patterns.
  • Idea Cards are empty cards that can be used to document the specifications for the selected modality and the communication goal at hand, based on the questions and parameters proposed by the two previous cards.
Using the Modality Card Deck, the design team selected the appropriate modalities for the three example communication goals to refine the ideas produced by the ideation workshop. Figure 5 shows the selected decision, investigation and parameter cards as well as the idea cards with the proposed design solutions for the robot action “listen to user”.
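As a complement to the card-based documentation, the specifications noted on the idea cards could also be captured digitally. The following sketch is a hypothetical rendering of such a specification for the robot action “listen to user”; the modality categories follow the examples discussed above, whereas all parameter names and values are illustrative assumptions rather than the content of the actual idea cards.

```python
# Hypothetical sketch of a pattern specification after working through the
# Modality Card Deck. Modality categories follow the examples in the text;
# the parameter names and values are illustrative assumptions.
listen_to_user_spec = {
    "pattern": "listen to user",
    "communication_goal": "I am listening to what you are currently saying.",
    "modalities": {
        "dynamic body expression": {
            "description": "turn the ears/microphone towards the user",
            "speed": "slow",                      # assumed parameter
            "duration": "while the user speaks",  # assumed parameter
        },
        "gaze": {
            "description": "face and eyes oriented towards the user",
        },
        "light": {
            "description": "light signal augmenting the ear/microphone area",
        },
    },
    "composed_of": ["record user input", "direct attention to user"],
}
```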

3.4. Phase 3: Evaluation

The last phase of the pattern design process is the Evaluation phase. It is aimed at investigating whether the developed behavioral patterns are experienced as comprehensible and pleasant by the users. To support the evaluation, the patterns need to be implemented as a prototype so that they can be shown to and experienced by the users. Using these prototypical implementations, user feedback can then be acquired by conducting a user study. Based on the study results, the patterns can be iteratively improved until they prove to provide a sufficient usability and user experience for the users. To demonstrate how the different steps are carried out in practice, we present our approach to implementing the pattern for the robot action “listen to user” on a specific robot (the MiRo robot, see Figure 6). We also describe how we evaluated the prototypical implementation of the pattern as part of a larger user study.

3.4.1. Prototyping Patterns

In order to be able to evaluate the designed patterns with users, they first need to be implemented on available robot platforms. We decided to implement patterns on the level of main actions (and not the level of sub-actions) for two reasons: First, patterns for sub-actions mainly serve the purpose of making different aspects of the robot’s behavior reusable in different expressions. Hence, they are always used in combination with other sub-patterns to form a pattern for a robot main action. Therefore, they will never be implemented in isolation and thus are difficult to evaluate on their own. Second, according to our experience with available robot platforms, the flexible combination of sub-actions and their parallel execution at runtime is not well supported. For example, it appears difficult to specify timing dependencies between them such as “end at same point in time”.
The implementation of the previously designed patterns on specific robots depends on their respective interaction capabilities and limitations. In many cases, an exploratory process will be required to find the best compromise between the behavioral design recommendations suggested by a pattern and the expressive capabilities of a robot. This process can be supported by interactive authoring tools such as MiRoCODE [43] for the animoid robot MiRo [44] or the Choregraphe Suite [45] for the humanoid robot Pepper [46]. These tools typically include 3D simulations of the respective robots that help to obtain a first impression of the pattern implementation. However, the implemented patterns should also be tested on physical robots as, for example, the detailed timing of movements may differ significantly between simulated and real-life demonstrations. Depending on the robot platform, implemented pattern prototypes can directly be deployed to the robot or converted into source code that uses the specific programming interface of the robot. Following our example, the prototypical implementation of the robot action “listen to user” for the MiRo robot is presented in Appendix C.
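To give a flavor of what such a prototype can look like independent of a specific platform, the following sketch orchestrates the two sub-action patterns of “listen to user” in parallel on a generic robot interface. It deliberately abstracts from any concrete robot: the Robot class, its methods and the thread-based orchestration are assumptions introduced for illustration and do not represent the MiRoCODE or Choregraphe APIs.

```python
import threading

class Robot:
    """Placeholder for a platform-specific robot interface (assumed)."""
    def turn_ears_towards(self, direction):
        ...  # e.g., rotate the ear joints towards the sound source
    def face_user(self, direction):
        ...  # e.g., turn head/torso and gaze towards the user
    def set_ear_light(self, on):
        ...  # e.g., switch a light signal near the microphone on or off

def listen_to_user(robot, user_direction):
    """Composed pattern 'listen to user': express auditory and visual attention
    at the same time by running both sub-action patterns in parallel and
    waiting for both to finish ('end at the same point in time')."""
    record = threading.Thread(target=robot.turn_ears_towards, args=(user_direction,))
    attend = threading.Thread(target=robot.face_user, args=(user_direction,))
    record.start()
    attend.start()
    record.join()
    attend.join()
    robot.set_ear_light(True)  # keep a light signal on while recording input
```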
When evaluating patterns for robot actions in HRI, the influence of other aspects of the interaction on the evaluation results should be limited to a minimum. Particularly, technical shortcomings of the robot such as difficulties in recognizing user actions or speech input should not become apparent for the user during the test, as they might confound the overall UX. In the case of the MiRo robot, we found that the built-in voice recognition was insufficient. We therefore implemented a Wizard-of-Oz prototype of the quiz game use case introduced in Section 3.1. Using an Abstract Application Interaction Model (AAIM [47,48]), the implemented patterns were orchestrated to form the quiz game’s interaction flow (as suggested by the EUC).
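A minimal sketch of such an orchestration is given below. It approximates the idea of sequencing implemented patterns along the EUC flow with a simple loop; the actual AAIM and the wizard interface used in the study are not reproduced here, and all names are assumptions introduced for illustration.

```python
# Minimal sketch: sequencing implemented pattern prototypes along the quiz-game
# flow. In the study, a wizard controlled the progression (Wizard-of-Oz); here,
# a placeholder wizard_confirms() stands in for that input.
QUIZ_FLOW = [
    "explain rules",               # preparation part
    "ask quiz question",           # quiz round (repeated per question)
    "listen to user",
    "provide feedback to user",
    "motivate good performance",
    "summarize game",              # closing part
]

def run_quiz(robot, patterns, wizard_confirms):
    for step in QUIZ_FLOW:
        patterns[step](robot)          # execute the implemented pattern
        if not wizard_confirms(step):  # wizard decides when to move on
            break
```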

3.4.2. Gathering User Feedback in a User Study

The prototypical implementations of the patterns can be shown to users in order to evaluate their comprehensibility and pleasantness. This can be accomplished in different ways, using qualitative or quantitative methods. Hereinafter, we provide an example of how such a user study can be conducted. Our example pattern “listen to user” was evaluated as part of a larger user study in the context of the quiz game. For demonstration purposes, we only present the results of the evaluation of the example pattern. Twelve other patterns were evaluated in the same way.
The study was conducted as an online survey and consisted of three parts:
  • In the first part, participants were asked about their demographic data (age and gender) as well as their attitude towards and previous experiences with social robots.
  • In the second part, they were shown a 6-minute video of a user playing the quiz game with the MiRo robot. Throughout the course of the game, the robot displayed a number of different patterns to communicate its state and intentions to the user, including the pattern for “listen to user” (see Video S2 in the Supplementary Materials or refer to the Robot Behavior Pattern Wiki for Human–Robot Interaction described in Section 5.1). This video was included to provide a meaningful context for the patterns and immerse the participants in the experience of interacting with the robot. After the video, participants were asked to report their first impression of the robot using the Robot Attitude Scale [49].
  • The third part was the actual pattern evaluation. Participants watched short video snippets, each of which showed MiRo expressing one specific pattern. After each snippet, participants were asked to evaluate the comprehensibility and pleasantness of the pattern based on four questions:
    • In your opinion, what does the robot shown in this video snippet want to express?
    • Which behavior of the robot led to your opinion?
    • How comprehensible was the behavior of the robot to you? (scale from 0 = not at all to 3 = very), and
    • How pleasant was your experience of the behavior of the robot? (scale from 0 = not at all to 3 = very).
    After watching all 13 videos, the participants were thanked for their participation, and the survey was ended.
A total of 42 participants completed the online survey (21 female, 20 male, 1 other). Their age ranged from 24 to 90 years (M = 46.6, SD = 20.8). Thirteen participants were familiar with social robots from TV documentaries. Seven of them had already encountered an assistant robot in real life, and one of them had even interacted with it. Five participants had a robot at home.
To analyze the answers to the two open questions, a qualitative content analysis was conducted by two members of the design team, based on the method proposed by Mayring [50]. For the first question inquiring about participants’ interpretation of the robot behavior, one team member derived the categories from the answers in an inductive way. After analyzing 40 percent of the answers, a formative reliability check was conducted and the initial categories were reviewed and refined. The analysis produced 31 categories, which were then used by a second team member for a second round of analysis. The results of the two rounds of content analysis were compared and consolidated in a follow-up workshop. To aggregate the results of the content analysis, we carried out a quantitative analysis per pattern based on the categories’ frequencies of occurrence. Moreover, the comprehension rate for each pattern was calculated by determining the percentage of participants whose answer included at least one category related to the intended communication goal. For the content analysis of the second open question, we used the ten modalities of the Modality Card Deck as deductive categories. Again, we determined the frequencies of occurrence of the categories for each pattern. The remaining questions were analyzed by calculating the means of the ratings provided for the comprehensibility and pleasantness.
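The comprehension rate described above can also be expressed as a small computation: a participant counts as having understood a pattern if at least one of the categories coded for their answer relates to the intended communication goal. The sketch below illustrates this calculation; the category labels are hypothetical examples, not the actual coding scheme.

```python
def comprehension_rate(coded_answers, intended_categories):
    """Percentage of participants whose coded answer contains at least one
    category related to the intended communication goal."""
    hits = sum(1 for categories in coded_answers
               if set(categories) & set(intended_categories))
    return 100.0 * hits / len(coded_answers)

# Hypothetical coding of three answers for the "listen to user" pattern:
answers = [["listening to speech input"],
           ["focusing attention on user"],
           ["listening to speech input", "moving ears"]]
print(comprehension_rate(answers, {"listening to speech input",
                                   "focusing attention on user"}))  # 100.0
```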
For demonstration purposes, we hereafter present the results for the example pattern for “listen to user”.
Figure 7 shows the categories that were used to describe the robot behavior for the communication goal “I am listening to what you are currently saying.”, ordered by their frequency of occurrence. A total of 34 participants mentioned in their answers that the robot seemed to be listening to the user’s speech input. Eight participants also mentioned that the robot appeared to be focusing its attention on the user. The comprehension rate was calculated to be 100 percent, as each participant mentioned at least one of the categories that was intended by the pattern description produced in the Creation phase.
Figure 8 shows the modality categories for the aspects of the robot behavior that participants used to justify their interpretation. Participants who interpreted the behavior of the robot as an act of listening based their decision on the robot’s Dynamic Body Expression, mainly the movement of the auricles towards the user. Five participants described the robot’s upright posture as the reason for their interpretation. Two participants referred to the sound the robot made when moving its joints, and one mentioned the robot’s gaze.
The results indicate that, for the majority of the participants, the robot conveyed the communication goal as intended, although the sub-action “record user input” was recognized better than the sub-action “direct attention to user”.
Participants rated the comprehensibility and the pleasantness of the behavior as high (M = 2.76, SD = 0.48 and M = 2.31, SD = 0.67, respectively).

3.4.3. Improving Patterns

The last step examines how each behavioral pattern might be improved based on the feedback gathered in the user study.
The need for improvement can be determined based on the mean ratings for comprehensibility and pleasantness as well as the results of the content analysis. In the case of our example pattern for “listen to user”, the study results indicate that the pattern does not need revision. It passed the validity tests, which should be indicated by increasing its ranking in the documentation (as proposed by Alexander [29] and described in Section 4).
If a pattern receives low ratings, this might be due to three different reasons:
  • Methodology of the user test: How participants experience the comprehensibility and pleasantness of the robot is influenced by the way it is presented in the user test. Despite careful preparation, the interaction with the robot or general study procedure might turn out less smooth than intended, e.g., due to the testing environment or technical setup. Participants’ ratings are also influenced by their current mood or general attitude towards robots. Similarly, the perception of individual robot behaviors may depend on the use case or scenario. Thus, it should first be checked whether the study results reveal any hints of potential methodological limitations of the user test. If such shortcomings are revealed, the study results need to be interpreted with care, and the design team might want to consider repeating the user study with improved methodology.
  • Implementation on the robotic platform: The prototypical implementation on the robot is often only one way to realize the design recommendation proposed by the pattern description. Low ratings might indicate low comprehensibility and pleasantness for this concrete implementation, rather than for the pattern itself. The design team should therefore make sure that the implementation incorporates all relevant aspects proposed by the pattern and consider whether there might be other, more appropriate ways to realize the expression on this robot.
  • Behavioral expression: If neither the study methodology nor the implementation can be held accountable for low ratings, it can be assumed that the pattern itself needs revision. In this case, the design team needs to go back to the work they conducted in the Creation phase and revise it in light of the results of the user study. Improving the pattern description could mean either of the following options: making changes to the specifications of a single communication modality, altering the combination of modalities, or modifying the timing of the modalities. Additionally, the results of the user test might also reveal ideas for new patterns or variants of the same pattern. For example, there might be alternative ways to express the defined communication goal in the context of the specific use case. Given the iterative nature of the design process, revised patterns need to be tested again.

4. How to Document Behavioral Design Patterns in a Structured Way

In order to be reusable and address recurring design problems, patterns are commonly documented in a predefined format. A structured notation format benefits interaction design in two ways: It helps the creators of the patterns to document the developed design solution, and it provides guidance and all necessary information for the designers who want to make use of the patterns.
Patterns are commonly documented as text, structured into different paragraphs and accompanied by pictures and diagrams [51]. The paragraphs are also called pattern components. Each component presents a certain aspect of the pattern [32].
Our proposed pattern approach to social HRI design builds upon previous pattern approaches by combining pattern components from the HCI pattern approaches by Borchers and Tidwell [32,52] and in the Robot Operating System (ROS) Enhancement Proposals used to document reusable design solutions for robot programming [53]. Thereby, it is ensured that the pattern notation is both scientifically accurate and applicable for practical robot design. To take into account domain-specific requirements, some of the components were slightly modified and received different labels.
The pattern notation format consists of five main parts, each of which contains one to five components (see Table 3).
The first component is the Name of the pattern, which also serves as its unique identifier. The name should provide a reference to the core solution of the pattern.
The second part, the Preamble, contains relevant practical information about the pattern: The Pattern Type summarizes the function of the pattern as keywords, i.e., whether it refers to a main or sub-action of the robot. The Ranking describes how often the pattern has been validated with users in the Evaluation phase and thus how sure a designer can be that the pattern provides a solution that offers high usability and positive UX. The next two components, Version and Author, document the process of pattern creation and make it comprehensible.
The third and the fourth parts contain the crucial components of the patterns. The third part, Design Challenge, introduces the design problem that is addressed by the pattern. It contains the Communication Goal, as specified in the Analysis phase of the design process, as well as the Interaction Situation in which the pattern can be applied. The fourth part, Design Solution, specifies the design recommendation that is proposed to express the communication goal in the given interaction situation. The essential information is provided by the component Solution that contains a description of the behavioral expression of the robot. Note that this description is very detailed for patterns that describe sub-actions, specifying the use and configurations of the different communication modalities as documented on the idea cards in the last step of the Creation phase. For main robot actions, the description is more high level. It references the more detailed descriptions of the sub-action patterns and contains information about how to orchestrate them. The Illustration provides a visualization of the solution, typically in the form of a picture or sketch. Additional visualizations are included as Examples. An example shows how the pattern can be implemented on a concrete robot and provides inspiration for how to use the pattern with one’s own robot. Further clarification of the reasoning behind the pattern solution, related literature and interaction mechanisms (i.e., the relevant results of the desktop research in the Creation phase) are summarized by the component Rationale.
The last part, References and Context, shows how the pattern is connected to other lower level (robot sub-action) patterns and higher level (robot main action) patterns, thus forming a pattern language (compare Section 5.1).
The pattern notations of the three example patterns can be found in Table A1, Table A2 and Table A3 in Appendix B.
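To show how the notation format maps onto a concrete pattern, the following sketch fills the components described above with abbreviated content for the example pattern “listen to user”. The component names follow Table 3; the field values are shortened and partly assumed and do not reproduce the full notation given in Appendix B.

```python
# Sketch of the notation format applied to the example pattern; values are
# abbreviated and partly assumed for illustration.
pattern_notation = {
    "Name": "Listen to User",
    "Preamble": {
        "Pattern Type": "robot main action (Composed Pattern)",
        "Ranking": 1,               # validated once in the quiz-game study
        "Version": "1.0",           # assumed
        "Author": "design team",    # assumed
    },
    "Design Challenge": {
        "Communication Goal": "I am listening to what you are currently saying.",
        "Interaction Situation": "The user is giving spoken input to the robot.",
    },
    "Design Solution": {
        "Solution": "Combine the sub-action patterns 'record user input' and "
                    "'direct attention to user' and orchestrate them in parallel.",
        "Illustration": "sketch of the behavioral expression",
        "Examples": ["implementation on MiRo", "implementation on Pepper"],
        "Rationale": "insights on auditory and visual attentiveness (desktop research)",
    },
    "References and Context": {
        "uses": ["record user input", "direct attention to user"],
        "used by": [],
    },
}
```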

5. How to Use Behavioral Design Patterns in Human–Robot Interaction Design

5.1. Making Patterns Accessible through a Pattern Language

When introducing the original pattern approach, Alexander stated: “No pattern is an isolated entity. Each pattern can exist in the world only to the extent that it is supported by other patterns: the larger patterns in which it is embedded, the patterns of the same size that surround it, and the smaller patterns which are embedded in it” [29]. This means that a pattern approach is only truly powerful if it contains patterns of different levels that can be combined in order to create the desired design solution.
Pattern approaches, therefore, never only describe single patterns but also how they are connected. Such a network of patterns is also referred to as a pattern language. The present pattern approach to social HRI design bases the network of its pattern language on the concepts of robot main actions and sub-actions. As explained in Section 3.2.1, each interaction step of the robot can be regarded as the main action. Each main action has different aspects that shape the behavior of the robot, which we call sub-actions.
Section 3 provided an example of the robot main action “listen to user” and the sub-actions “record user input” and “direct attention to user”. It also showed how the pattern for “listen to user” was developed as a combination of the solutions proposed by the patterns of the two sub-actions. The same patterns for sub-actions could be used as building blocks for other higher order patterns. As depicted in the quiz round part of Figure 2, the pattern for “direct attention to user” can, for example, be used as a component of the main actions “ask quiz question”, “request answer”, “provide feedback to user” and “motivate good performance”.
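These reuse relations can be pictured as a small graph. The sketch below captures only the examples mentioned in this section, not the full pattern language, and is meant solely to illustrate how Composed Patterns reference Atomic Patterns.

```python
# Illustrative reuse graph: Composed Patterns and the Atomic Patterns they
# build on (relations taken from the examples in this section).
composed_to_atomic = {
    "listen to user":            ["record user input", "direct attention to user"],
    "ask quiz question":         ["direct attention to user"],
    "request answer":            ["direct attention to user"],
    "provide feedback to user":  ["direct attention to user"],
    "motivate good performance": ["direct attention to user"],
}

# Inverse view: in which Composed Patterns is a given Atomic Pattern reused?
reused_in = {}
for composed, atomics in composed_to_atomic.items():
    for atomic in atomics:
        reused_in.setdefault(atomic, []).append(composed)

print(reused_in["direct attention to user"])
```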
Based on the quiz use case and related scenarios, we developed a pattern language of 40 patterns (16 for main actions, also called Composed Patterns, and 24 for sub-actions, also referred to as Atomic Patterns), following the three phases of the pattern design process.
The patterns have been made available through an online pattern wiki. The Robot Behavior Pattern Wiki for Human–Robot Interaction (https://pattern-wiki.iao.fraunhofer.de/ accessed on 16 August 2021) provides documentation of all 40 patterns and links them to each other. Thus, designers who want to use the patterns can easily switch between patterns. In order to make it easier to find the patterns, they have been structured into six categories describing different scenarios of use (Figure 9).

5.2. Tailoring HRI to Individual Users Using Pattern Variants

The perception of interactive systems in general and interaction behavior of social robots in particular depends on the characteristics of the specific user. Therefore, the personalization of HRI, i.e., tailoring the robot’s behavior to the individual currently interacting with the robot, has been proposed to promote the applicability and acceptance of robots in real-life settings [54,55].
The modular nature of the presented pattern approach provides the ideal structure to systematically design personalized HRI. To realize personalized HRI, different behavioral variants have to be designed. These variants can be designed as part of the Creation phase (see Section 3.3): Instead of designing just one pattern for each identified communication goal, the design team can develop different variants of the pattern to address different user groups. These pattern variants may, for example, use different modalities to express the communication goal for users with different sensory abilities or add different connotations to the main communication goal to match the user’s individual needs.
In the context of the quiz game example, the quiz master robot might show a different feedback behavior depending on the specific user’s characteristics. Community-oriented players might encounter a coach-like and empathic behavior, while more competition-focused users will be presented with a challenging and provocative quiz master [56].
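A minimal sketch of how such a variant could be selected at runtime is given below; the profile attribute “player_type” and the two variant labels are assumptions introduced for illustration.

```python
# Hypothetical sketch: selecting a pattern variant based on a user profile.
feedback_variants = {
    "community-oriented": "provide feedback (coach-like, empathic variant)",
    "competition-focused": "provide feedback (challenging, provocative variant)",
}

def select_feedback_pattern(user_profile):
    player_type = user_profile.get("player_type", "community-oriented")
    return feedback_variants[player_type]

print(select_feedback_pattern({"player_type": "competition-focused"}))
```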
Of course, meaningful pattern variants do not emerge out of thin air. Before creating the pattern variants, relevant user characteristics need to be gathered, ideally through user research activities. These user insights should then be used as additional input for the ideation workshop together with the Insight Boards.

6. Summary and Contribution

HRI designers are faced with the task of creating robots that are easy and pleasant to use. The UX of HRI could be increased by establishing a shared behavioral language for robots of different appearances and with different tasks. To achieve this goal, we suggest making use of the concept of design patterns. Design patterns have been proposed as a way to create reusable building blocks for HRI design [35]. Behavioral patterns are already an integral part of human social communication and defining such patterns for robots can be a step towards more intuitive and natural interactions with social robots in everyday life. In the present paper, we proposed our approach to applying the concept of design patterns to the design of non-verbal robot behaviors and demonstrated how it works based on an example use case. The approach can be followed by anyone who wishes to create behavioral expressions for social robots in such a way that they are appearance-independent and usable in different use cases. It consists of the following three components:
  • A three-phase design process that provides step-by-step guidance for HRI designers to generate reusable, high quality behavioral patterns for social robots. To this end, it describes methods to identify recurring interaction situations, define communication goals for the robot, gather relevant insights from the existing body of research and transfer them into specific behavioral robot expressions.
  • A pattern documentation format that provides a clear, standardized structure to note down all relevant aspects of a pattern, so that others can understand its design recommendations and use it for their own robot and use cases. Thereby, the pattern documentation format promotes knowledge exchange between HRI designers and supports the development of a shared design language for robot behavior.
  • A pattern language of 40 behavioral patterns that can be used for HRI design. Future research and social robot applications can build upon this pattern language and extend it by applying the proposed design process and documentation format.
While our pattern approach can be regarded as a first step towards establishing design standards in HRI, it still has some limitations.
Thus far, we have focused only on non-verbal behavior. The behavioral patterns can be combined with speech, but our approach does not contain any method to generate appropriate verbal output for the robot in a structured way. As many robots rely on speech to communicate with the user, an integration with approaches to HRI dialogue design might be beneficial. For example, abstract verbal structures, as introduced by Sauppé and Mutlu [35] or Peltason [34], could be employed to prototype the initial interaction flows based on the dialogue structure and then be enriched with our non-verbal behavioral patterns.
The design process as described above might appear extensive and time-consuming. However, it should be noted that the phases Analysis and Evaluation are always conducted for the whole use case (and not for each individual pattern). This means that they usually only have to be carried out once, while the three steps of the Creation phase have to be repeated for each communication goal. In the Creation phase, the most time-consuming activity is the literature review and composition of the Insight Boards. Once the Insight Boards are created, a first draft of a pattern can be generated within 60 min, following the proposed workshop procedures. In summary, the effort of creating behavioral robot expressions with our pattern approach is comparable to that of applying a conventional HCD process. In addition, the creation of future patterns can build upon and reuse the 26 Atomic Patterns already available in the pattern wiki, which reduces the required personnel and time resources significantly.
The 16 Composed Patterns for robot main actions have already been implemented on a humanoid robot (Pepper), an animoid robot (MiRo) and an abstract robot (a robot vacuum). While these implementations suggest that, in principle, the patterns can indeed be applied to robots of different forms, they will have to be tested with other robotic platforms to prove that they are truly appearance-independent. Our experience with implementing the patterns already indicates that they might only be implementable on robots that fulfill certain requirements. For example, for most patterns, a robot would need to be equipped with more than one communication modality. Similarly, while the patterns have been designed to match the respective communication goals in various contexts, they have so far only been tested in the context of the quiz game. Therefore, we plan to test the patterns with different types of robots in different types of use cases in the future. In addition, we are working on methods that help software developers implement the patterns on their specific robotic platform; one possible direction is sketched below.
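One conceivable way to support such platform-specific implementations, sketched below purely as an assumption (none of the class or method names stem from the pattern wiki), is to phrase a pattern against abstract robot capabilities and let a small adapter map these capabilities to the actuators a concrete robot offers:
from abc import ABC, abstractmethod
class RobotPlatform(ABC):
    # Hypothetical adapter interface: each robot maps abstract capabilities
    # to its own actuators (cf. Table 1).
    @abstractmethod
    def available_modalities(self):
        """Return the set of communication modalities the robot offers."""
    @abstractmethod
    def face_user(self):
        """Orient the robot's front towards the user (Atomic Pattern: Attentive)."""
    @abstractmethod
    def highlight_listening_organ(self):
        """Highlight ears or microphone (Atomic Pattern: Speech recording)."""
def apply_listen_to_user(platform: RobotPlatform):
    # Most patterns presuppose more than one communication modality.
    if len(platform.available_modalities()) < 2:
        raise NotImplementedError("This robot lacks the modalities the pattern needs.")
    platform.face_user()
    platform.highlight_listening_organ()
A Pepper- or MiRo-specific subclass would then implement these methods with the calls of its own programming interface, as done manually for MiRo in Appendix C.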

Supplementary Materials

The following are available online at https://0-www-mdpi-com.brum.beds.ac.uk/article/10.3390/mti5090049/s1, Video S1: Example for pattern “listen to user” on a humanoid robot, Video S2: Example for pattern “listen to user” on an animoid robot, Video S3: Example for pattern “speech recording” on a humanoid robot, Video S4: Example for pattern “speech recording” on an animoid robot, Video S5: Example for pattern “attentive” on a humanoid robot.

Author Contributions

Conceptualization, K.P.; methodology, K.P.; user study, K.P. and D.Z.; writing, review and editing, K.P. and D.Z.; visualization, K.P.; project administration, K.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research was conducted as part of the NIKA project and funded by the German Federal Ministry of Education and Research (BMBF 16SV7941).

Institutional Review Board Statement

Ethical review and approval were not required for the study in accordance with local legislation and institutional requirements.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Acknowledgments

We thank all participants who took part in the user study.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AAIM: Abstract Application Interaction Model
EUC: essential use cases
HCD: human-centered design
HRI: human–robot interaction
HTI: human–technology interaction
ROS: Robot Operating System
UX: user experience

Appendix A

Figure A1. Insight board for the robot action “listen to user” and sources of the images used: 01 Bruno Thete—Unsplash, 02 Magda Ehlers—pexels.com, 03 Ben Ashby—Unsplash, 04 Anna Shvets—Unsplash, 05 Anna Dudkova—Unsplash, 06 Soundtra—Unsplash, 07 Kathrin Pollmann, 08 Apple Support, 09 Soundtrap—Unsplash, 11 Apple Developer, 12 Jan Antonin Kolar—Unsplash, 13 Kazden Cattapan—Unsplash, 14 Tokumeigakarinoaoshima—Wikimedia Commons, 15 Google Assistant, 16 Christina @ (wocintechchat.com, accessed on 16 August 2021)—Unsplash, 17 Andrea Piacquadio—Pexels, 18 Alireza Attari—Unsplash, 19 Andrea Piacquadio—Pexels, 20 geralt—Pixabay, 21 cottonbro—Pexels, 22 Paul Daley—Pexels, 23 Kirsten Bühne—Pexels, 24 Flickr—Pexels, 25 Kari Shea—Unsplash, 26 Baptist Standaert—Unsplash, 27 Tadeusz Lakota—Unsplash, 28 [57], 29 [57], 30 [57], 31 Kathrin Pollmann, 32 Kathrin Pollmann, 33 Kathrin Pollmann, 34 BTIHCEUOT Security Camera, 35 Luis Quintero—Pexels, 36 Andriyko Podilnyk—Unsplash, 37 Ali Hajian—Unsplash, 38 Trung Thanh—Unsplash, 40 Sebastian Herrmann—Unsplash, 41 Headway—Unsplash.

Appendix B

Table A1. Formal documentation of the pattern for the robot action “listen to user”.
Name: Listen to User
Type: Composed Pattern
Ranking: ** (initially validated)
Version: 1
Author: Kathrin Pollmann
Interaction Situation: The user is telling the robot something. The robot provides feedback that it is aware of the user talking and that it is recording the spoken information.
Communication Goal: “I am listening to what you are currently saying.”
Solution: Let the robot express that:
  • It directs its attention towards the user through a user-oriented position in the room, so that robot and user face each other. (Attentive)
  • It is currently recording spoken information from the user by highlighting the recording sensory organ (microphones or ears). (Speech recording)
  • Its operation mode is <on> by setting the status light to glow permanently. (Operation Mode on)
Illustration: Mti 05 00049 i001
Rationale: see Atomic Patterns Attentive, Speech recording and Operation Mode on
Examples: see Supplementary Materials, Videos S1 and S2
References and Context: Needs: Attentive AND Speech recording AND Operation Mode on; Opposed patterns: all other Composed Patterns (two Composed Patterns can never be executed at the same time)
Table A2. Formal documentation of the pattern for the robot action “record user input”.
Name: Speech Recording
Type: Atomic Pattern
Ranking: ** (initially validated)
Version: 1
Author: Kathrin Pollmann
Interaction Situation: The user has provided some speech input. The robot needs to indicate that it is perceiving and processing the input.
Communication Goal: “I am recording spoken information from you.”
Solution: To provide feedback that the user input is recorded, you should visualize the act of listening. To do so, highlight the recording “sensory” organ (microphone or ears) as long as the user is talking. Depending on the robot, this can be achieved in different ways:
  • If the robot has movable ears, the auricles are turned towards the user.
  • If the robot has ears that are augmented with light, they are highlighted through a light signal. The light takes a bright, blue or green color and pulsates in a slow rhythm.
  • If the robot has no ears, find another creative way to visualize that it is recording what the user is saying.
Illustration: Mti 05 00049 i002
Rationale: In social human interaction, facing and looking at the opposite person is often interpreted as taking in what the person is saying. However, this behavior is, in fact, rather ambiguous, as we can never be quite sure whether someone is actually listening or, for example, daydreaming. Animals communicate by making the activity of listening more explicit, by putting their ears upright and/or turning them towards a (potential) auditory cue in the environment. This behavior can be copied by robots with movable ears. Analogies can also be found using other communication modalities. Technical devices often use light signals or animations to visualize speech recording (compare Amazon Echo, mobile phone audio recording apps, cameras). This design approach can also be transferred to robots. For example, SoftBank Robotics designed their Pepper robot with ears that can be illuminated. Light research suggests using cold colors such as blue to make the light signal more attention-grabbing. Alternatively, a green light color could be used to emphasize that the speech recording functionality is active. Green is generally associated with a technical device being turned on or working.
References and further reading: [58,59,60]
Examples: see Supplementary Materials, Videos S3 and S4
References and Context: Needed by: Listening; Works well together with: Attentive
Table A3. Formal documentation of the pattern for the robot action “direct attention to user”.
Name: Attentive
Type: Atomic Pattern
Ranking: ** (initially validated)
Version: 2
Author: Kathrin Pollmann
Interaction Situation: Robot and user are engaged in interaction. The robot communicates that its full attention is focused on the user. This pattern should be used in two types of situations:
  • The user is doing or saying something that is relevant to the interaction.
  • The robot is doing or saying something that concerns the user (not the context).
Communication Goal: “My full attention is on you.”
Solution: Attentiveness towards the user can be expressed through a user-oriented position in the room. Position the robot at a social distance from the user (1.2 m) and let the front of the robot face the user. It should always be clear where the frontal part of the robot is. If possible, use active user tracking to let the robot maintain eye contact with the user. Eye contact can also be achieved with robots that do not have eyes: in this case, the visual sensory organ, the camera, should be pointed at the user and (if possible) follow the user’s head based on user tracking.
Illustration: Mti 05 00049 i003
Rationale: In the interaction between different agents, attentiveness can be signaled by coming closer, decreasing the distance between oneself and the other agent. Research in human–robot interaction shows that a robot is perceived as attentive when its front is facing the user. From social interaction, we have also learned to interpret eye contact as a sign that someone’s attention is on us. Establishing eye contact is used as an unvoiced agreement to engage in social interaction with each other.
These two concepts can easily be transferred to robots, as demonstrated by two examples: in the movie “Robot and Frank”, the robot turns towards the user when talking to him; Pixar’s robot Wall-E moves its head closer to objects of interest.
References and further reading: [57,61,62,63,64]
Examples: see Supplementary Materials, Video S5
References and Context: Needed by: Active, Becoming active, Becoming inactive, Processing, Not understanding, Showing, Listening, Explaining, Encouraging good performance, Joyful positive feedback, Displeased positive feedback, Empathic negative feedback, Gloating negative feedback; Opposed patterns: Inside turn, Passively available

Appendix C

Prototypical implementation of the patterns active and listening using MiRo’s Python-based programming interface.
import miro2 as miro
class MiroPatterns:
    def __init__(self):
        # Init robot interface
        self.robot = miro.interface.PlatformInterface()
    def statuslight(self, pattern):
        [...]
    def active(self):
        # Stop any movements
        self.robot.set_forward_speed(0.0)
        self.robot.set_turn_speed(0.0)
        # Set permanent status light
        self.statuslight('permanent')
        # Head straight forward
        self.robot.set_neck(miro.constants.JOINT_LIFT, 35)
        self.robot.set_neck(miro.constants.JOINT_PITCH, -15)
        self.robot.set_neck(miro.constants.JOINT_YAW, 0)
        # Ears sidewards
        self.robot.set_joint(miro.constants.JOINT_EAR_L, 1.0)
        self.robot.set_joint(miro.constants.JOINT_EAR_R, 1.0)
        # Eyes open
        self.robot.set_joint(miro.constants.JOINT_EYE_L, 0.0)
        self.robot.set_joint(miro.constants.JOINT_EYE_R, 0.0)
        self.robot.sleep(0.1)
    def listening(self):
        self.active()
        # Ears forward
        self.robot.set_joint(miro.constants.JOINT_EAR_L, 0.0)
        self.robot.set_joint(miro.constants.JOINT_EAR_R, 0.0)
        self.robot.sleep(0.1)
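As a brief usage illustration of the class above (assuming a connected MiRo robot so that the platform interface can initialize), the Listening pattern could be triggered while the user is speaking; the call itself already includes the Active baseline behavior:
# Illustrative usage of the MiroPatterns class defined above.
patterns = MiroPatterns()
patterns.listening()  # runs active() first, then turns the ears forward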

References

  1. Forlizzi, J. How robotic products become social products: An ethnographic study of cleaning in the home. In Proceedings of the 2nd ACM/IEEE International Conference on Human-Robot Interaction (HRI), Arlington, VA, USA, 10–12 March 2007; pp. 129–136. [Google Scholar]
  2. Tanaka, F.; Ghosh, M. The implementation of care-receiving robot at an English learning school for children. In Proceedings of the 6th International Conference on Human-Robot Interaction (HRI), Lausanne, Switzerland, 6–9 March 2011; pp. 265–266. [Google Scholar]
  3. Wada, K.; Shibata, T. Living with seal robots—Its sociopsychological and physiological influences on the elderly at a care house. IEEE Trans. Robot. 2007, 23, 972–980. [Google Scholar] [CrossRef]
  4. Breazeal, C. Emotion and sociable humanoid robots. Int. J. Hum.-Comput. Stud. 2003, 59, 119–155. [Google Scholar] [CrossRef]
  5. Bartneck, C.; Forlizzi, J. A design-centred framework for social human-robot interaction. In Proceedings of the RO-MAN 2004, 13th IEEE International Workshop on Robot and Human Interactive Communication (IEEE Catalog No. 04TH8759), Kurashiki, Japan, 22 September 2004; pp. 591–594. [Google Scholar]
  6. de Graaf, M.M.A.; Allouch, S.B. The influence of prior expectations of a robot’s lifelikeness on users’ intentions to treat a zoomorphic robot as a companion. Int. J. Soc. Robot. 2017, 9, 17–32. [Google Scholar] [CrossRef] [Green Version]
  7. Johnson, D.O.; Cuijpers, R.H.; Pollmann, K.; van de Ven, A.J. Exploring the entertainment value of playing games with a humanoid robot. Int. J. Soc. Robot. 2016, 8, 247–269. [Google Scholar] [CrossRef]
  8. Kahn, P.H.; Freier, N.G.; Kanda, T.; Ishiguro, H.; Ruckert, J.H.; Severson, R.L.; Kane, S.K. Design patterns for sociality in human-robot interaction. In Proceedings of the 3rd ACM/IEEE International Conference on Human Robot Interaction (HRI), Amsterdam, The Netherlands, 12–15 March 2008; pp. 97–104. [Google Scholar]
  9. Mutlu, B. Designing Gaze Behavior for Humanlike Robots. Doctoral Dissertation, University of Pittsburgh, Pittsburgh, PA, USA, 2009. Unpublished. [Google Scholar]
  10. Takayama, L.; Pantofaru, C. Influences on proxemic behaviors in human-robot interaction. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, St. Louis, MO, USA, 10–15 October 2009; pp. 5495–5502. [Google Scholar]
  11. Mubin, O.; Stevens, C.J.; Shahid, S.; Al Mahmud, A.; Dong, J.J. A review of the applicability of robots in education. J. Technol. Educ. Learn. 2013, 1, 13. [Google Scholar] [CrossRef] [Green Version]
  12. Robins, B.; Dautenhahn, K.; Te Boekhorst, R.; Billard, A. Robotic assistants in therapy and education of children with autism: Can a small humanoid robot help encourage social interaction skills? Univers. Access Inf. Soc. 2005, 4, 105–120. [Google Scholar] [CrossRef]
  13. ISO 9241-210. Ergonomics of Human-System Interaction—Part 210: Human-Centred Design for Interactive Systems; Beuth: Berlin, Germany, 2019. [Google Scholar]
  14. Hassenzahl, M. User experience (UX) towards an experiential perspective on product quality. In Proceedings of the 20th Conference on l’Interaction Homme-Machine, Metz, France, 2–5 September 2008; pp. 11–15. [Google Scholar]
  15. Fronemann, N.; Peissner, M. User experience concept exploration. In Proceedings of the 8th Nordic Conference on Human-Computer Interaction: Fun, Fast, Foundational, Helsinki, Finland, 26–30 October 2014; Roto, V., Ed.; ACM: New York, NY, USA, 2014; pp. 727–736. [Google Scholar] [CrossRef]
  16. Dautenhahn, K. Socially intelligent robots: Dimensions of human–robot interaction. Philos. Trans. R. Soc. Lond. Ser. Biol. Sci. 2007, 362, 679–704. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  17. van Breemen, A.; Yan, X.; Meerbeek, B. iCat: An animated user-interface robot with personality. In Proceedings of the Fourth International Joint Conference on Autonomous Agents and Multiagent Systems, Utrecht, The Netherlands, 25–29 July 2005; pp. 143–144. [Google Scholar]
  18. Andrist, S.; Tan, X.Z.; Gleicher, M.; Mutlu, B. Conversational gaze aversion for humanlike robots. In Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction (HRI), Bielefeld, Germany, 3–6 March 2014; pp. 25–32. [Google Scholar]
  19. Mutlu, B.; Shiwa, T.; Kanda, T.; Ishiguro, H.; Hagita, N. Footing in human-robot conversations: How robots might shape participant roles using gaze cues. In Proceedings of the 4th ACM/IEEE International Conference on Human Robot Interaction (HRI), La Jolla, CA, USA, 9–13 March 2009; pp. 61–68. [Google Scholar]
  20. Häring, M.; Bee, N.; André, E. Creation and evaluation of emotion expression with body movement, sound and eye color for humanoid robots. In Proceedings of the 20th IEEE International Symposium on Robot and Human Interactive Communication (2011 RO-MAN), Atlanta, GA, USA, 31 July–3 August 2011; IEEE Press: Atlanta, GA, USA, 2011. [Google Scholar]
  21. Löffler, D.; Schmidt, N.; Tscharn, R. Multimodal expression of artificial emotion in social robots using color, motion and sound. In Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, Chicago, IL, USA, 5–8 March 2018; pp. 334–343. [Google Scholar]
  22. Walters, M.L.; Dautenhahn, K.; Te Boekhorst, R.; Koay, K.L.; Syrdal, D.S.; Nehaniv, C.L. An empirical framework for human-robot proxemics. In Proceedings of the New Frontiers in Human-Robot Interaction: Symposium at the AISB09 Convention, Edinburgh, UK, 8–9 April 2009; pp. 144–149. [Google Scholar]
  23. Barakova, E.I.; Lourens, T. Expressing and interpreting emotional movements in social games with robots. Pers. Ubiquitous Comput. 2010, 14, 457–467. [Google Scholar] [CrossRef] [Green Version]
  24. Kennedy, J.; Baxter, P.; Belpaeme, T. The robot who tried too hard: Social behaviour of a robot tutor can negatively affect child learning. In Proceedings of the 2015 10th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Portland, OR, USA, 2–5 March 2015; pp. 67–74. [Google Scholar]
  25. Pennisi, P.; Tonacci, A.; Tartarisco, G.; Billeci, L.; Ruta, L.; Gangemi, S.; Pioggia, G. Autism and social robotics: A systematic review. Autism Res. 2016, 9, 165–183. [Google Scholar] [CrossRef] [PubMed]
  26. Robins, B.; Dautenhahn, K.; Dickerson, P. From isolation to communication: A case study evaluation of robot assisted play for children with autism with a minimally expressive humanoid robot. In Proceedings of the 2009 Second International Conferences on Advances in Computer-Human Interactions, Washington, DC, USA, 1–7 February 2009; pp. 205–211. [Google Scholar]
  27. Breazeal, C.; Brooks, A.; Chilongo, D.; Gray, J.; Hoffman, G.; Kidd, C.; Lee, H.; Lieberman, J.; Lockerd, A. Working collaboratively with humanoid robots. In Proceedings of the 4th IEEE/RAS International Conference on Humanoid Robots, Santa Monica, CA, USA, 10–12 November 2004; Volume 1, pp. 253–272. [Google Scholar]
  28. Hoffman, G.; Breazeal, C. Collaboration in human-robot teams. In Proceedings of the AIAA 1st Intelligent Systems Technical Conference, Chicago, IL, USA, 20–22 September 2004; p. 6434. [Google Scholar]
  29. Alexander, C. A Pattern Language: Towns, Buildings, Construction; Oxford University Press: Oxford, UK, 1977. [Google Scholar]
  30. Gamma, E.; Helm, R.; Johnson, R.; Vlissides, J. Design Patterns: Elements of Reusable Object-Oriented Software; Addison-Wesley: Reading, MA, USA, 1995. [Google Scholar]
  31. Tidwell, J. Designing Interfaces: Patterns for Effective Interaction Design, 2nd ed.; O’Reilly Media, Inc.: Sebastopol, CA, USA, 2010. [Google Scholar]
  32. Borchers, J.O. A pattern approach to interaction design. In Proceedings of the 3rd International Conference on Designing Interactive Systems: Processes, Practices, Methods, and Techniques, New York, NY, USA, 17–19 August 2000; ACM: New York, NY, USA, 2000; pp. 369–378. [Google Scholar]
  33. Peltason, J.; Wrede, B. Pamini: A framework for assembling mixed-initiative human-robot interaction from generic interaction patterns. In Proceedings of the 11th Annual Meeting of the Special Interest Group on Discourse and Dialogue, Tokyo, Japan, 24–25 September 2010; pp. 229–232. [Google Scholar]
  34. Peltason, J. Modeling Human-Robot-Interaction Based on Generic Interaction Patterns. Ph.D. Thesis, Bielefeld University, Bielefeld, Germany, 2013. [Google Scholar]
  35. Sauppé, A.; Mutlu, B. Design Patterns for Exploring and Prototyping Human-Robot Interactions. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI’14, Toronto, ON, Canada, 26 April–1 May 2014; Association for Computing Machinery: New York, NY, USA, 2014; pp. 1439–1448. [Google Scholar] [CrossRef]
  36. Pollmann, K. Behavioral Design Patterns for Social, Assistive Robots-Insights from the NIKA Research Project. In Mensch und Computer 2019-Workshopband; Gesellschaft für Informatik e.V.: Bonn, Germany, 2019. [Google Scholar] [CrossRef]
  37. Constantine, L.L.; Lockwood, L.A.D. Software for Use: A Practical Guide to the Models and Methods of Usage-Centered Design; Addison-Wesley Professional: Englewood Cliffs, NJ, USA, 1999. [Google Scholar]
  38. Galvan, J.L.; Galvan, M.C. Writing Literature Reviews: A Guide for Students of the Social and Behavioral Sciences; Taylor & Francis: London, UK, 2017. [Google Scholar]
  39. Lucero, A. Framing, aligning, paradoxing, abstracting, and directing: How design mood boards work. In Proceedings of the Designing Interactive Systems Conference (DIS ’12), Newcastle Upon Tyne, UK, 11–15 June 2012; pp. 438–447. [Google Scholar]
  40. Canfield, J. How to Create an Empowering Vision Board. 2017. Available online: https://www.jackcanfield.com/blog/how-to-create-an-empowering-vision-book/ (accessed on 16 August 2021).
  41. Doorley, S.; Holcomb, S.; Klebahn, P.; Segovia, K.; Utley, J. Design Thinking Bootleg; Stanford University: Stanford, CA, USA, 2018. [Google Scholar]
  42. Pollmann, K. The Modality Card Deck: Co-Creating Multi-Modal Behavioral Expressions for Social Robots with Older Adults. Multimodal Technol. Interact. 2021, 5, 33. [Google Scholar] [CrossRef]
  43. Consequential Robotics. MiRoCODE—Miro-E. Available online: https://www.miro-e.com/mirocode (accessed on 16 August 2021).
  44. Collins, E.C.; Prescott, T.J.; Mitchinson, B.; Conran, S. MIRO: A versatile biomimetic edutainment robot. In Proceedings of the 12th International Conference on Advances in Computer Entertainment Technology-ACE ‘15, Iskandar, Malaysia, 16–19 November 2015; ACM Press: New York, NY, USA, 2015; pp. 28:1–28:4. [Google Scholar] [CrossRef]
  45. SoftBank Robotics. Choregraphe Suite—Aldebaran 2.5.11.14a Documentation. Available online: http://doc.aldebaran.com/2-5/software/choregraphe/index.html (accessed on 16 August 2021).
  46. SoftBank Robotics. Pepper the Humanoid and Programmable Robot|SoftBank Robotics. Available online: https://www.softbankrobotics.com/emea/en/pepper (accessed on 16 August 2021).
  47. Peissner, M.; Häbe, D.; Janssen, D.; Sellner, T. MyUI: Generating accessible user interfaces from multimodal design patterns. In Proceedings of the 4th ACM SIGCHI Symposium on Engineering Interactive Computing Systems-EICS ’12, Copenhagen, Denmark, 25–26 June 2012; ACM Press: New York, NY, USA, 2012. [Google Scholar] [CrossRef]
  48. Ziegler, D.; Peissner, M. Modelling of Polymorphic User Interfaces at the Appropriate Level of Abstraction. In Advances in Intelligent Systems and Computing; Springer International Publishing: Cham, Switzerland, 2018; pp. 45–56. [Google Scholar] [CrossRef]
  49. Broadbent, E.; Stafford, R.; MacDonald, B. Acceptance of Healthcare Robots for the Older Population: Review and Future Directions. Int. J. Soc. Robot. 2009, 1, 319–330. [Google Scholar] [CrossRef]
  50. Mayring, P. Qualitative Inhaltsanalyse. In Handbuch Qualitative Forschung in der Psychologie; VS Verlag für Sozialwissenschaften: Wiesbaden, Germany, 2010; pp. 601–613. [Google Scholar]
  51. Dearden, A.; Finlay, J. Pattern Languages in HCI: A Critical Review. Hum.–Comput. Interact. 2006, 21, 49–102. [Google Scholar] [CrossRef]
  52. Tidwell, J. Common Ground: A Pattern Language for Human-Computer Interface Design. 1999. Available online: http://www.mit.edu/~jtidwell/interaction_patterns.html (accessed on 16 August 2021).
  53. Conley, K. REP Purpose and Guidelines. 2010. Available online: https://www.ros.org/reps/rep-0001.html (accessed on 16 August 2021).
  54. Dautenhahn, K. Robots we like to live with?!—A developmental perspective on a personalized, life-long robot companion. In Proceedings of the RO-MAN 2004. 13th IEEE International Workshop on Robot and Human Interactive Communication (IEEE Catalog No.04TH8759), Kurashiki, Japan, 20–22 September 2004; pp. 17–22. [Google Scholar] [CrossRef] [Green Version]
  55. Syrdal, D.S.; Lee Koay, K.; Walters, M.L.; Dautenhahn, K. A personalized robot companion?—The role of individual differences on spatial preferences in HRI scenarios. In Proceedings of the RO-MAN 2007—The 16th IEEE International Symposium on Robot and Human Interactive Communication, Jeju Island, Korea, 26–29 August 2007; pp. 1143–1148. [Google Scholar] [CrossRef] [Green Version]
  56. Pollmann, K.; Ziegler, D. Personal Quizmaster: A Pattern Approach to Personalized Interaction Experiences with the MiRo Robot. In Proceedings of the Conference on Mensch und Computer (MuC ’20), Magdeburg, Germany, 6–9 September 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 485–489. [Google Scholar] [CrossRef]
  57. Mizoguchi, H.; Takagi, K.; Hatamura, Y.; Nakao, M.; Sato, T. Behavioral expression by an expressive mobile robot-expressing vividness, mental distance, and attention. In Proceedings of the 1997 IEEE/RSJ International Conference on Intelligent Robot and Systems. Innovative Robotics for Real-World Applications (IROS ’97), Grenoble, France, 11 September 1997; Volume 1, pp. 306–311. [Google Scholar]
  58. Fronemann, N.; Pollmann, K.; Loh, W. Should my Robot Know What’s Best for me? Human–Robot Interaction between User Experience and Ethical Design. AI Soc. 2021. [Google Scholar] [CrossRef]
  59. Baraka, K.; Veloso, M.M. Mobile Service Robot State Revealing Through Expressive Lights: Formalism, Design, and Evaluation. Int. J. Soc. Robot. 2018, 10, 65–92. [Google Scholar] [CrossRef]
  60. Choi, Y.; Kim, J.; Pan, P.; Jeung, J. The Considerable Elements of the Emotion Expression Using Lights in Apparel Types. In Proceedings of the 4th International Conference on Mobile Technology, Applications, and Systems and the 1st International Symposium on Computer Human Interaction in Mobile Technology (Mobility ’07), Singapore, 10–12 September 2007; Association for Computing Machinery: New York, NY, USA, 2007; pp. 662–666. [Google Scholar] [CrossRef]
  61. Goffman, E. Behavior in Public Places: Notes on the Social Organization of Gatherings; The Free Press: New York, NY, USA, 1963. [Google Scholar]
  62. Jan, D.; Traum, D.R. Dynamic Movement and Positioning of Embodied Agents in Multiparty Conversations. In Proceedings of the 6th International Joint Conference on Autonomous Agents and Multiagent Systems (AAMAS ’07), Honolulu, HI, USA, 14–18 May 2007; Association for Computing Machinery: New York, NY, USA, 2007. [Google Scholar] [CrossRef]
  63. Cassell, J.; Bickmore, T.; Billinghurst, M.; Campbell, L.; Chang, K.; Vilhjálmsson, H.; Yan, H. Embodiment in Conversational Interfaces: Rea. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’99), Pittsburgh, PA, USA, 15–20 May 1999; Association for Computing Machinery: New York, NY, USA, 1999; pp. 520–527. [Google Scholar] [CrossRef]
  64. Cassell, J.; Thorisson, K.R. The power of a nod and a glance: Envelope vs. emotional feedback in animated conversational agents. Appl. Artif. Intell. 1999, 13, 519–538. [Google Scholar] [CrossRef]
Figure 1. The three phases and nine steps of the pattern design process.
Figure 2. Flow chart depicting the EUCs for the preparation part, one round of the quiz game and the closing part.
Figure 3. Insight Board for the robot action “listen to user”. Sources of the images used for the insight board can be found in Appendix A Figure A1.
Figure 4. Idea clusters produced by the ideation workshop for the robot action “listen to user”.
Figure 5. Decision, investigation, parameter and idea cards for the robot action “listen to user”.
Figure 6. The animoid robot MiRo in the quiz scenario accompanied by a tablet. This setup has been used in the user study for pattern evaluation.
Figure 7. Overview of the frequencies of occurrence for the categories that were chosen to describe the behavior displayed by the robot for the action “listen to user”.
Figure 8. Overview of the frequencies of occurrence for the modality categories mentioned by the participants to describe the behavior displayed by the robot for the action “listen to user”.
Figure 9. Screenshot of the pattern categories provided in the Robot Behavior Pattern Wiki.
Table 1. Overview of the actuators social robots typically feature and related modalities they can use to communicate with the user.
Actuators | Communication Modalities
Multidirectional joints in neck, torso, arms, legs, ears | Whole body motions, posture
Multidirectional wheels | Proxemics, movements within the room
Movable legs | Proxemics, movements within the room
LEDs | Light signal
Speakers | Sound, speech
Manipulable eyes | Facial expressions, gaze behavior
Table 2. Overview of the recurring robot main and sub-actions and the associated communication goals.
Main Actions of Robot | Communication Goals
explain quiz game | “I am explaining something to you. Stay focused on me!”
provide information | “I am showing you information. Please pay attention to me and to this information.”
listen to user | “I am listening to what you are currently saying.”
process confirmation | “I am processing what I just learned from you. This will take some time—I will tell you when I am ready.”
load quiz game | “I am loading the game and preparing to play it with you. I will let you know when I am ready.”
motivate good performance | “I believe in you and I’ll support you to show good performance in the upcoming action!”
direct attention to user | “My full attention is on you.”
demand user’s attention | “You need to focus your attention on this. It is important.”
demonstrate own readiness | “I can start acting straight away.”
demand user input | “I am expecting you to provide your input now.”
record user input | “I am recording spoken information from you.”
turn attention away from user | “My attention is not on you. I am currently focused on internal processing or nothing at all.”
indicate progress of processing | “I am currently processing or loading information.”
confirm successful data processing | “I have successfully processed or loaded information.”
indicate correctness | “What you did/said is correct.”
indicate incorrectness | “What you did/said is incorrect.”
energize user | “I am strong and so are you! Let’s go!”
Table 3. Overview of the pattern components and their contents (based on [36]).
Component | Function
Name | Short, to-the-point description of the core of the solution
Preamble
Type | Provides keywords referring to the type and application domain of the pattern
Ranking | Indication of how valid the pattern is, ranging from * (tentative) over ** (initially validated) to *** (well validated)
Version | Version number of the pattern
Author | Authors’ names and e-mail addresses
Design Challenge
Interaction Situation | Describes the general recurring interaction situation in which the pattern occurs, including the user’s expectations and needs in this situation
Communication Goal | Describes the design problem at hand, focusing on the communication goal, i.e., the message that the robot behavior should communicate to the human interaction partner
Design Solution
Solution | Text description of the concrete behavioral expressions that can be realized on a social robot in order to solve the design challenge; phrased as an instruction for the designer
Illustration | Visual representation of the solution
Rationale | Reasoning behind the proposed behavioral expression: inspirational examples and scientific references
Examples | Concrete examples of how the pattern can be used, visualized as videos or storyboards
References and Context | Connections with other patterns
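To complement the tabular documentation format, the following sketch shows how a pattern record could be stored in machine-readable form. The field names mirror the components in Table 3, but the dataclass itself is an assumption and not part of the pattern wiki.
from dataclasses import dataclass, field
from typing import List
@dataclass
class BehaviorPattern:
    # Fields follow the pattern components listed in Table 3 (illustrative only).
    name: str
    type: str                     # "Atomic Pattern" or "Composed Pattern"
    ranking: str                  # "*", "**" or "***"
    version: int
    author: str
    interaction_situation: str
    communication_goal: str
    solution: str
    rationale: str
    examples: List[str] = field(default_factory=list)
    references_and_context: List[str] = field(default_factory=list)
# Example entry based on Table A1 (abbreviated).
listen_to_user = BehaviorPattern(
    name="Listen to User",
    type="Composed Pattern",
    ranking="**",
    version=1,
    author="Kathrin Pollmann",
    interaction_situation="The user is telling the robot something.",
    communication_goal="I am listening to what you are currently saying.",
    solution="Combine Attentive, Speech recording and Operation Mode on.",
    rationale="See the Atomic Patterns Attentive, Speech recording and Operation Mode on.",
)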
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
