Article

Human Response to Humanoid Robot That Responds to Social Touch

Graduate School of Engineering, University of Fukui, 3-9-1, Bunkyo, Fukui-shi 910-8507, Fukui, Japan
* Author to whom correspondence should be addressed.
Submission received: 25 July 2022 / Revised: 2 September 2022 / Accepted: 6 September 2022 / Published: 14 September 2022
(This article belongs to the Special Issue Human-Computer Interaction: Designing for All)

Abstract

Communication robots have been introduced in nursing care, education, and the hospitality sector. In the future, robots will be increasingly integrated into human society, with more opportunities to interact closely with humans. Therefore, investigating the symbiosis between humans and robots is critical. Touch actions, such as shaking hands, holding hands, and touching shoulders, are common in most societies. These actions, called social touch, are a common mode of communication. Social touch not only conveys emotions and intentions but also has mental and physical effects. Touch considerably influences social relationships: for example, by creating positive impressions and enabling the fulfillment of requests. Since the development of communication robots and other robots capable of physical contact, touch communication between humans and robots has been extensively studied. Although studies have revealed that touching a robot positively influences the impression of the robot and induces a relaxed feeling, negative perceptions related to trust in the robot have also been reported. Thus, touch interactions between humans and robots are yet to be fully understood. Studies have focused on the effects of touch, such as touching the robot or being touched by the robot. Although interactions with robots that respond to touch, such as hugging behavior, have been studied, few studies have examined the psychological effects of robot responses to other types of touch, such as hitting, stroking, and grasping. In this study, a humanoid robot was used to investigate how the reactive behavior exhibited by the robot in response to touch by a participant affects the degree of favorability and intellectual impression toward the robot as well as the sense of accomplishment regarding communication. Participants exhibited high favorability, a feeling of relief, and willingness to continue the interaction with robots that exhibited appropriate reactions to their touch. Participants formed a more positive impression when they chose how to touch the robot than when they were instructed how to touch it. The results of this study can provide guidelines for improving the design and utilization of robots, such as therapeutic robots, that work alongside humans.

1. Introduction

Robots have been used in educational and medical fields, stores, and even private spaces. Pepper (https://www.softbankrobotics.com/emea/en/pepper, accessed on 5 September 2022), a communication robot manufactured by Softbank Robotics, is a humanoid robot that estimates people’s emotions and communicates with them accordingly. The robot has numerous applications that can be customized to satisfy a user’s requirements. Such robots can be used in several applications, such as reception work in hospitals and stores, and they can lead exercise and rehabilitation tasks at welfare facilities. PARO (https://www.ndsoft.jp/product/medical/paro/, accessed on 5 September 2022), a seal-like robot, was developed in 1993 for therapeutic purposes as an alternative to live pets and is used in places, such as hospitals and welfare facilities, in which interaction with real animals is difficult. Therapy using PARO is as effective as animal therapy, and interaction with PARO improves the mood of dementia patients and pediatric patients; furthermore, such patients can communicate with other patients through PARO [1,2]. AIBO (https://aibo.sony.jp, accessed on 5 September 2022), developed by Sony, is a pet robot that resembles a small dog. Twenty-two degrees of freedom allow the robot to move its ears and tail like a real dog. The robot has touch sensors on its head and back, which enable it to respond to human touch and encourage users to touch it. Therefore, the robot has been used as a pet in welfare facilities. PALRO (https://palro.jp/en/, accessed on 5 September 2022), a small humanoid robot measuring approximately 40 cm in length, is equipped with acceleration, gyro, touch, and pressure sensors, and its head and limbs are driven by actuators. PALRO achieves high conversational ability by accumulating and learning conversational data. The robot performs recreation and reception duties as well as converses with users at a welfare facility for elderly people. Robots designed to communicate with humans are thus playing an active role in society. Such robots will be increasingly introduced in the future, and people will become increasingly familiar with them. To ensure smooth communication between humans and robots, the appearance and behavior of robots and the effects of human–robot interaction on humans should be investigated.
Robots can express emotions through various modalities, such as changes in facial expression [3,4,5], upper-body poses [6], and color changes [7]. These expressions are effective in communicating emotions and exhibit considerable potential for enhancing robots’ communication with people. Humanoid robots can convey emotions in a manner similar to humans by mimicking human emotional expressions, such as facial expressions, arm movements, and posture. Researchers have also attempted to communicate emotions using on-board LEDs. Plutchik [8] proposed a wheel of emotions to illustrate various emotions, in which colors on the wheel were assigned to different emotions. Terada et al. [7] experimented with a robot whose head was shaped like a ball illuminated in various colors and showed that it could convey emotions to participants consistent with Plutchik’s model. Using the human ability to associate specific emotions with colors, the robot expressed emotions through the dynamic emission patterns of LEDs mounted in its eyes. Researchers have also expressed a robot’s emotions by combining multiple communication modalities [9].
Numerous studies have investigated the effects of robot behavior on humans. In an ultimatum game between a human and a robot, Terada and Takeuchi [10] suggested that the emotional expression conveyed through simple line drawings on the face of the robot affected the altruistic behavior of participants. De Melo et al. [11,12] conducted experiments using an agent that imitated a human face displayed on a computer screen and reported that the agent’s emotional expressions influenced human decision making. Studies have also revealed that a robot’s emotional expressions influence the impression of the robot. Jimenez et al. [13] reported that providing learning support using a robot with emotional expressions resulted in a positive impression of the robot. Kawahara et al. [14] and Takahashi et al. [9] proposed an emotion recognition model for robots and revealed that emotional expressions by the robot through voice and body movements may induce altruistic behavior in participants.
Animals, including humans, engage in touch gestures as a means of communication. Shaking hands, holding hands, and touching shoulders are common behaviors, collectively called social touch. Even with touch as the only means of communication, numerous emotions can be communicated [15,16]. Physical touch is particularly likely to be used for emotions that reinforce intimacy [17]. Various experiments have revealed that touching skin has positive physical and emotional effects. In experiments in which infant monkeys were given a cloth-covered surrogate and a surrogate made of metal wire, the monkeys preferred and clung to the cloth-covered surrogate, and its presence calmed them [18]. Family cuddling decreases blood pressure, even in stressful situations, and increases oxytocin levels [19]. Physical contact also influences social relationships. When a server touches a customer in a restaurant, the tip given by the customer increases. This phenomenon is called the “Midas touch” after the king of Phrygia in Greek myth who turned everything he touched into gold [20].
Studies analyzing the differences and similarities between human-to-human and human-to-robot communication have revealed that the personal space required by humans for their first meeting with a humanoid robot may be smaller than that required for their first interaction with another human [21]. Studies on the effects of physical contact with robots on humans have revealed that social touch with robots may have the same effects as social touch between humans. Hugging a robot positively influences the impression of the robot [22]. Touch interaction with a robot in a stressful situation attenuates an increase in heart rate [23]. In hand-holding experiments, walking hand-in-hand with a robot made young children feel that the robot is human-like [24]. An experiment in which a robot touched participants during an ultimatum game revealed that touch could potentially alleviate unfair actions toward robots [25]. In an interaction experiment using a large teddy bear-shaped robot, Moffuly, the interaction time and the amount of money raised increased considerably when the robot hugged the participant back [26]. Block et al. [22,27] developed a life-size robot that hugs people and revealed that the physical characteristics of the robot, the timing and intensity of the hug, and the robot’s intra-hug gestures are critical. Although some studies have reported that physical contact with a robot can result in mental comfort, an improved impression of the robot, and the promotion of social behavior, other studies have suggested that being touched by a robot has a negative effect on evaluations of comfort and familiarity [28].
Previous studies have thus focused on the effects of touch, such as touching the robot or being touched by the robot. Although interaction with robots that respond to touch, for example, through hugging behavior, has been studied, few studies have examined the psychological effects of a robot’s responses to other touches, such as tapping, stroking, and grasping. In this study, a humanoid robot was used to investigate the effect of the robot’s reactive behavior to human contact on the impression of the robot and the mood of the person interacting with it.
First, the robot’s reactions to the participants’ touch were designed: an appropriate reaction, a reaction contradicting the appropriate one, and no reaction. The participants touched the robot as instructed by the experimenter, experienced the robot’s reactions, and their impressions of the robot were evaluated.
The participants then chose their own touch gestures to convey specified emotions, and the robot responded to their touch; their impressions of the robot were evaluated again. We also analyzed where the participants touched the robot for each emotion. Participants tended to exhibit high favorability, a feeling of relief, and willingness to continue the interaction with robots that responded with appropriate reactions to their touch. The results also revealed that participants had a more positive impression when they determined how to touch the robot than when they were instructed how to touch it. The results of this study can provide guidance on the improved design and utilization of robots, such as therapeutic robots, that work alongside humans.

2. Method

2.1. Experimental Environment

We investigated the effects of reactive behaviors exhibited by a small humanoid robot in response to human touch. The robot used in this study, NAO by Softbank Robotics, is equipped with touch sensors at limited positions (head, lower arms, and toes). Research has revealed that humans often touch a robot’s arms when communicating their emotions to the robot [29]. However, achieving sufficient interaction using only the built-in touch sensors at these few positions is difficult. Therefore, we constructed a Wizard of Oz (WOZ) system, in which an operator controlled the robot’s behavior from behind a curtain, giving the appearance that the robot was responding autonomously to human physical touch. Figure 1 details the experimental environment. The experimenter and participant stood adjacent to each other, and the robot stood in front of the participant. The NAO robot is approximately 58 cm tall, weighs approximately 5.4 kg, and has joints with 25 degrees of freedom. The participants interacted with the robot according to the instructions of the experimenter. The room was partitioned with an accordion curtain, and the robot control PC and the operator were located behind the curtain, out of sight of the participants. The operator operated the robot while watching video camera images of the participants and the robot. The operator simply followed the rules for each robot reaction to the participant’s social touch, as described in Section 2.3; thus, the operator could not influence the robot’s operation based on how the participants behaved. The participants interacted with the robot and answered a questionnaire after each session.
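To make the triggering procedure concrete, the following is a minimal sketch of how such a WOZ trigger loop could be built with the naoqi Python SDK (which targets Python 2). The robot address, the key bindings, and the use of the built-in text-to-speech in place of the pre-recorded VOICEVOX utterances are illustrative assumptions, not the authors’ actual implementation.

```python
# Minimal Wizard-of-Oz trigger loop (a sketch, not the authors' code).
# Assumes the naoqi Python SDK (Python 2) and an assumed robot address.
from naoqi import ALProxy

ROBOT_IP, PORT = "192.168.1.10", 9559  # assumed network address

tts = ALProxy("ALTextToSpeech", ROBOT_IP, PORT)
leds = ALProxy("ALLeds", ROBOT_IP, PORT)

# One key per observed touch; each key fires one pre-scripted reaction
# (utterance plus eye-LED color; motions omitted for brevity).
REACTIONS = {
    "1": ("Ehehe, I'm so happy", 0x00FFFF00),   # stroking the head -> yellow
    "2": ("Uhuhu, it tickles me", 0x00FFFF00),  # touching the hand -> yellow
    "3": ("Ehe", 0x00FFFF00),                   # stroking the arm  -> yellow
    "4": ("Ouch", 0x00FF0000),                  # grabbing the arm  -> red
}

def fire(key):
    utterance, rgb = REACTIONS[key]
    leds.fadeRGB("FaceLeds", rgb, 0.3)  # fade the eye LEDs to the target color
    tts.say(utterance)  # the study instead played VOICEVOX wav files (Section 2.3)

while True:
    key = raw_input("touch observed (1-4, q to quit): ").strip()
    if key == "q":
        break
    if key in REACTIONS:
        fire(key)
```

Because the operator only maps each observed touch to one fixed key, the design goal described above (the operator cannot bias the robot’s behavior) is preserved.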

2.2. Participants

Forty-one participants (24 men and 17 women) were recruited from the University of Fukui, Japan. The participants were students between 19 and 24 years old, ethnically Japanese, and educated in Japan. Their average age was 21.4 years (standard deviation approximately 1.64). They volunteered to participate in the experiment. Some students had prior experience with robot interaction, whereas others had little experience.

2.3. Interaction with Humanoid Robot through Designed Social Touch

The robot’s reaction behavior in response to touch by participants was designed to investigate its effect on the participants. To evaluate the psychological impact on the participants depending on whether the robot reacts or not and the manner of the robot’s reaction, the following three robot reactions were set:
  • appropriate reaction: suitable action or speech in response to human touch;
  • contradictory reaction: action or speech contradicting the appropriate reaction in response to human touch;
  • no reaction: no response to human touch.
Four actions were set for the participants: “stroking the head”, “touching the hand”, “stroking the arm”, and “grabbing the arm strongly”. These actions were introduced because they have been observed in interactions with NAO. The “stroking the head” behavior was introduced because NAO is small and has rounded body parts, which participants find adorable; such an appearance likely makes participants less uncomfortable when stroking its head. The other actions were limited to the arms and hands because people typically prefer to touch the arms of a humanoid robot [29]. For each social touch, we designed the robot reaction that we considered appropriate, along with a contradictory reaction and no reaction as comparisons.
The speech function in NAO can be monotonous and convey insufficient emotional expression. We therefore prepared utterances using the speech synthesis software “VOICEVOX: ShikokuMetan (https://voicevox.hiroshiba.jp, accessed on 5 September 2022)” and played them through NAO’s built-in speaker. NAO’s “Autonomous Life” function was partially activated, which allowed it to operate autonomously and made it appear animate. NAO automatically performed movements such as swaying its body from side to side and following a person’s face with its eyes. To mimic eye-blinking motions, the eye LEDs were lit at random intervals.
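The following sketch shows how this setup could be reproduced with the naoqi Python SDK: pre-synthesized VOICEVOX files are played through the built-in speaker, and a partially autonomous idle state keeps the body sway and face tracking active. The file path and the choice of the “solitary” Autonomous Life state are assumptions for illustration.

```python
# Sketch: playing pre-synthesized utterances and enabling an idle state.
# The file path and Autonomous Life state are illustrative assumptions.
import random
import time

from naoqi import ALProxy

ROBOT_IP, PORT = "192.168.1.10", 9559  # assumed network address

audio = ALProxy("ALAudioPlayer", ROBOT_IP, PORT)
life = ALProxy("ALAutonomousLife", ROBOT_IP, PORT)
leds = ALProxy("ALLeds", ROBOT_IP, PORT)

# "solitary" keeps background behaviors (small body sway, face tracking)
# running, approximating the partially enabled "Autonomous Life" above.
life.setState("solitary")

# Utterances are synthesized offline with VOICEVOX and uploaded to the robot.
audio.playFile("/home/nao/audio/ehehe_happy.wav")  # assumed path

# Blink-like flicker: briefly dim and restore the eye LEDs at random intervals.
for _ in range(3):
    leds.fadeRGB("FaceLeds", 0x00000000, 0.1)
    leds.fadeRGB("FaceLeds", 0x00FFFFFF, 0.1)
    time.sleep(random.uniform(2.0, 5.0))
```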

2.3.1. Design of Appropriate Reactions

The guiding principle for designing appropriate reactions was that these robot responses were necessary for establishing an appropriate relationship with the participants. The appropriate reactions when NAO received a social touch from a participant were as follows:
When a participant performed the “stroking the head” action, the robot moved its head back and forth in a small motion, spread its arms, turned its eye LEDs yellow, and said “Ehehe, I’m so happy” in Japanese. “Ehehe” is an exclamation conveying embarrassment. When the participant performed the “touching the hand” action, the robot lifted the touched hand up to its chest level, vocalized “Uhuhu, it tickles me” in Japanese, and turned on the yellow LEDs in its eyes. “Uhuhu” is an exclamation indicating a small laugh. When the participant performed the “stroking the arm” action, the robot slightly raised both arms, vocalized “Ehe” in Japanese, and turned its eye LEDs yellow. “Ehe” is also an exclamation conveying embarrassment. When the participant performed the “grabbing the arm strongly” action, the robot attempted to shake the arm off and turned its eye LEDs red while vocalizing “Ouch” in Japanese. The LED colors of the eyes were designed based on the study of Terada et al. [7] and Plutchik’s model [8]. Table 1 details the touching actions made by the participants and the corresponding robot motion, vocalization, and eye-LED color. Figure 2 shows frames from the videos of the appropriate reactions of NAO to human touch.
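As a compact summary of this design, the mapping of Table 1 can be represented as a simple lookup table; the following sketch uses the English translations given above, with motion descriptions paraphrasing the prose.

```python
# Sketch: the appropriate-reaction mapping (Table 1) as a lookup table.
# Keys and field values paraphrase the design described in Section 2.3.1.
APPROPRIATE_REACTIONS = {
    "stroking the head": {
        "motion": "small back-and-forth head motion, arms spread",
        "utterance": "Ehehe, I'm so happy",  # embarrassment
        "eye_led": "yellow",
    },
    "touching the hand": {
        "motion": "lift the touched hand to chest level",
        "utterance": "Uhuhu, it tickles me",  # small laugh
        "eye_led": "yellow",
    },
    "stroking the arm": {
        "motion": "raise both arms slightly",
        "utterance": "Ehe",  # embarrassment
        "eye_led": "yellow",
    },
    "grabbing the arm strongly": {
        "motion": "shake the grabbed arm off",
        "utterance": "Ouch",
        "eye_led": "red",
    },
}
```

The contradictory condition (Section 2.3.2) would swap in the opposing motion, utterance, and LED color for each key, and the no-reaction condition maps every key to an empty response.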

2.3.2. Design of Contradictory Reactions

For comparison with the appropriate reactions designed in the previous subsection, we designed responses that contradict them, termed the “contradictory reaction”. When the participant performed the “stroking the head” action, the robot turned its face down, vocalized “I wish you would stop” in Japanese, and turned on the pink LEDs in its eyes. When the participant performed the “touching the hand” action, the robot pulled its hands back, uttered “Ouch” in Japanese, and turned on the red LEDs in its eyes. When the participant performed the “stroking the arm” action, the robot looked down, clasped its hands in front of its chest, vocalized “I’m scared” in Japanese, and turned on the dark green LEDs in its eyes. When the participant performed the “grabbing the arm strongly” action, the robot lightly spread both arms, vocalized “I’m ticklish” in Japanese, and turned on the yellow eye LEDs. The LED colors of the eyes were designed based on the study by Terada et al. [7] and Plutchik’s model [8]. Table 2 details the touching actions of the participants and the corresponding robot response, vocalization, and eye-LED color for the contradictory reaction. Figure 3 displays frames captured from the videos of the contradictory reactions of NAO to human touch.

2.3.3. Design of No Reaction

We additionally designed “no reaction” for comparison. In response to touch by a participant, the robot did not exhibit a response action, did not speak, and the LED light pattern was normal. At this time, the NAO function “Autonomous Life” remained partially enabled so that the robot followed the participant’s face with its eyes.

2.4. Interaction with the Humanoid Robot through Nondesigned Social Touch

In the previous section, we presented an experimental procedure in which the experimenter instructed the participants to perform one of the four designed social touches, and the robot exhibited reactive behavior in response. This procedure helped the participants understand how to perform social touch gestures on the robot. However, in a more general human–robot interaction scenario, people perform social touches of their own choice to convey their emotions. Therefore, instead of instructing the participants on the manner of touch, the experimenter requested them to convey a specific emotion through a social touch. The participants were free to perform a social touch of their choice, and the robot reacted to that touch. We evaluated the psychological effect on the participants when they touched NAO with the intention of conveying a specific emotion and the robot responded to that touch.
The three emotions that the participants conveyed to the robot were “happy”, “sad”, and “angry”. When the participant conveyed the “happy” emotion to the robot through social touch, the robot raised its arms, said “I’m ticklish” in Japanese, and turned on the yellow LEDs in its eyes. When the participant conveyed the “sad” emotion to the robot through social touch, the robot opened its arms and said “Do you want to hug me?” in Japanese, and the LEDs in its eyes turned light pink. When the participant conveyed “anger” to the robot, the robot pulled its arms and said “What’s wrong?” in Japanese, and the LEDs in its eyes turned light blue. Table 3 details the emotions conveyed by the participant to the robot, the robot’s actions, vocalization, and color of eye LEDs. Figure 4 shows frames of the videos of reactions of NAO to human touch with specified emotions.

2.5. Procedure

First, the experimenter explained the purpose and procedure of the experiment to the participants; next, the participants signed a consent form. A participant then stood in front of the NAO placed on a desk. The robot greeted the participant by saying “Hello, I am NAO. Nice to meet you” in Japanese. The experimenter stood next to the participant and instructed the participant on how to touch NAO; the participant then received the instructions for the four types of touches in each of the appropriate-reaction, contradictory-reaction, and no-reaction sessions.
Three sessions were conducted with NAO, one for each reaction type (appropriate reaction, contradictory reaction, or no reaction), and the order of these sessions was randomly determined. In each session, the participant performed four patterns of touch under the direction of the experimenter: “stroking the head”, “touching the hand”, “stroking the arm”, and “grabbing the arm strongly”. The order of the touch patterns was randomly selected by the experimenter. The participants answered a questionnaire after each session. The questionnaire was common to all sessions. Finally, the participants answered a summary questionnaire.
Next, an experiment was conducted for the session “Interaction with humanoid robot through nondesigned social touch” described in Section 2.4. The experimenter requested the participant to convey each of the emotions “happy”, “sad”, and “angry” in random order. The participant performed a social touch of their own choice on the robot to convey the emotion instructed by the experimenter, and the robot responded to it. The participant answered a questionnaire after each social touch and the robot’s subsequent response.

2.6. Analysis

2.6.1. Common Questionnaire after Each Session

Participant impressions regarding the humanoid robot were recorded after each session. Nine seven-point semantic differential scales were used:
  • Dislike (1)–Like (7);
  • Estranged (1)–Friendly (7);
  • Unkind (1)–Kind (7);
  • Awkward (1)–Natural (7);
  • Machine-like (1)–Human-like (7);
  • Non-self-aware (1)–Self-aware (7);
  • Unrefined movement (1)–Sophisticated movement (7);
  • Foolish (1)–Clever (7);
  • Unreliable (1)–Trustworthy (7).
The feelings of the participants were recorded after each session. Two seven-point semantic differential scales were used:
  • I felt anxious (1)–I felt relieved (7);
  • I felt restless (1)–I felt relaxed (7).
The participants also answered a question to assess whether their emotions were conveyed to the robot. A seven-point semantic differential scale was used:
  • Do not agree at all (1)–Completely agree (7).
The questionnaire also asked the participant if the robot understood the emotions. A seven-point semantic differential scale was used:
  • Do not agree at all (1)–Completely agree (7).
In addition, the questionnaire asked if the participant would like to see the robot again.
  • Do not agree at all (1)–Completely agree (7).
Finally, a free-text field was provided.
After the session “Interaction with humanoid robot through nondesigned social touch” described in Section 2.4, the participants answered the same questionnaires again.
The questionnaire used seven-point semantic differential scales; because these are ordinal scales, a nonparametric statistical test is suitable. The data were paired, and differences among more than two treatments had to be detected. The Friedman test was therefore used. Multiple comparisons using Scheffé’s method were then performed to determine which differences between conditions were significant.
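As an illustration of this analysis pipeline, the following sketch runs the Friedman test on synthetic 7-point ratings with SciPy. SciPy does not provide Scheffé’s paired comparison for the Friedman test, so pairwise Wilcoxon signed-rank tests with Bonferroni correction are shown here as a common nonparametric substitute; the data are synthetic, not the study’s results.

```python
# Sketch: Friedman test over three paired conditions, then post hoc pairs.
# Synthetic data; Wilcoxon + Bonferroni stands in for Scheffe's method.
import numpy as np
from scipy.stats import friedmanchisquare, wilcoxon

rng = np.random.default_rng(0)
n = 41  # number of participants

# Synthetic 7-point ratings for one questionnaire item per condition.
appropriate = rng.integers(4, 8, n)
contradictory = rng.integers(2, 6, n)
no_reaction = rng.integers(1, 5, n)

stat, p = friedmanchisquare(appropriate, contradictory, no_reaction)
print("Friedman chi2 = %.2f, p = %.4f" % (stat, p))

# Pairwise post hoc comparisons (3 pairs), Bonferroni-adjusted.
pairs = [
    ("appropriate vs contradictory", appropriate, contradictory),
    ("contradictory vs no reaction", contradictory, no_reaction),
    ("appropriate vs no reaction", appropriate, no_reaction),
]
for label, a, b in pairs:
    w, pw = wilcoxon(a, b)
    print("%s: raw p = %.4f, adjusted p = %.4f"
          % (label, pw, min(1.0, pw * len(pairs))))
```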

2.6.2. Survey Questionnaire after All Sessions with Interaction with the Humanoid Robot through Designed Social Touch

After the three sessions of interaction with NAO through designed social touch, the participants answered the following summary questionnaire. The experimenter asked the participants to rank the three sessions from first to third for each of the following items.
  • I liked the robot.
  • The robot was friendly.
  • It was a natural interaction.
  • I was surprised.
  • I was nervous.
  • I want to touch the robot again.
  • I want to see the robot again.
Finally, the participants were asked to fill in a free-text field.

2.6.3. Place and Manner of Touch

For each of the three specified emotions in the session “Interaction through nondesigned social touch”, the experimenter recorded the part of NAO touched by the participant and the type of touch. A video camera was located to the side of NAO and the participant, as displayed in Figure 1. The video camera’s frame rate and resolution were 30 fps and 1920 × 1080, respectively. The session was recorded by the video camera. After the session, the experimenter reviewed the recorded video and noted the position and manner of each participant’s touch. Each instance of a participant touching a certain part of the robot in a specific manner was counted as one touch, and the percentage of each specific touch on each part was calculated for each emotion by sex.
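A tally of this form could be computed as in the following sketch; the column names and sample rows are illustrative assumptions, not the actual coded data.

```python
# Sketch: tallying coded touches and computing, for each emotion and sex,
# the percentage of touches of each (body part, manner) combination.
import pandas as pd

touches = pd.DataFrame([
    # participant, sex, emotion, part, manner (illustrative rows)
    (1, "M", "happy", "head", "stroke"),
    (2, "F", "happy", "hand", "shake"),
    (3, "M", "angry", "arm", "grab"),
    (4, "F", "sad", "hand", "hold"),
], columns=["participant", "sex", "emotion", "part", "manner"])

# Count each (emotion, sex, part, manner) combination.
counts = (touches.groupby(["emotion", "sex", "part", "manner"])
                 .size().rename("n").reset_index())

# Convert counts to percentages within each (emotion, sex) group.
totals = counts.groupby(["emotion", "sex"])["n"].transform("sum")
counts["percent"] = 100.0 * counts["n"] / totals
print(counts)
```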
A flowchart of the experiment and analysis is displayed in Figure A1.

2.7. Ethics

This experiment was reviewed and approved by the ethics committee for human participants, Department of Human and Artificial Intelligent Systems, Graduate School of Engineering, University of Fukui, No. H2020001.

3. Results

3.1. Questionnaire Responses after Each Session

A Friedman test was performed on the responses to the common questionnaire. Scheffé’s paired comparisons were made between sessions in which the robot responded differently.
The total results of the questionnaire responses after each session are displayed in Figure 5, Figure 6, Figure 7 and Figure 8. Figure 5, Figure 6 and Figure 7 show the responses to the questions related to the degree of liking toward the robot, robot’s humanity and intelligence, and degree of relaxation of the participants, respectively. Figure 8 displays the responses to the questions related to the impression of interaction sessions.
Significant differences between “appropriate reaction” and “contradictory reaction” were observed for “dislike–like” (p < 0.01), “estranged–friendly” (p < 0.01), “unkind–kind” (p < 0.01), “awkward–natural” (p < 0.01), “unreliable–trustworthy” (p < 0.01), “I felt anxious–I felt relieved” (p < 0.01), “I felt restless–I felt relaxed” (p < 0.01), “emotions were conveyed or not” (p < 0.01), “robot understood my emotions” (p < 0.01), and “I want to see the robot again” (p < 0.01).
Significant differences between “contradictory reaction” and “no reaction” were observed for “machine-like–human-like” (p < 0.01), “non-self-aware–self-aware” (p < 0.01), “unrefined movement–sophisticated movement” (p < 0.01), and “foolish–clever” (p < 0.01).
Significant differences between “appropriate reaction” and “no reaction” were observed for all items (p < 0.01).
The results of the summary questionnaire conducted after the three sessions are displayed in Figure 9.
For the responses “I like the robot”, “The robot was friendly”, “It was a natural interaction”, “I want to touch the robot again”, and “I want to see the robot again”, many participants selected “appropriate reaction” in the first place, “contradictory reaction” in the second place, and “no reaction” in the third place. For the item “I was surprised”, many participants selected “contradictory reaction” in the first place, “appropriate reaction” in the second place, and “no reaction” in the third place. For the item “I was nervous”, the participants selected “no reaction” in the first place, “contradictory reaction” in the second place, and “appropriate reaction” in the third place.
The results of the common questionnaire presented after the session “Interaction with humanoid robot through nondesigned social touch” are displayed in Figure 5, Figure 6, Figure 7 and Figure 8.
For all items, the robot received higher scores, that is, more positive evaluations, than in the previous sessions in which it showed appropriate, contradictory, and no reactions. The significant differences from the appropriate reactions were for “unrefined movement–sophisticated movement” (p < 0.01), “foolish–clever” (p < 0.01), and “robot understood the emotions” (p < 0.01). Significant differences from the contradictory reactions were observed for all items except “non-self-aware–self-aware”. Significant differences from “no reaction” were observed for all items (p < 0.01).

3.2. Analysis of Touched Area and Manner of Touch

The results of the touched area and manner of touch in the session “Interaction through nondesigned touch” are presented in Table 4. Because of the mishandling of video data, the data could not be collected for one participant. Therefore, we analyzed the data for 40 participants.

4. Discussion

4.1. Results of Participant’s Responses to Questionnaire after Each Session of Designed Social Touch

First, Figure 5, Figure 6 and Figure 7 are used to describe the overall trend. For the designed social touch sessions, the robot’s overall questionnaire scores were highest for the appropriate reactions. Thus, the robot’s appropriate reaction to social touch made it easier for the participants to form a positive impression of the robot and to relax. This result is consistent with the results of previous studies [22,27]. The contradictory reactions had the second highest scores throughout the survey, and for some items their scores were comparable to those of the appropriate reactions. Although the scores of “no reaction” were slightly higher than those of the contradictory reactions for some items, the overall scores of “no reaction” were low.
The items that exhibited significant differences between the appropriate and contradictory reactions were those that evaluated the degree of liking toward the robot (Figure 5) and the degree of relaxation of the participants (Figure 7). The items that exhibited no significant difference were “machine-like–human-like”, “non-self-aware–self-aware”, “unrefined movement–sophisticated movement”, and “foolish–clever”, which evaluated the robot’s humanity and intelligence (Figure 6). These results revealed that the participants’ perception of the robot’s animate qualities did not change considerably between the appropriate and contradictory reactions. The fact that the robot clearly asserted itself in response to the participant’s touch with contradictory reactions such as “I wish you would stop” and “I’m scared” may have led to the perception that the robot is a self-aware entity.
The items that exhibited significant differences with p < 0.01 between the appropriate and contradictory reactions were those that evaluated the degree of liking toward the robot (Figure 5), degree of relaxation of the participants (Figure 7), and the item “I want to see the robot again” (Figure 8). This result could be attributed to the fact that NAO responded to three of the four touch patterns for the appropriate reactions in a pleasing manner. The participants also felt more comfortable with the response of the robot, and this led to their willingness to continue the interaction. Furthermore, the participants rated the robot significantly higher when it responded appropriately, indicating that their emotions were conveyed to the robot and the robot understood them (Figure 8).
In the experiment, the participants were given instructions on how to touch the robot, but specific intentions or emotions for the touch were not included in the instructions from the experimenter. Figure 8 reveals that the majority of the responses for both “emotions were conveyed to the robot” and “the robot understood the emotions” of the contradictory reactions were below the midpoint of the scale (4 points), indicating that the overall perception of the participants regarding the robot was that it did not understand the participants’ emotions. The participants expected some response from the robot even when they followed the instructions and performed the intended social touch.
Thus, the participants expected the robot to respond positively to a positive social touch, such as stroking its head, touching its hand, or stroking its arm, and to respond negatively to a negative social touch, such as grabbing its arm tightly, which is how the appropriate reactions were designed. Conversely, the contradictory reactions are considered a disappointment of the participants’ expectations.
Significant differences were observed between contradictory and no reactions for items related to the robot’s humanity and intelligence (Figure 6), whereas no significant differences were observed between the appropriate and contradictory reactions. This result, which indicated that the robot’s display of reactive behavior has a strong influence on the participants’ perception of the robot’s humanity and intelligence, is consistent with the results of a previous study [30]. Significant differences of p < 0.01 were also observed for “my emotions were conveyed to NAO” and “NAO understood my emotions” (Figure 8) because the robot did not exhibit any reaction and the participants could not determine whether the robot understood their emotions or not.
A significant difference was observed between the appropriate and no reactions for all items. The appropriate reactions were scored significantly higher for the intelligence-related items because of the nature of the robot’s response, and they were considered to have increased likability and generated a feeling of relief toward the robot.

4.2. Summary Survey Questionnaire Results

From Figure 9, many participants selected “appropriate reaction” as the top reaction for the items “I liked the robot”, “The robot was friendly”, “It was a natural interaction”, “I want to touch the robot again”, and “I want to see the robot again”. This result was consistent with the tendency of the participants to give higher scores to the appropriate reactions for the items related to likability, friendliness (Figure 5), and willingness to continue the interaction (Figure 8) in the common questionnaire. A participant stated in the free-text field of the questionnaire for the contradictory reactions that “I felt like a type of person with a lot of personal space, which made me feel familiar”, suggesting that some participants might have found the contradictory reactions favorable.
“Contradictory reactions” was the most frequently chosen response for “I was surprised”. As in the common questionnaire, the contradictory responses of the robot tended to surprise people more than the appropriate reactions and no reactions.
For the item “I was nervous”, “no reaction” tended to be selected first. NAO’s lack of reaction may have prevented the participants from feeling relaxed because they were unsure whether their social touch was appropriate. Thus, the robot’s appropriate reaction might have eased the participants’ nervousness.

4.3. Analysis of Open-Ended Statements after Designed Social Touch Sessions

Regarding appropriate reactions, many participants, both men and women, wrote “friendly”, “human-like”, and “relieved”. The primary responses to contradictory reactions were “I got anxious”, “I felt sad”, and “I felt scared”. Some participants described feeling anxious or fearful when NAO responded in an unexpected manner, whereas other participants described feeling guilty or sad when NAO reacted in a manner that made them feel rejected. Regarding “no reaction”, many participants answered “I got anxious” or “I didn’t know what to do”. They described that they felt anxious because of the lack of response and did not know how to react. Several participants commented “The robot and I were eye-to-eye” because the face-following function of the robot was enabled by the experimenter.
These statements are consistent with the results of the common questionnaire, indicating that the designed reactive behavior was correctly perceived by the participants as intended. However, in the free-text response in the questionnaire for contradictory reactions, a participant responded, “I felt like a person with a large personal space, and I was familiar with the robot”. Furthermore, in the summary questionnaire, the participant also responded that the contradictory reaction was the most favorable and friendly. The contradictory reaction did not always create a negative impression.

4.4. Results of Questionnaire Responses after Interaction through Nondesigned Social Touch

Figure 5, Figure 6, Figure 7 and Figure 8 reveal that, overall, the results of the questionnaire after “interaction through nondesigned social touch” exhibited higher scores than any of the reactions in “interaction through designed social touch”. Significant differences were observed for all items relative to the contradictory reactions, and for all items relative to “no reaction” (p < 0.01). As in the case of “interaction through designed social touch”, NAO exhibited reasonable response behavior to the touch, which may have been evaluated more positively than the contradictory reaction or no reaction. Notably, there were significant differences between the results of “interaction through nondesigned social touch” and the appropriate reactions for some items (p < 0.05). In the appropriate reactions, the robot responded negatively when the participants performed the “grabbing the arm strongly” action and positively in all other cases. Not only the difference between a positive and a negative response but also the verbalization of the response caused the difference in evaluation.
Because no significant differences were observed between the results of “interaction through nondesigned social touch” and the appropriate reactions for “dislike–like” and “estranged–friendly” (Figure 5), the difference in NAO’s response behavior to simple touch versus emotional touch did not appear to affect likability or friendliness.
Significant differences were observed between the results of “interaction through nondesigned social touch” and the appropriate responses for the “foolish–clever” item (Figure 6). For “interaction through nondesigned social touch”, NAO did not merely express its impression of the social touch but, as mentioned above, it responded by asking appropriate questions to the participants. NAO’s words and actions enhanced its intellectual impression.
The scores for “unrefined movement–sophisticated movement” (Figure 6) were also significantly higher than those for interaction through designed social touch. NAO’s motions for “stroking the head” and “touching the hand” under the appropriate reactions were designed by the experimenter, whereas all of NAO’s motions in “interaction through nondesigned social touch” used motions provided by NAO’s SDK. The SDK motions appeared smoother.
The scores for “interaction through nondesigned social touch” with regard to human-like and intellectual impressions (Figure 6) were significantly higher than in the previous sessions. In “interaction through nondesigned social touch”, each touch carried the intention of conveying a specific emotion. Compared with a purposeless touch, or a purposeful touch that was not unified across participants, performing a touch with the purpose of conveying an emotion and then observing NAO’s response behavior gave the participants a stronger sense that the purpose had been accomplished; that is, the participants inferred that NAO correctly understood the intent of the touch, and they were more likely to respond that “my emotions were conveyed to the robot” and “the robot understood my emotions” (Figure 8).

4.5. Analysis of Place and Manner of Touch

Table 4 shows that there were differences between male and female participants in the parts of the robot they touched and the manner in which they touched it. To convey the emotion “happy”, the head was the part most frequently touched by men, and stroking the robot’s head was the most common method of communicating the emotion. Women touched the robot’s hands most frequently and selected various touching methods, such as stroking, grasping, and shaking; the head and arms were the next most frequently touched parts.
Both men and women expressed “sadness” by touching various parts of the body, including the head, shoulders, arms, and hands. Men often held the robot’s hand. They tried to convey their emotions to the robot with slow movements, such as stroking the head, stroking the shoulder, and touching the hand. Women also touched the robot in a calm manner by stroking its head, arm, or hand.
To convey anger, men often used strong force, such as grabbing the arms and hitting the head, whereas some participants preferred milder actions, such as patting the hands, arms, and shoulders. Both men and women selected movements such as grasping the arm and hitting the head, and some participants used slow movements such as stroking the hands. Only the female participants chose “poking” as a method of touching.
Most male participants touched the robot only once, and only one touched multiple parts. More female than male participants touched the robot multiple times, and four female participants either contacted various parts of the robot or used several touching methods within a single trial. Female participants also showed greater variation in touching the robot than male participants: poking, grabbing the robot’s hand, and shaking the hand were behaviors exhibited only by female participants.
This result is similar to that of a previous study [29], which revealed that when emotions were to be communicated by simply touching NAO, the participants preferred to touch the hand or arm. This held for the expression of “anger”, conveyed by touching with strong force, and for the “happy” and “sad” emotions, conveyed by stroking. Furthermore, women were more likely to express the “happy” emotion by shaking NAO. However, in the present study, the head tended to be the most frequently selected site for conveying the “happy” and “sad” emotions, possibly because the participants were instructed to “stroke the head” during the “interaction through designed touch” experiment and retained this action in their minds when touching the robot. Comparison with the results of the previous study [29], conducted in Sweden, suggests that cultural differences between Japan and Sweden may have led to differences in emotional touch.

4.6. Consideration of Open-Ended Statement after Nondesigned Social Touch Session

Many participants described that their emotions were accurately conveyed to NAO or that NAO understood the conveyed emotions. Other participants commented, “I felt the communication was very natural” and “the session was the most communicative”. The robot’s responses thus appropriately indicated that it understood the participants’ emotions, demonstrating that this experiment achieved a high level of emotional communication.

5. Conclusions

In this study, experiments on human–robot touch interaction were conducted to investigate the effects of the robot’s response behavior to human social touch on people’s impressions of the robot. First, through “Interaction through designed social touch”, we investigated the psychological effects on the participants according to the presence or absence of the robot’s reaction and whether the robot’s reaction was appropriate or contradictory. Three types of robot response behaviors were set, namely appropriate, contradictory, and no reactions. The participants’ actions were set as “stroking the head”, “touching the hand”, “stroking the arm”, and “grabbing the arm strongly”, and the robot exhibited an appropriate, contradictory, or no reaction to each touch. The participants rated their impressions of the robot and their own mental states in a questionnaire. The appropriate reactions of the robot tended to gain high likability and generated a feeling of relief and a desire to continue the interaction. The contradictory reactions tended to be less favorable and made the participants feel anxious and upset. However, the scores of the items related to intellectual impression tended to be high, and no significant difference was observed between the contradictory and appropriate reactions for these items. No reaction was negatively evaluated for all items.
Next, we conducted an “Interaction through nondesigned social touch” session in which the participants were asked to touch the robot to convey a specific emotion. The robot was designed to respond to touches that conveyed an emotion: it responded in a sympathetic manner when the participant tried to convey the “happy” emotion and queried the participant when they conveyed the “sad” or “angry” emotions. Using the same questionnaire as in the previous experiment, the participants evaluated their impressions of the robot and their own mental states. Comparison of the questionnaire results revealed that the “Interaction through nondesigned social touch” session, in which the participants tried to convey an emotion through social touch, was associated with a greater feeling of relief, a more intelligent impression of the robot, a greater willingness to continue interacting with the robot, and a higher degree of communication achievement. The participants had a more positive impression of the robot when they performed a social touch of their own choice than when they were instructed on how to touch the robot.
Analysis of the part of the robot touched by the participants and the type of touch revealed that male participants stroked the robot’s head more often, whereas female participants stroked or held the robot’s hand more often when conveying the “happy” emotion. When conveying the “sad” emotion, both male and female participants touched the robot’s hand more often, and they mostly made slow movements, such as stroking, touching, or holding the robot’s hand. To convey anger, the participants made forceful movements, such as grabbing the arm or hand strongly. Most male participants touched the robot only once, whereas some female participants touched the robot several times or touched various parts of the robot’s body to convey a particular emotion. Only female participants grabbed the robot’s arms and hands and swung them.
In the future, the robot’s reactive behavior should be made autonomous, and interactions that convey other emotions should be analyzed. The robot used in this study was not equipped with touch sensors over its entire body. Therefore, the robot could not respond autonomously when touched in areas not covered by sensors, and a person controlled the robot behind the scenes, making it appear to the participants as if the robot was responding autonomously. Because people touched various parts of the robot, the robot should be equipped with touch sensors all over its body to realize autonomous free-touch interaction between people and the robot. It is also necessary to investigate how people touch the robot to convey emotions other than the three presented to the participants, such as “fun” and “fear”, and to identify the responses that have psychological effects.
As mentioned in Section 4.5, the participants in the last session, interaction through nondesigned social touch, could have retained the actions instructed in the previous sessions of interaction through designed social touch. The participants might therefore have believed that only a limited variety of actions toward the robot was available. The experimental design for the nondesigned social touch should be reconsidered to eliminate such unwanted influences; for example, the nondesigned social touch experiment could be conducted independently with other participants.
In this study, a small humanoid robot was used to investigate how the responses of the robot to the human social touch affect the impression of the robot on the participants. The appearance of the robot may considerably affect the impressions of the robot on people. Subsequent studies will be conducted on other types of robots.

Author Contributions

Conceptualization, M.O. and Y.T.; methodology, M.O. and Y.T.; software, M.O.; validation, M.O. and Y.T.; formal analysis, M.O. and Y.T.; investigation, M.O.; resources, Y.T. and S.T.; data curation, M.O. and Y.T.; writing—original draft preparation, M.O.; writing—review and editing, Y.T. and S.T.; visualization, M.O. and Y.T.; supervision, Y.T. and S.T.; project administration, Y.T.; funding acquisition, Y.T. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by JSPS KAKENHI Grant Number 20K12754.

Institutional Review Board Statement

This experiment was reviewed and approved by the ETHICS COMMITTEE FOR HUMAN PARTICIPANTS, Department of Human and Artificial Intelligent Systems, Graduate School of Engineering, University of Fukui (Approval No. H2020001).

Informed Consent Statement

Informed consent was obtained from all participants involved in the study.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Figure A1. Flowchart of the experiment and analysis.

References

  1. Wada, K.; Shibata, T. Social and physiological influences of living with seal robots in an elderly care house for two months. Gerontechnology 2008, 7, 235. [Google Scholar] [CrossRef]
  2. Shibata, T. Importance of physical interaction between human and robot for therapy. In Proceedings of the Universal Access in Human-Computer Interaction: Applications and Services, 6th International Conference (UAHCI 2011), Orlando, FL, USA, 9–14 July 2011; Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Berlin/Heidelberg, Germany, 2011; Volume 6768, pp. 437–447. [Google Scholar] [CrossRef]
  3. Bruce, A.; Nourbakhsh, I.; Simmons, R. The role of expressiveness and attention in human–robot interaction. In Proceedings of the 2002 IEEE International Conference on Robotics and Automation, Washington, DC, USA, 11–15 May 2002; pp. 4138–4142. [Google Scholar] [CrossRef]
  4. Takano, E.; Chikaraishi, T.; Matsumoto, Y.; Nakamura, Y.; Ishiguro, H.; Sugamoto, K. Psychological effects on interpersonal communication by bystander android using motions based on human-like needs. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2009), St. Louis, MO, USA, 10–15 October 2009; pp. 3721–3726. [Google Scholar] [CrossRef]
  5. Kanoh, M.; Iwata, S.; Kato, S.; Itoh, H. Emotive Facial Expressions of Sensitivity Communication Robot “Ifbot”. Kansei Eng. Int. 2005, 5, 35–42. [Google Scholar] [CrossRef]
  6. Itoh, K.; Miwa, H.; Matsumoto, M.; Zecca, M.; Takanobu, H.; Roccella, S.; Carrozza, M.C.; Dario, P.; Takanishi, A. Various emotional expressions with emotion expression humanoid robot WE-4RII. In Proceedings of the 2004 1st IEEE Technical Exhibition Based Conference on Robotics and Automation, Proceedings (TExCRA 2004), Minato, Japan, 18–19 November 2004; pp. 35–36. [Google Scholar] [CrossRef]
7. Terada, K.; Yamauchi, A.; Ito, A. Artificial emotion expression for a robot by dynamic color change. In Proceedings of the IEEE International Workshop on Robot and Human Interactive Communication, Paris, France, 9–13 September 2012; pp. 314–321.
8. Plutchik, R. Emotions and Life: Perspectives from Psychology, Biology, and Evolution; American Psychological Association: Washington, DC, USA, 2002.
9. Takahashi, Y.; Kayukawa, Y.; Terada, K.; Inoue, H. Emotional Expressions of Real Humanoid Robots and Their Influence on Human Decision-Making in a Finite Iterated Prisoner's Dilemma Game. Int. J. Soc. Robot. 2021, 13, 1777–1786.
10. Terada, K.; Takeuchi, C. Emotional expression in simple line drawings of a robot's face leads to higher offers in the ultimatum game. Front. Psychol. 2017, 8, 1–9.
11. De Melo, C.M.; Carnevale, P.; Gratch, J. The influence of emotions in embodied agents on human decision-making. In Proceedings of the Intelligent Virtual Agents, 10th International Conference (IVA 2010), Philadelphia, PA, USA, 20–22 September 2010; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2010; Volume 6356, pp. 357–370.
12. De Melo, C.M.; Carnevale, P.; Gratch, J. The effect of expression of anger and happiness in computer agents on negotiations with humans. In Proceedings of the 10th International Conference on Autonomous Agents and Multiagent Systems (AAMAS 2011), Taipei, Taiwan, 2–6 May 2011; Volume 3, pp. 937–944.
13. Jimenez, F.; Yoshikawa, T.; Furuhashi, T.; Kanoh, M. An emotional expression model for educational-support robots. J. Artif. Intell. Soft Comput. Res. 2015, 5, 51–57.
14. Kawahara, M.; Sawada, Y.; Tanaka, A. Emotion Perception and Altruistic Behavior to Humanoid Robot. In Proceedings of the 35th Annual Meeting of the Japanese Cognitive Science Society, Ibaraki, Japan, 30 August–1 September 2018; pp. 847–856. (In Japanese)
15. Hertenstein, M.J.; Holmes, R.; McCullough, M.; Keltner, D. The Communication of Emotion via Touch. Emotion 2009, 9, 566–573.
16. Hertenstein, M.J.; Keltner, D.; App, B.; Bulleit, B.A.; Jaskolka, A.R. Touch communicates distinct emotions. Emotion 2006, 6, 528–533.
17. App, B.; McIntosh, D.N.; Reed, C.L.; Hertenstein, M.J. Nonverbal Channel Use in Communication of Emotion: How May Depend on Why. Emotion 2011, 11, 603–617.
18. Harlow, H.F. The Nature of Love. Am. Psychol. 1958, 13, 673–685.
19. Light, K.C.; Grewen, K.M.; Amico, J.A. More frequent partner hugs and higher oxytocin levels are linked to lower blood pressure and heart rate in premenopausal women. Biol. Psychol. 2005, 69, 5–21.
20. Crusco, A.H.; Wetzel, C.G. The Midas Touch: The Effects of Interpersonal Touch on Restaurant Tipping. Personal. Soc. Psychol. Bull. 1984, 10, 512–517.
21. Watanabe, H.; Takahashi, M. Analysis of Interaction during Conversation between Nursery Department Students and Humanoid Robot Pepper. Bull. Educ. Found. Koike Gakuen 2021, 19, 23–39. (In Japanese)
22. Block, A.E.; Kuchenbecker, K.J. Emotionally Supporting Humans Through Robot Hugs. In Proceedings of the ACM/IEEE International Conference on Human–Robot Interaction, Chicago, IL, USA, 5–8 March 2018; pp. 293–294.
23. Willemse, C.J.; van Erp, J.B. Social Touch in Human–Robot Interaction: Robot-Initiated Touches Can Induce Positive Responses without Extensive Prior Bonding. Int. J. Soc. Robot. 2019, 11, 285–304.
24. Hieida, C.; Abe, K.; Nagai, T.; Omori, T. Walking Hand-in-Hand Helps Relationship Building Between Child and Robot. J. Robot. Mechatron. 2020, 32, 8–20.
25. Fukuda, H.; Shiomi, M.; Nakagawa, K.; Ueda, K. 'Midas touch' in human–robot interaction: Evidence from event-related potentials during the ultimatum game. In Proceedings of the 7th Annual ACM/IEEE International Conference on Human–Robot Interaction (HRI'12), Boston, MA, USA, 5–8 March 2012; pp. 131–132.
26. Nakata, A.; Shiomi, M.; Kanbara, M.; Hagita, N. Does being hugged by a robot encourage prosocial behavior? In Proceedings of the ACM/IEEE International Conference on Human–Robot Interaction (HRI'17), Vienna, Austria, 6–9 March 2017; pp. 221–222.
27. Block, A.E.; Seifi, H.; Hilliges, O.; Gassert, R.; Kuchenbecker, K.J. In the Arms of a Robot: Designing Autonomous Hugging Robots with Intra-Hug Gestures. ACM Trans. Hum.-Robot Interact. 2022, 1–47.
28. Hirano, T.; Shiomi, M.; Iio, T.; Kimoto, M.; Tanev, I.; Shimohara, K.; Hagita, N. How Do Communication Cues Change Impressions of Human–Robot Touch Interaction? Int. J. Soc. Robot. 2018, 10, 21–31.
29. Andreasson, R.; Alenljung, B.; Billing, E.; Lowe, R. Affective Touch in Human–Robot Interaction: Conveying Emotion to the Nao Robot. Int. J. Soc. Robot. 2018, 10, 473–491.
30. Takayoshi, K.; Tanaka, T. The Relationships between the Behavior of KUWATA and the Impression of His Intelligence/Personality; IEICE Technical Report HIP2007-64; The Institute of Electronics, Information and Communication Engineers: Tokyo, Japan, 2007. (In Japanese)
Figure 1. Experimental environment.
Figure 2. Frames from videos of appropriate reactions of NAO to human touch. (a) Case: stroking the head; (b) Case: touching the hand; (c) Case: stroking the arm; (d) Case: grabbing the arm strongly.
Figure 3. Frames from videos of contradictory reactions of NAO to human touch. (a) Case: stroking the head; (b) Case: touching the hand; (c) Case: stroking the arm; (d) Case: grabbing the arm strongly.
Figure 4. Frames from videos of reactions of NAO to human touch with emotions. (a) Reaction to “Happy” emotion; (b) Reaction to “Sad” emotion; (c) Reaction to “Angry” emotion.
Figure 5. Results of questionnaire responses after each session (1/4): degree of liking toward the robot.
Figure 6. Results of questionnaire responses after each session (2/4): robot’s humanity and intelligence.
Figure 7. Results of questionnaire responses after each session (3/4): degree of relaxation of the participants.
Figure 8. Results of questionnaire responses after each session (4/4): impressions of the sessions.
Figure 9. Results of summary questionnaire conducted after three sessions.
Table 1. Appropriate reactions of NAO to human touch.

| Participant's Touching Action | NAO Response | Utterance | LED Color |
|---|---|---|---|
| stroking the head | small back-and-forth movements of the head and spreading of arms | "Ehehe, I'm so happy." | yellow |
| touching the hand | lifts the touched hand up to chest level | "Uhuhu, it tickles me." | yellow |
| stroking the arm | slightly raises both arms | "Ehe." | yellow |
| grabbing the arm strongly | attempts to shake the arm off | "Ouch!" | red |
Table 2. Contradictory response of NAO to each touching action.

| Participant's Touching Action | NAO Response | Utterance | LED Color |
|---|---|---|---|
| stroking the head | turns its face down | "I wish you would stop." | pink |
| touching the hand | pulls its hands back | "Ouch!" | red |
| stroking the arm | looks down, clasps its hands in front of its chest | "I'm scared." | dark green |
| grabbing the arm strongly | lightly spreads both arms | "I'm ticklish." | yellow |
Table 3. Responses of NAO to emotions conveyed through the participant's social touch.

| Emotion Conveyed by the Participant | NAO Response | Utterance | LED Color |
|---|---|---|---|
| Happy | raises its arms | "I'm ticklish" | yellow |
| Sad | opens its arms | "Do you want to hug me?" | light pink |
| Angry | pulls its arms | "What's wrong?" | light blue |
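Tables 1–3 together define a small lookup from a classified touch gesture, or a conveyed emotion, to a coordinated response (body motion, utterance, and LED color). The following Python sketch illustrates one way such a dispatch table could be wired to the robot; it is an illustration, not the authors' implementation. The ALTextToSpeech and ALLeds proxies are standard NAOqi SDK modules, but the robot address, the gesture labels, and the motion hook noted in the comments are assumptions.

```python
from naoqi import ALProxy  # NAOqi Python SDK (Python 2)

NAO_IP, NAO_PORT = "nao.local", 9559  # placeholder robot address

tts = ALProxy("ALTextToSpeech", NAO_IP, NAO_PORT)
leds = ALProxy("ALLeds", NAO_IP, NAO_PORT)

# Transcription of Table 1: touch gesture -> (utterance, face-LED color).
# The gesture labels are hypothetical outputs of an upstream touch classifier.
APPROPRIATE = {
    "stroke_head": ("Ehehe, I'm so happy.", 0xFFFF00),   # yellow
    "touch_hand":  ("Uhuhu, it tickles me.", 0xFFFF00),  # yellow
    "stroke_arm":  ("Ehe.", 0xFFFF00),                   # yellow
    "grab_arm":    ("Ouch!", 0xFF0000),                  # red
}

def react(gesture, table=APPROPRIATE):
    """Speak and recolor the face LEDs for one classified touch gesture."""
    utterance, rgb = table[gesture]
    leds.fadeRGB("FaceLeds", rgb, 0.3)  # fade the face LEDs over 0.3 s
    tts.say(utterance)
    # The accompanying body motions of Tables 1-3 (e.g., spreading the arms)
    # would be triggered here, for instance via ALAnimationPlayer or a
    # prerecorded behavior.

react("stroke_head")
```

Swapping in the Table 2 or Table 3 mapping reproduces the contradictory and emotion-conditioned conditions without changing the control flow, which is one practical argument for keeping the response policy in data rather than code.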
Table 4. Analysis of the participants' touch.

| Body Part | Touch Type | Happy (Male) | Happy (Female) | Sad (Male) | Sad (Female) | Anger (Male) | Anger (Female) |
|---|---|---|---|---|---|---|---|
| head | stroke | 0.42 | 0.17 | 0.12 | 0.15 | 0.05 | 0.00 |
| head | touch | 0.04 | 0.00 | 0.03 | 0.05 | 0.00 | 0.00 |
| head | grab | 0.00 | 0.00 | 0.00 | 0.05 | 0.02 | 0.00 |
| head | hit | 0.00 | 0.00 | 0.00 | 0.00 | 0.07 | 0.14 |
| arm | stroke | 0.00 | 0.13 | 0.12 | 0.05 | 0.05 | 0.00 |
| arm | touch | 0.00 | 0.00 | 0.03 | 0.15 | 0.00 | 0.00 |
| arm | grab | 0.08 | 0.00 | 0.00 | 0.00 | 0.24 | 0.18 |
| arm | hit | 0.04 | 0.03 | 0.00 | 0.00 | 0.07 | 0.07 |
| arm | poke | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.04 |
| arm | swing | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.03 |
| hand | stroke | 0.00 | 0.27 | 0.12 | 0.15 | 0.05 | 0.14 |
| hand | touch | 0.00 | 0.03 | 0.03 | 0.15 | 0.00 | 0.00 |
| hand | grab | 0.27 | 0.27 | 0.21 | 0.15 | 0.14 | 0.14 |
| hand | hit | 0.00 | 0.00 | 0.00 | 0.00 | 0.07 | 0.03 |
| hand | swing | 0.00 | 0.09 | 0.00 | 0.00 | 0.00 | 0.00 |
| shoulder | stroke | 0.08 | 0.03 | 0.17 | 0.00 | 0.05 | 0.00 |
| shoulder | touch | 0.04 | 0.00 | 0.04 | 0.05 | 0.00 | 0.00 |
| shoulder | grab | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.04 |
| shoulder | hit | 0.00 | 0.00 | 0.00 | 0.00 | 0.07 | 0.11 |
| shoulder | poke | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.04 |
| torso | stroke | 0.04 | 0.03 | 0.12 | 0.00 | 0.08 | 0.00 |
| torso | touch | 0.00 | 0.00 | 0.03 | 0.05 | 0.00 | 0.00 |
| torso | hit | 0.00 | 0.00 | 0.00 | 0.00 | 0.07 | 0.00 |
| face | hold | 0.00 | 0.00 | 0.03 | 0.00 | 0.00 | 0.00 |
| tiptoe | touch | 0.00 | 0.00 | 0.00 | 0.05 | 0.00 | 0.00 |
| tiptoe | hit | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.03 |
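The values in Table 4 appear to be relative frequencies: within each emotion–gender column they sum to roughly 1 (up to rounding). A minimal pandas sketch of how such proportions could be computed from per-touch annotations is given below; the raw-data layout is an assumption for illustration, not the authors' actual analysis pipeline.

```python
import pandas as pd

# One row per annotated touch: emotion conveyed, participant gender,
# body part touched, and touch type (illustrative rows only).
touches = pd.DataFrame([
    ("Happy", "Male", "head", "stroke"),
    ("Happy", "Male", "hand", "grab"),
    ("Angry", "Female", "arm", "grab"),
    # ... one row per observed touch
], columns=["emotion", "gender", "body_part", "touch_type"])

# Count touches per (body part, touch type) within each emotion-gender
# group, then normalize so each column sums to 1, as in Table 4.
counts = touches.groupby(
    ["emotion", "gender", "body_part", "touch_type"]).size()
proportions = counts / counts.groupby(["emotion", "gender"]).transform("sum")
print(proportions.round(2))
```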
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
