The current trend in the Human–Robot Interaction (HRI) field is to endow robotic platforms with social intelligence to create a more trustworthy and humanlike interaction, for example, by mimicking the human’s behavior [1]. One step towards the development of such a capability is to integrate into the robotic platform a complete understanding of the profile of the person it is interacting with [2], which allows the robot to adapt its behavior to the user’s preferences and needs. Advances in artificial intelligence, and in machine learning (ML) especially, aim at endowing robotic systems with powerful capabilities for interacting [3]. For instance, the work in [4] presents a personalized ML model that automatically detects the affective states and engagement of children involved in robot-assisted therapy. As reported in [3], the computational personalization of HRI based on deep learning is still an open challenge, due to real-world complexities and to the lack of clarity about which aspects the robot should be able to perceive.
Among the fundamental aspects that the robot should take into account is the personality of the user, since it has been considered a vital factor in understanding the quality of HRI [6]. As described in [7], personality represents a coherent patterning of behaviors that can be used to explain how people interact with others in social settings. Similarly, how a person behaves during the interaction can be affected by temporal mental states, which are reflected by emotions and different levels of engagement. Both temporal traits (i.e., emotion and engagement) may fluctuate during the interaction, or their dynamics may change in response to salient events that occur during it [8]. In HRI, several works [9] focused on inferring the engagement of a person by analyzing the non-verbal cues (e.g., nodding, mutual and direct gaze, the timing of the response to stimuli) of the human beings interacting with the social robots. When the interaction occurs with the elderly population, additional aspects of the profile may be considered, such as possible cognitive and physical impairments.
Given the advances in the communication modalities of recent robotic platforms, social robots are increasingly used in assistive contexts as support for the assessment and the cognitive stimulation of elderly people [12]. Robots can be programmed to perform specific and repetitive actions, with the advantage of performing them with the required standardization [16]. Compared to the technologies usually used in this context, such as tablets and personal computers [19], socially assistive robots prove more empathetic and engaging. This approach provides support to professional caregivers, allowing multiple screenings and alleviating their workload while promoting sociality among the older population. Unlike previous works on cognitive assessment with robot technology [15], this study exploits a non-humanlike robot, ASTRO. It is endowed with simple social behaviors, such as speech synthesis (verbal behavior) and LED illumination (non-verbal behavior).
Similar to [20], we would like to explore the role of the main factors that could affect the interaction of ASTRO with elderly people: Relatively permanent qualities (i.e., cognitive profile and personality) and temporal traits (i.e., emotional and engagement states) of the interacting elderly users. In particular, this feasibility study aims to investigate the role of each factor (i.e., cognitive profile, personality, emotional and engagement state) in the perception of the robot, in its usability, and in the interaction itself. Our research questions can be summarized as follows:
Do the users’ attitude toward the technology and their perception of it change after the interaction with the robot? (RQ1).
Does the cognitive mental state influence the usability and the user’s perception of the robot in this scenario? (RQ2).
Do the personality traits influence the usability and the user’s perception of the robot in this scenario? If so, which ones? (RQ3).
Does the current emotion influence the usability and the user’s perception of the robot in this scenario? If so, which one? (RQ4).
Which factors may influence the interaction in the assistive scenario? (RQ5).
Finally, this work discusses the results, suggesting some guidelines for developing socially assistive robots able to adapt and personalize their behavior to the interacting participant in an assistive context.
2. Related Works
Socially assistive robotics (SAR) aims to assist human users by creating effective and close interactions to achieve measurable progress in convalescence and rehabilitation [21]. In healthcare interventions, robotic solutions have been designed as a mechanism to support mental health activities ranging from screening to administering on-demand therapy [22]. In recent years, several solutions adopted a robotic platform as support for cognitive therapy, assessment, and stimulation [23]. As reported in Table 1, the robotic system is usually programmed to administer the overall test to older patients, under the supervision of a professional clinician. Human-like robots are mostly used (i.e., NAO, Pepper). Due to their appearance, human-like robots are thought to facilitate social interaction and communication, as they possess all the features necessary to convey social signals [17]. As shown in Table 1, the authors in [24] adopted a non-humanlike robot (i.e., the Giraff robot) for cognitive stimulation. Namely, the robot was not endowed with special social behaviors and was not able to give feedback on the elderly users’ performance [24]. Differently from [24], the robot adopted in this study resembles the human body shape (i.e., a head with eyes, a torso). Furthermore, it is endowed with some communicative skills which allow it to motivate the user and to exhibit some social behaviors, as will be described in Section 3.2.
Related works of socially assistive robotics in cognitive assessment, stimulation, and therapy, tested with elderly users.
| Reference | Year | Robot | Aim | Robot Role | Human Role | Outcome |
| --- | --- | --- | --- | --- | --- | --- |
|  | 2009 | Bandit | Cognitive Therapy | Guiding the patient in performing the game | Supervision | Robot encouragement improves response time. |
|  | 2017 | NAO | Cognitive Therapy | Guiding the patient in performing the game | Supervision | Robot acceptability increases after the interaction. |
|  | 2018 | Pepper | Cognitive Assessment | Overall administration of the test | Supervision | Robot improved socialization. |
|  | 2019 | Pepper | Cognitive Assessment | Overall administration of the test (1st time) | Overall administration of the test (2nd time) | Validation of the robotic assessment. |
|  | 2019 | NAO | Cognitive Therapy | Overall administration of the test (1st time) | Overall administration of the test (2nd time) | Analysis of nonverbal behavior revealed more engagement with the robot than with the clinician. |
|  | 2019 | Giraff | Cognitive Stimulation | Overall administration of the test | Supervision | Older people accepted the guidance of a robot, feeling comfortable with it explaining and supervising the tests instead of a clinician. |
|  | 2020 | NAO | Cognitive Stimulation | Overall administration of the test (1st time) | Overall administration of the test (2nd time) | Performing stimulation exercises with the robot enhanced the therapeutic effect of the exercise itself, reducing depression-related symptoms in some cases. |
|  | 2021 | Pepper | Cognitive Therapy | Guiding the patient in performing the game | Supervision | Robot improved socialization. |
| This work | 2021 | ASTRO | Cognitive Assessment | Overall administration of the test | Supervision | Robot reduced anxiety and incentivized the interaction. |
The main success of the adoption of robotic solutions in this context relies on the enhanced engagement of the participants compared to a non-embodied computer screen [16]. In previous works in the SAR field, engagement is usually evaluated through questionnaires (i.e., the User Engagement Scale questionnaire in [26]), behavioral cues (i.e., mainly eye gaze direction [9]), and physiological signals (i.e., EEG and galvanic skin response [14]). Similarly, the authors of [17] evaluated the emotional experiences of users undergoing a cognitive stimulation activity with NAO. They investigated the two dimensions of emotional experience: Valence and arousal.
While the first was assessed using the self-report Positive and Negative Affect Schedule (PANAS) questionnaire, the latter was evaluated by analyzing heart rate variability. Additionally, the authors evaluated offline the non-verbal behaviors (i.e., face-directed gaze, gaze aversion, hand movement, and posture) of the participants to assess their engagement level. The authors of [25] investigated the emotional experience characterizing the interaction with the robot by deriving positive emotions from the number and duration of smiling episodes occurring during the experimentation. The same authors also investigated the usage of facial Action Units (AUs) for detecting happiness, surprise, anger, fear, disgust, sadness, and neutrality in more recent work in the same context [18]. This brief digression on the temporal traits (i.e., emotion and engagement) analyzed in previous works reinforces our idea that they should be included as factors that may influence the quality of the interaction. In this work, we will investigate the older adults’ emotional and engagement responses in terms of valence (i.e., by adopting the PANAS questionnaire) and by analyzing the duration and the frequency of non-verbal behaviors.
As described in [27], emotions can be generated by personal affective dispositions and relatively permanent affective personality traits. This is the reason behind the current trend in HRI of investigating the influence of personality on the quality of the interaction [6]. Indeed, personality traits can be used to explain the way people respond to others in social settings. Among the several models of personality, the Big Five personality traits [28] are the most widely used. They include extraversion, agreeableness, conscientiousness, neuroticism, and openness to experience. As highlighted by the review reported in [6], personality traits influence the user’s behavior during the interaction. In the SAR field, the work described in [15] represents the first attempt to investigate the influence of the user’s personality traits on psychometric assessments. In that case too, the traits of the Big Five model were used, and openness to experience was the trait that influenced the final cognitive score. Similarly, our work aims to investigate the role of the personality factors, focusing on their influence on the quality of the interaction rather than on their effect at the cognitive level. Our underlying hypothesis is that both personality and cognitive mental state may influence the quality of the interaction. They are both aspects that the robot should take into account to modulate the interaction appropriately.
In a similar fashion to the works reported in Table 2, this work will also analyze the usability and the perception of the technology by collecting feedback directly from the interacting participants. A series of self-report measures will be included to evaluate the state of anxiety perceived by the user, as well as to identify the perception of this technology by non-expert users.
Among the recruited participants, two users were excluded from the study, due to a crying episode and an extremely delusional state in T1, respectively. As a result, a total of 9 elderly persons (3 male, 6 female; avg. age 83.33 years, range 72–92 years) were enrolled in this study. Unfortunately, one participant refused to answer the questionnaires administered in T1 and T2 while accepting to interact with the robot; another participant was not recorded by the camera due to a technical issue. Each of these users was therefore excluded from the analyses involving the missing information. The Big Five personality traits of the recruited participants are summarized in Figure 3.
Considering three levels of cognitive impairment (i.e., none, mild, severe), the pool of participants was composed of two users with severe cognitive impairments, two participants with no cognitive impairments, and the remaining participants with mild cognitive impairments.
4.1. Video Recording Analysis
On average, each session lasted 11.21 min (std = 2.02 min). The average percentage of time spent interacting with the robot was 82.9% (std = 9.9%), more than the percentage of time spent interacting with the therapist (avg: 16.6%, std: 9.5%). Indeed, the average number of times the participants interacted with the therapist was 7.5 (std: 4.2). The interactions were mostly characterized by coherent responses of the elderly (avg: 23.9, std: 7.3), and very few robot repetitions were recorded (avg: 5.7, std: 3.5). The analysis of the emotional attitude through non-verbal behaviors highlighted a generally neutral emotional state (avg: 98.3%, std: 1.8%) characterized by a high percentage of inexpressiveness (83.6%). Despite this, the remaining facial expressions were also present to different extents: Smile (1.1%), laugh (0.5%), raised eyebrows (2.3%), and frown (12.5%). The same results were also confirmed by the body posture analysis. Overall, 97.6% of the interaction was conducted in a still position, on average, and in rare cases the people nodded their head in approval (1.4%) or shook their head in disapproval (0.8%). Additionally, the gaze analysis reported that 42.7% of the time the users looked directly at the robot’s head during the interactions. Table 3 shows the detailed results.
The correlation analysis registered a very high positive correlation between the robot interaction time and the GQ2_ANI domain (ρ = 0.93). Similarly, a very high positive correlation was detected between the nodding head behavior and US2_ITU (ρ = 0.92), suggesting that the quantity of time and some non-verbal behaviors influence the perception of the robot and its usability. In the following subsections, we will investigate the relationship of the engagement parameters with the users’ personality, rated affect, and cognitive profile.
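For reference, the rank correlations reported throughout Section 4 can be computed with a standard Spearman test; the sketch below uses `scipy.stats.spearmanr` on hypothetical per-participant values (the numbers are illustrative, not the study’s raw data):

```python
from scipy.stats import spearmanr

# Hypothetical per-participant values (illustrative only):
# total robot interaction time (s) and GQ2 animacy (GQ2_ANI) scores.
robot_interaction_time = [540, 610, 480, 700, 655, 590, 520]
gq2_ani_score = [12, 14, 10, 18, 17, 15, 11]

# Spearman's rho correlates the ranks of the two variables,
# so it captures monotonic (not only linear) relationships.
rho, p_value = spearmanr(robot_interaction_time, gq2_ani_score)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")
```

With samples this small, rank-based coefficients are the appropriate choice, but individual correlations should be read as exploratory rather than confirmatory.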
4.2. Descriptive Statistics
Overall, the P1 results highlight a high PA (avg: 31.33, std: 6.94) and a low NA (avg: 14.16, std: 3.6). Similarly, the answers to P2 are characterized by a high PA (avg: 37, std: 3.6) and a low NA (avg: 13.16, std: 4.02). The results underline a small increase in the PA values and a small decrease in the NA scores after the interaction with the robot (P2). This results in a general increasing trend of the PANAS average score (avg P1 = 45.5, avg P2 = 50.16), which is not statistically significant (p > 0.05).
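The PA and NA values above follow the standard PANAS scoring: 20 adjectives rated 1–5, with the 10 positive and 10 negative ratings summed separately (each subscale ranges 10–50); the overall score reported here appears to be PA + NA (e.g., 31.33 + 14.16 ≈ 45.5). A minimal sketch with hypothetical ratings:

```python
# Hypothetical PANAS ratings (20 items, each scored 1-5); the values are
# illustrative only, not a participant's actual answers.
ratings = {
    "interested": 4, "excited": 3, "strong": 4, "enthusiastic": 4, "proud": 3,
    "alert": 3, "inspired": 3, "determined": 4, "attentive": 4, "active": 3,
    "distressed": 1, "upset": 2, "guilty": 1, "scared": 1, "hostile": 1,
    "irritable": 2, "ashamed": 1, "nervous": 2, "jittery": 1, "afraid": 1,
}

POSITIVE = ["interested", "excited", "strong", "enthusiastic", "proud",
            "alert", "inspired", "determined", "attentive", "active"]
NEGATIVE = ["distressed", "upset", "guilty", "scared", "hostile",
            "irritable", "ashamed", "nervous", "jittery", "afraid"]

pa = sum(ratings[item] for item in POSITIVE)  # Positive Affect, range 10-50
na = sum(ratings[item] for item in NEGATIVE)  # Negative Affect, range 10-50
print(f"PA = {pa}, NA = {na}, total = {pa + na}")
```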
Similarly, the comparison of the GQ1 and GQ2 answers does not show any significant differences, even if there is an evident increasing trend for each item and macrodomain. Namely, the average score of GQ1 is 61.57 points (std: 9.55), while the average score of GQ2 is 71.71 points (std: 12.55). The descriptive statistics of each item are reported in Table 4.
Overall, the participants had a good disposition toward the robot before and after the interaction (US1_ITU, avg: 10, std: 2; US2_ITU, avg: 10.8, std: 4.9). The perceived ease of use of the platform changed from lower (US1_PEU, avg: 3.2, std: 0.45) to higher values of agreement (US2_PEU, avg: 4.4, std: 0.9). A less positive trend was registered for the feeling of enjoyment, which is characterized by middle values (US1_ENJ, avg: 3.2, std: 1.48; US2_ENJ, avg: 3.8, std: 1.6). In general, most of the participants did not feel any anxiety during the interaction with ASTRO (US1_ANX, avg: 5, std: 1.2; US2_ANX, avg: 2.2, std: 0.4) and trusted the robotic service (US1_TRU, avg: 6.8, std: 1.64; US2_TRU, avg: 7.8, std: 2.2). Among all, the Mann–Whitney U test highlights a significant difference in the ANX domain before and after the interaction with the robot (p = 0.02). The descriptive statistics of each category are reported in Table 5.
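The pre/post ANX comparison can be reproduced with `scipy.stats.mannwhitneyu`; the scores below are hypothetical, not the study’s raw data:

```python
from scipy.stats import mannwhitneyu

# Hypothetical per-participant anxiety (ANX) scores (illustrative only);
# higher values mean more self-reported anxiety.
anx_before = [5, 6, 4, 5, 7, 5]   # US1_ANX, before the interaction
anx_after  = [2, 3, 2, 2, 3, 2]   # US2_ANX, after the interaction

# Rank-based comparison of the two samples; no normality assumption.
u_stat, p_value = mannwhitneyu(anx_before, anx_after, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.4f}")
```

Note that for paired before/after measurements on the same participants, the Wilcoxon signed-rank test (`scipy.stats.wilcoxon`) is the paired counterpart of this test.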
4.3. Correlation with Personality
To investigate the role of personality in the perception of the robot, we computed the Spearman correlation with the items of GQ1 and GQ2. The correlation of the BFI aspects with GQ1 returned a negative correlation between GQ1_Surprise and BFI_AGR (), and between the GQ1_Friendly item and BFI_NEU (). Negative correlations were also found between GQ2_Responsible and BFI_NEU () and BFI_OP (). BFI_OP was also negatively correlated with GQ2_Surprise ().
High negative correlations resulted from comparing the BFI personality traits with US1 and US2. In detail, BFI_OP was negatively correlated with ITU_Q3 (“I think my independence could improve with the robot”) of US1 () and with ITU_Q2 (“I would be willing to use the cognitive service if it could help the family/caregiver’s work”) of US2 (). These results show that creative and inventive participants tend to dismiss the usage of the robot as a means of improving their independence.
The personality traits of the older participants also correlated with the behavioral parameters extracted offline. In particular, the total interaction time was highly positively correlated with BFI_EXT () and highly negatively correlated with BFI_NEU (). This suggests that the more sociable and talkative the participant, the longer the experimental session lasted, and vice-versa. Surprisingly, the BFI_NEU trait was also negatively correlated with the frown facial expression ().
4.4. Correlation with Emotional State
The Spearman analysis reported one high correlation between the positive affect and the GQ1_Conscious item () before the interaction with the robot. Similarly, GQ2_Conscious is very highly positively correlated with the negative affect of the participants after the interaction (). This suggests that the attribution of consciousness to a robotic platform can vary with the emotional state of the person interacting with it. Furthermore, GQ2_Sensible is highly positively correlated with the total score of P2 (), suggesting that the more sensible the person, the more he/she attributes the same quality to the robot.
The analysis between the emotional state of the participants and the perceived usability of the robot before the interaction did not report any significant correlation. On the contrary, after the interaction, the emotional state of the participants was very positively correlated with the disposition toward the services (ITU) and the feeling of trust (TRU). Namely, the total score of P2 was highly correlated with US2_ITU_Q3 () and US2_TRU_Q7 (“I would trust in robot’s ability to perform the cognitive assessment”) (). Very high positive correlations were registered between the positive affect and the scores of US2_ITU_Q3 and US2_TRU_Q7. This suggests that people experiencing a “high degree” of emotions (i.e., mostly positive ones) tend to trust the interacting robotic platform and its usage.
This result is also confirmed by the correlations of the emotional state with the behavioral parameters. Specifically, a very high negative correlation coefficient linked the total score of P2 with the head-shaking activity (). It can be interpreted as follows: the more sensible the person, the less frequently he/she expressed disagreement or felt upset during the interaction.
4.5. Correlation with Cognitive State
The MMSE score of each participant was used to investigate its influence on the perception of the robot, its usability, and the engagement during the interaction. Regarding the perception of the robot, the results highlighted a very high negative correlation coefficient between the GQ2_Conscious item and the cognitive state of the participants (). This suggests that participants with impaired cognitive capabilities tend to attribute consciousness to the social robot. This only occurs after the interaction with the robot, since the MMSE score does not significantly correlate with any item of GQ1. Similarly, the cognitive state of the participants influenced neither the perceived usability nor the engagement behaviors during the interaction.
This work presents a feasibility study that aims to investigate how persistent traits (i.e., personality) and temporal traits (i.e., emotion and engagement) of human behavior, as well as the cognitive profile, can influence the perception of the robot and the interaction with it. The strength of this work relies on the fact that we performed a cognitive assessment (i.e., the interaction) with a non-anthropomorphic robot, ASTRO, coding basic behaviors to make it appear as social as possible (i.e., by integrating speech synthesis and lighting feedback). Indeed, the implemented social behaviors had an impact on the anthropomorphism of the platform. Even if its appearance is not humanlike, the robot’s ability in coaching and motivating the participants affected the perceived consciousness according to the older adults’ personality (i.e., the GQ2_Consciousness score increased significantly). Additionally, our study involved a group of heterogeneous participants, both from a clinical and a social point of view, allowing us to infer some preliminary conclusions that could be validated in depth in future studies with a larger group of participants.
Regarding RQ1, the analysis reported in Section 4.2 showed a general increase of the final score for all the questionnaires administered in the T3 phase compared to the scoring obtained in the T1 phase. This highlights that the interaction with the robot has some positive effects on the person and on the perception of the robot, especially regarding the feeling of anxiety, which significantly decreases after the interaction. The latter finding is also confirmed by our previous work [40].
The analysis of the role of personality suggests that, depending on the domain/task, different traits could have an influence. Among the 5 traits assessed by the BFI, mainly Openness to experience, Agreeableness, and Neuroticism were correlated with the perception of the robot, while Openness to experience was the only trait related to the usability of the technology (i.e., the predisposition to use it). Similarly, the Extraversion and Neuroticism personality traits mostly influence the modality of the interaction (i.e., total interaction time) with the robot. These results answer RQ3 (i.e., regarding usability and perception) and partially RQ5. Specifically, the imaginative and spontaneous traits of the elderly influence the perception of the robot, while the sociality and confidence characterizing the personality have an impact on the interaction itself. In the HRI field, the role of individual traits of the Big Five model is usually investigated. As an example, Openness to experience and Agreeableness are taken into account to evaluate the trust in the robotic platform [41], while Extraversion and Neuroticism are mostly related to the quality of the interaction [8]. While the findings described in [15] relate just Openness to experience with the attitude toward the technology, this study confirms that every facet of the personality traits should be taken into account by the robot to infer different aspects of the interaction.
By analyzing the effect of the cognitive state (i.e., the MMSE score) on the interaction performance, we did not find any correlation. This result suggests that the interaction of the participants was not related to their cognitive profile (RQ5). Indeed, the behaviors of the users during the interaction were independent of the cognitive profile, while being highly related to the personality. On the other hand, the cognitive profile of the participants had some effect on the perception of the robotic system (RQ2). As reported in Section 4.5, the highest level of consciousness was attributed to the robotic system by the most impaired participants. This result is aligned with the current trend of adopting toy robots in the later stages of dementia [45], as a non-pharmaceutical treatment to stimulate sociality and peace [47].
Regarding the role of the emotions felt during the experience, the data recorded at the beginning of the interaction highlight a positive correlation between the capability of feeling strong emotions and the attribution of consciousness to the robotic platform. Additionally, people characterized by positive emotions tended to trust the robot’s abilities more after the interaction. An interesting aspect emerged from our results, namely: People who enjoyed the interaction more also trusted the robot more, and vice-versa (RQ4). These results are in line with the assessment of the robot’s sensitivity after the interaction, too. This suggests that people with a high PANAS score tend to empathize more with the robot, attributing humanlike capabilities to it. Similarly, the presence of non-verbal behaviors (i.e., head shaking and nodding) highlighted the tendency of the participants to adopt the same communication they would adopt with another human being. They sympathized with the robot and perceived it as humanlike, as if it could understand non-verbal behaviors (RQ5). Similar results were also described in [26], where older people adopted humanlike social conventions with the Pepper robot (i.e., touching the hands and the arms of the robot). Our result suggests that even a non-humanlike robot can incentivize social behaviors (see Table 3).
Unfortunately, the perceived consciousness of the robotic platform was also positively related to negative affect (NA) after the interaction. This suggests that people who perceived the robot as more conscious reported being more nervous at the end of the interaction. It may be related to the type of interaction they had with the robot, which may create disenchantment with respect to the previous impression, or annoyance. Similarly, the results reported that imaginative and curious participants were less surprised by the interaction with the robot. A similar trend is also highlighted by the behavioral parameters manually extracted from the videos. On average, the interactions were characterized by neutral emotions, few facial expressions, and a low quantity of motion. This may be related to the nature of the scenario, since carrying out a cognitive test requires some mental effort and may be demanding. As previous works suggest [48], motivation is a key factor in cognitive assessment. The elderly must be motivated while performing it, and those who administer the test must be ready to cheer and motivate the elderly at the correct time. In our study, we included some sentences of encouragement, but they may not be enough to keep the user emotionally enthusiastic.
The main limitation of the presented work relies on the low autonomy of the robot’s behaviors during the cognitive administration. On the one hand, the personal factors were manually annotated by the psychologist or directly extracted from the answers to the questionnaires. We followed the procedure that is commonly adopted in the cognitive assessment scenario (see Table 2), including all the necessary tools for assessing the factors of interest. Given the advances in the affective computing [49] and automatic personality detection [50] fields, we would like to adopt these solutions to improve the perception capabilities of the robotic platform. This will allow the robot to automatically infer the individual’s traits. On the other hand, the cognitive administration was implemented by employing a Wizard-of-Oz control. Previous works on cognitive assessment and stimulation adopted commercially available cloud solutions for administering and scoring cognitive tests automatically [15]. The cloud approach may be a possible solution for improving the robotic capabilities, but it leaves open the main problem related to the clinical validity of the results. Professional caregivers gain clinical instinct through experience, which guides them to a correct assessment of the current pathology. This is why every study on socially assistive robotics usually comprises two stages: One carried out by the robot and one carried out by a human expert with the aim of validating the previous step (see Table 1). A similar approach has also been used in this work, underlining the possibility of integrating the robotic system as a support tool for the professional caregiver. To obtain a reliable framework, robots should gain some of the expertise that characterizes the professional caregivers, who will always be present in the loop, but who could teach the robot some repetitive tasks to enlarge the screening phase.