Beyond Words: Integrating Personality Traits and Context-Driven Gestures in Human-Robot Interactions

Abstract

As robots become increasingly integrated into human life, personalizing human-robot interaction (HRI) is crucial for improving user acceptance, engagement, and interaction quality. However, personalizing HRI poses a unique challenge due to the diversity of human personality traits. This paper proposes a method that leverages large language models (LLMs) to dynamically tailor robot conversations to the Big Five (OCEAN) personality traits. The novelty of our approach lies in using a user's personality traits to shape the robot's verbal responses and in generating context-driven gestures to accompany them. The study addresses two primary research questions: (1) Does adapting a robot's verbal responses to user personality traits improve communication satisfaction? (2) How does the addition of context-appropriate gestures further enhance user satisfaction? We used Goldberg's (1992) personality trait measurement scale to assess 26 participants, who then conversed with an LLM-powered Pepper robot on various topics. Interaction quality was self-reported using a revised version of Hecht's (1978) conversation satisfaction scale. The study comprised three experimental conditions: (i) Baseline: standard LLM conversation; (ii) Personality-congruent: dialogue adjusted by the LLM to each participant's personality profile; and (iii) Enhanced interaction: personality adaptation plus dynamic gestures. For the third condition, we combined contextually appropriate pre-defined animations with novel gestures generated by computing joint angle values in real time. Statistical analysis using ANOVA revealed significant differences in communication satisfaction across the three conditions (F = 13.41, p < .001). Post-hoc analyses using Šidák's multiple comparison test showed significant pairwise differences: Condition 2 vs. 1: Δmean = 4.42, p = 0.02; Condition 3 vs. 1: Δmean = 8.23, p < 0.01; Condition 3 vs. 2: Δmean = 3.80, p = 0.05. These results demonstrate that both personality-congruent responses and non-verbal gestures significantly enhance communication satisfaction, with the combined approach yielding the highest satisfaction. This work opens new possibilities for developing socially intelligent robots with applications in healthcare, education, and customer service.
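The abstract does not spell out how the personality-congruent condition is implemented, but one common way to realize it is to condition the LLM's system prompt on the participant's OCEAN scores. The sketch below is purely illustrative: the trait thresholds, style hints, and the `build_system_prompt` helper are our own assumptions, not the authors' code.

```python
# Hypothetical sketch: conditioning an LLM system prompt on Big Five (OCEAN) scores.
# Trait thresholds and style hints are illustrative, not the method reported in the paper.

OCEAN_STYLE_HINTS = {
    "openness":          ("explore novel ideas and use vivid imagery",
                          "stick to familiar topics and concrete wording"),
    "conscientiousness": ("give structured, step-by-step answers",
                          "keep answers loose and informal"),
    "extraversion":      ("be upbeat, talkative, and ask follow-up questions",
                          "be calm, concise, and less probing"),
    "agreeableness":     ("be warm, supportive, and validating",
                          "be direct and matter-of-fact"),
    "neuroticism":       ("be reassuring and avoid alarming phrasing",
                          "speak plainly without extra reassurance"),
}

def build_system_prompt(scores: dict, threshold: float = 0.5) -> str:
    """Map normalized OCEAN scores (0..1) to conversational style instructions."""
    hints = [high if scores.get(trait, 0.5) >= threshold else low
             for trait, (high, low) in OCEAN_STYLE_HINTS.items()]
    return ("You are a friendly Pepper robot holding a spoken conversation. "
            "Adapt your style to the user: " + "; ".join(hints) + ". "
            "Keep replies short enough to be spoken aloud.")

if __name__ == "__main__":
    participant = {"openness": 0.8, "conscientiousness": 0.4,
                   "extraversion": 0.7, "agreeableness": 0.9, "neuroticism": 0.2}
    print(build_system_prompt(participant))
```

The resulting string would be passed as the system message of whatever chat-completion API drives the dialogue; the baseline condition would simply omit the trait-specific hints.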

Publication
24th International Conference on Autonomous Agents and Multiagent Systems (AAMAS '25)

Generated Gestures for the Pepper Robot

Figure. The Pepper robot performing generated gestures.
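As a rough illustration of the real-time joint-angle approach mentioned in the abstract, the sketch below drives a small conversational "beat" gesture on Pepper by sending computed joint angles through the NAOqi ALMotion service. The robot address, angle values, timing, and the `perform_beat_gesture` helper are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: a simple Pepper arm gesture produced by setting joint angles
# via the NAOqi ALMotion service. Joint names are standard Pepper joints; the angle
# values and helper name are illustrative assumptions.
import time

import qi  # NAOqi Python SDK


def perform_beat_gesture(motion, amplitude_rad: float = 0.3):
    """Raise and lower the right arm as a small conversational 'beat' gesture."""
    names = ["RShoulderPitch", "RElbowRoll"]
    neutral = [1.4, 0.3]                          # approximate relaxed-arm pose (radians)
    raised = [1.4 - amplitude_rad, 0.8]           # lift the upper arm, bend the elbow
    fraction_max_speed = 0.2

    motion.setAngles(names, raised, fraction_max_speed)
    time.sleep(0.6)
    motion.setAngles(names, neutral, fraction_max_speed)
    time.sleep(0.6)


if __name__ == "__main__":
    session = qi.Session()
    session.connect("tcp://<pepper-ip>:9559")     # placeholder robot address
    motion = session.service("ALMotion")
    motion.wakeUp()                               # enable stiffness before moving joints
    perform_beat_gesture(motion)
```

In practice, angle targets like these could be computed on the fly from the dialogue context (for example, scaled by the emphasis of the spoken utterance) rather than hard-coded, which is what distinguishes generated gestures from the pre-defined animation library.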