GREG: Emotion-Based Transfer Learning to Generate Robotic Non-verbal Social Gestures

  • Idan Vidra
  • Aviv Yehezkel
  • Udi Lumnitz
  • Hadas Erel
  • Oren Zuckerman
  • Ariel Shamir

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

This paper presents a Recurrent Neural Network (RNN)-based method for generating a large variety of social gestures for non-humanoid robotic devices from a tiny sample of animator-generated gestures. Currently, the creation of robotic gestures relies on human animators, resulting in a limited number of manually designed, low-diversity gestures that are time-consuming and costly to produce. Generating a diverse range of gestures for non-humanoid robots is essential, as robots will be integrated into everyday life in the near future. Training an RNN model for robot gesture generation introduces challenges, such as the lack of relevant datasets for non-humanoid gestures and the absence of a direct mapping between human and robot movements. In this work, we propose a novel approach that leverages RNNs and Transfer Learning to automatically generate movement coordinates for non-verbal robotic gestures. Starting from an RNN model pre-trained for textual sentiment detection, we use Transfer Learning to learn the abstract “language” of robot non-verbal gestures and adopt a text-generation approach, creating movement embeddings with the word2vec methodology. The model is trained separately on two emotion categories, “happy” and “sad,” each on its respective animations, and predicts the robot’s next movement from the previous ones. We conducted an on-screen user study with 66 participants recruited through the Amazon Mechanical Turk service. Participants evaluated videos of the generated gesture animations, comparing them to random-next-frame animations as a baseline on the one hand, and to the animator-created animation dataset on the other. The results indicate that our proposed model successfully generates diverse and expressive robot gestures.
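The abstract's core mechanism, predicting the robot's next movement from the sequence of previous ones and unrolling that prediction autoregressively, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the coordinate dimension, hidden size, and random weights are assumptions standing in for the authors' architecture and for weights transferred from a pre-trained sentiment model.

```python
import numpy as np

# Illustrative sketch of next-movement prediction (assumed dimensions/weights,
# not the paper's actual model). Each "movement" is a vector of motor
# coordinates; a vanilla recurrent cell summarizes the frames seen so far
# and emits the next coordinate vector.

rng = np.random.default_rng(0)
COORD_DIM, HIDDEN = 3, 8  # e.g. 3 motor coordinates per frame (assumption)

Wxh = rng.normal(scale=0.1, size=(HIDDEN, COORD_DIM))  # input -> hidden
Whh = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))     # hidden -> hidden
Why = rng.normal(scale=0.1, size=(COORD_DIM, HIDDEN))  # hidden -> output

def predict_next(frames):
    """Run the RNN over past frames and predict the next coordinate vector."""
    h = np.zeros(HIDDEN)
    for x in frames:
        h = np.tanh(Wxh @ x + Whh @ h)
    return Why @ h

def generate_gesture(seed_frames, length):
    """Autoregressively extend a seed gesture, one predicted frame at a time."""
    frames = list(seed_frames)
    for _ in range(length):
        frames.append(predict_next(frames))
    return np.array(frames)

seed = [np.array([0.0, 0.1, 0.0]), np.array([0.1, 0.2, 0.1])]
gesture = generate_gesture(seed, length=5)
print(gesture.shape)  # (7, 3): 2 seed frames plus 5 generated frames
```

In the paper's setup, one such model would be fine-tuned per emotion category (“happy”, “sad”), so the same seed yields gestures with different expressive character depending on which weights are loaded.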

Original language: English
Title of host publication: HCI International 2025 Posters - 27th International Conference on Human-Computer Interaction, HCII 2025, Proceedings
Editors: Constantine Stephanidis, Margherita Antona, Stavroula Ntoa, Gavriel Salvendy
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 199-209
Number of pages: 11
ISBN (Print): 9783031941672
DOIs
State: Published - 1 Jan 2025
Externally published: Yes
Event: 27th International Conference on Human-Computer Interaction, HCII 2025 - Gothenburg, Sweden
Duration: 22 Jun 2025 – 27 Jun 2025

Publication series

Name: Communications in Computer and Information Science
Volume: 2528 CCIS
ISSN (Print): 1865-0929
ISSN (Electronic): 1865-0937

Conference

Conference: 27th International Conference on Human-Computer Interaction, HCII 2025
Country/Territory: Sweden
City: Gothenburg
Period: 22/06/25 – 27/06/25

Keywords

  • Human-Robot Interaction
  • Neural networks
  • Robotic social gestures
  • Transfer learning

ASJC Scopus subject areas

  • General Computer Science
  • General Mathematics
