Classifying emotions via analysis of facial physiological response without relying on expressions

Yitzhak Yitzhaky, Shaul Shvimmer, Shlomi Talala, Rotem Simhon, Michael Gilad

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Assessing a person’s emotional state may be relevant to security in situations where it is beneficial to assess someone’s intentions or mental state. In various situations, facial expressions, which often indicate emotions, may not be communicated or may not correspond to the actual emotional state. Here we review our study, in which we classify emotional states from very short facial video signals. The emotion classification process relies on neither stereotypical facial expressions nor contact-based methods. Our raw data are short facial videos obtained at several different known emotional states. A facial video includes a component of light diffusely reflected from the facial skin, affected by cardiovascular activity that may in turn be influenced by the emotional state. From the short facial videos, we extracted unique spatiotemporal physiologically affected features, which served as input to a deep-learning model. Results show an average emotion classification accuracy of 47.36%, compared with a 20% chance level for five emotion classes, which can be considered high for cases where expressions are barely observable.
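The camera-based PPG component described in the abstract is commonly recovered by spatially averaging skin pixels in each frame, so that the tiny frame-to-frame intensity fluctuations trace blood-volume changes. The following is a minimal illustrative sketch of that idea only; the function name, ROI format, and synthetic clip are assumptions for demonstration and are not the authors' actual spatiotemporal feature-extraction pipeline:

```python
import numpy as np

def pulsatile_signal(frames, roi):
    """Mean green-channel intensity over a facial skin ROI, per frame.

    Small fluctuations of this signal follow blood-volume changes in the
    skin -- the diffused-light (camera-based PPG) component the abstract
    refers to. `frames` is an iterable of HxWx3 RGB arrays; `roi` is a
    (top, bottom, left, right) bounding box (hypothetical convention).
    """
    t, b, l, r = roi
    return np.array([f[t:b, l:r, 1].mean() for f in frames])

# Synthetic 5-second "clip" at 30 fps with a 1.2 Hz (72 bpm) pulse
# modulating the green channel inside the ROI.
fps, n = 30, 150
ts = np.arange(n) / fps
frames = np.full((n, 64, 64, 3), 120.0)
frames[:, 16:48, 16:48, 1] += 2.0 * np.sin(2 * np.pi * 1.2 * ts)[:, None, None]

signal = pulsatile_signal(frames, roi=(16, 48, 16, 48))

# The dominant frequency of the extracted signal recovers the pulse rate.
freqs = np.fft.rfftfreq(n, 1.0 / fps)
peak_hz = freqs[np.argmax(np.abs(np.fft.rfft(signal - signal.mean())))]
# peak_hz == 1.2 Hz, i.e. the simulated 72 bpm pulse
```

In practice the signal is extracted from a detected face region rather than a fixed box, and the paper builds spatiotemporal features from such physiological signals rather than using the raw trace directly.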

Original language: English
Title of host publication: Artificial Intelligence for Security and Defence Applications II
Editors: Henri Bouma, Radhakrishna Prabhu, Yitzhak Yitzhaky, Hugo J. Kuijf
Publisher: SPIE
ISBN (Electronic): 9781510681200
DOIs
State: Published - 1 Jan 2024
Event: Artificial Intelligence for Security and Defence Applications II 2024 - Edinburgh, United Kingdom
Duration: 17 Sep 2024 – 19 Sep 2024

Publication series

Name: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 13206
ISSN (Print): 0277-786X
ISSN (Electronic): 1996-756X

Conference

Conference: Artificial Intelligence for Security and Defence Applications II 2024
Country/Territory: United Kingdom
City: Edinburgh
Period: 17/09/24 – 19/09/24

Keywords

  • Emotion classification
  • camera-based PPG
  • deep learning
  • pulsatile image
  • remote emotion recognition

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Condensed Matter Physics
  • Computer Science Applications
  • Applied Mathematics
  • Electrical and Electronic Engineering
