TY - GEN
T1 - A first evaluation study of a Database of Kinetic Facial Expressions (DaFEx)
AU - Battocchi, Alberto
AU - Pianesi, Fabio
AU - Goren-Bar, Dina
PY - 2005/12/1
Y1 - 2005/12/1
N2 - In this paper we present DaFEx (Database of Facial Expressions), a database created to provide a benchmark for evaluating the facial expressivity of Embodied Conversational Agents (ECAs). DaFEx consists of 1008 short videos containing emotional facial expressions of the 6 Ekman emotions plus the neutral expression. The facial expressions were recorded by 8 professional actors (male and female) in two acting conditions ("utterance" and "no-utterance") and at 3 intensity levels (high, medium, low). The properties of DaFEx were studied by having 80 subjects classify the emotion expressed in the videos. High rates of accuracy were obtained for most of the emotions displayed. We also tested the effects of intensity level, of the articulatory movements due to speech, and of the actors' and subjects' gender on classification accuracy. The results showed that decoding accuracy decreases with the intensity of emotions; that the presence of articulatory movements negatively affects the recognition of fear, surprise, and the neutral expression, while it improves the recognition of anger; and that facial expressions seem to be recognized (slightly) better when acted by actresses than by actors.
KW - Databases
KW - Emotion recognition
KW - Expressiveness
KW - Quality of facial displays
KW - User study
UR - http://www.scopus.com/inward/record.url?scp=32344435702&partnerID=8YFLogxK
U2 - 10.1145/1088463.1088501
DO - 10.1145/1088463.1088501
M3 - Conference contribution
AN - SCOPUS:32344435702
SN - 1595930280
T3 - Proceedings of the Seventh International Conference on Multimodal Interfaces, ICMI'05
SP - 214
EP - 221
BT - Proceedings of the Seventh International Conference on Multimodal Interfaces, ICMI'05
T2 - Seventh International Conference on Multimodal Interfaces, ICMI'05
Y2 - 4 October 2005 through 6 October 2005
ER -