Latent Facial Topics for affect analysis

Lade, P and Balasubramanian, Vineeth N and Panchanathan, Sethuraman (2013) Latent Facial Topics for affect analysis. In: IEEE International Conference on Multimedia and Expo Workshops (ICMEW), 15-19 July 2013, San Jose, CA, USA.

Full text not available from this repository.


Recent years have seen a growing need in the affective computing community to understand an emotion space beyond the seven basic expressions, leading to explorations of an emotion continuum spanned by dimensions such as valence and arousal. While there has been substantial research on identifying facial Action Units as building blocks for the basic expressions, there is a new need to discover fine-grained facial descriptors that can explain the variations across this continuum of emotions. We propose a methodology to extract Latent Facial Topics (LFTs) from facial videos by adapting the Latent Dirichlet Allocation (LDA) and supervised Latent Dirichlet Allocation (sLDA) topic models for facial affect analysis. In this work, we study the application of topic models to both discrete emotion recognition and continuous emotion prediction tasks. We show that meaningful and visualizable LFTs can be extracted and used successfully for emotion recognition. We report our recognition results on the widely used Extended Cohn-Kanade (CK+) and AVEC 2012 FCSC challenge data sets, which show promise for both discrete and continuous emotion recognition problems.
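The abstract describes running a topic model over facial videos to obtain per-video topic proportions (the LFTs). A minimal sketch of that idea, under the assumption that each frame or video is first summarized as a bag-of-visual-words histogram of quantized facial descriptors (the paper's actual features and preprocessing are not detailed on this page; the synthetic counts below are purely illustrative):

```python
# Illustrative sketch, NOT the authors' implementation: fit LDA over
# bag-of-visual-words count histograms, one histogram per facial video,
# and read off per-video topic proportions as "Latent Facial Topics".
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(0)
n_videos, vocab_size, n_topics = 200, 50, 5

# Stand-in for histograms of quantized facial descriptors per video
counts = rng.poisson(lam=2.0, size=(n_videos, vocab_size))

lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
theta = lda.fit_transform(counts)  # per-video topic proportions

print(theta.shape)                          # (200, 5)
print(np.allclose(theta.sum(axis=1), 1.0))  # rows are distributions: True
```

The resulting topic proportions `theta` could then serve as low-dimensional features for a downstream classifier (discrete emotion recognition) or regressor (continuous valence/arousal prediction); the supervised variant (sLDA) instead folds the label into the topic model itself.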

IITH Creators: Balasubramanian, Vineeth N (ORCiD: UNSPECIFIED)
Item Type: Conference or Workshop Item (Paper)
Uncontrolled Keywords: Topic models, Facial Descriptors, Emotion Recognition
Subjects: Computer science
Divisions: Department of Computer Science & Engineering
Depositing User: Library Staff
Date Deposited: 06 Sep 2019 07:11
Last Modified: 06 Sep 2019 07:14
