Automatic analysis of facial actions: a survey

Martinez, Brais; Valstar, Michel F.; Jiang, Bihan; Pantic, Maja

Authors

Brais Martinez

Michel F. Valstar michel.valstar@nottingham.ac.uk

Bihan Jiang

Maja Pantic



Abstract

As one of the most comprehensive and objective ways to describe facial expressions, the Facial Action Coding System (FACS) has recently received significant attention. Over the past 30 years, extensive research has been conducted by psychologists and neuroscientists on various aspects of facial expression analysis using FACS. Automating FACS coding would make this research faster and more widely applicable, opening up new avenues for understanding how we communicate through facial expressions. Such an automated process can also potentially increase the reliability, precision and temporal resolution of coding. This paper provides a comprehensive survey of research into machine analysis of facial actions. We systematically review all components of such systems: pre-processing, feature extraction and machine coding of facial actions. In addition, the existing FACS-coded facial expression databases are summarised. Finally, challenges that must be addressed to make automatic facial action analysis applicable in real-life situations are extensively discussed. There are two underlying motivations for writing this survey: the first is to provide an up-to-date review of the existing literature, and the second is to offer some insight into the future of machine recognition of facial actions, namely the challenges and opportunities that researchers in the field face.
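For orientation, below is a minimal sketch of the generic pipeline structure the abstract refers to (pre-processing, feature extraction, per-Action-Unit classification). It is not code from the paper: the choice of OpenCV face detection, HOG appearance features and linear SVM classifiers is an illustrative assumption standing in for the many alternatives the survey reviews.

```python
# Illustrative sketch of a generic AU-analysis pipeline (not from the paper):
# pre-processing (face detection + registration), appearance feature
# extraction (HOG as one common choice), and one binary detector per AU.
import cv2
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC


def preprocess(image_bgr):
    """Detect the largest face and return a registered 128x128 grey crop."""
    grey = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # keep largest face
    return cv2.resize(grey[y:y + h, x:x + w], (128, 128))


def extract_features(face_crop):
    """Appearance features: HOG descriptor over the registered face crop."""
    return hog(face_crop, orientations=8, pixels_per_cell=(16, 16),
               cells_per_block=(2, 2))


def train_au_detectors(face_images, au_labels):
    """Train one binary classifier per AU, e.g. au_labels = {"AU12": [0, 1, ...]}.

    Assumes every training image contains a detectable face.
    """
    X = np.array([extract_features(preprocess(img)) for img in face_images])
    detectors = {}
    for au, labels in au_labels.items():
        clf = LinearSVC()
        clf.fit(X, labels)          # binary: AU active vs. inactive
        detectors[au] = clf
    return detectors
```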

Journal Article Type Article
Journal IEEE Transactions on Affective Computing
Print ISSN 1949-3045
Electronic ISSN 1949-3045
Publisher Institute of Electrical and Electronics Engineers
Peer Reviewed Peer Reviewed
APA6 Citation Martinez, B., Valstar, M. F., Jiang, B., & Pantic, M. (in press). Automatic analysis of facial actions: a survey. IEEE Transactions on Affective Computing, doi:10.1109/TAFFC.2017.2731763. ISSN 1949-3045
DOI https://doi.org/10.1109/TAFFC.2017.2731763
Keywords Action Unit analysis, facial expression recognition, survey
Copyright Statement Copyright information regarding this work can be found at the following address: http://eprints.nottingham.ac.uk/end_user_agreement.pdf
Additional Information © 2017 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works for resale or redistribution to servers or lists, or reuse of any copyrighted components of this work in other works.

Files

taffc-valstar-2731763-proof (2).pdf (1.5 MB)
PDF
