Current Projects

"Facial Expression Analysis by Computer Processing."
National Institute of Mental Health
5/1/06 to 4/30/11
Description: Facial expression provides cues about emotional response, regulates interpersonal behavior, and communicates aspects of psychopathology. Human-observer based methods for measuring facial expression are labor intensive, qualitative, and difficult to standardize. Supported by the NIMH (NIMH #1R01MH51435), our interdisciplinary team of computer and behavioral scientists has developed the CMU/Pitt Automated Facial Image Analysis (AFA) system that is capable of automatically recognizing facial action units and analyzing their timing in facial behavior. The quantitative measurement achieved by AFA represents a major advance over manual and subjective measurement without requiring the use of invasive sensors.
We envision using AFA's reliable, valid, and efficient measurement of emotion expression and related nonverbal behavior to assess symptom severity in depression. AFA is capable of extracting both the type and timing of nonverbal indicators of depression. We hypothesize that quantitative measures of the configuration and timing of facial expression, head motion, and gaze obtainable by AFA will improve clinical assessment of symptom severity and evaluation of treatment outcomes when combined with information from interviews and self-report questionnaires.
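The Python sketch below illustrates the kind of timing measurement involved; it is not AFA's code. It computes onset, apex, and offset times for a single action-unit event from a hypothetical per-frame intensity track, and the array, threshold, and frame rate are placeholder assumptions for demonstration:

import numpy as np

def au_timing(intensity, fps=30.0, threshold=0.2):
    """Estimate onset, apex, and offset times (in seconds) of one action-unit event
    from a per-frame intensity track. Illustrative only; the threshold is arbitrary."""
    active = intensity > threshold                      # frames where the AU is treated as present
    if not active.any():
        return None
    onset = int(np.argmax(active))                      # first frame above threshold
    after = ~active[onset:]
    offset = onset + int(np.argmax(after)) if after.any() else len(intensity) - 1
    apex = onset + int(np.argmax(intensity[onset:offset + 1]))   # frame of peak intensity
    return {"onset_s": onset / fps, "apex_s": apex / fps,
            "offset_s": offset / fps, "duration_s": (offset - onset) / fps}

# Hypothetical intensity track for AU 12 (lip-corner puller): rises and falls over 3 s at 30 fps.
t = np.linspace(0.0, 3.0, 90)
au12_intensity = np.clip(np.sin(np.pi * (t - 0.5) / 2.0), 0.0, None)
print(au_timing(au12_intensity))

In practice such timing profiles would be derived from tracked facial features rather than a synthetic curve.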

"Properties of Pain Expression."
Canadian Institutes of Health Research.
10/1/05 to 9/30/08

"Collaborative Research DHB: Coordinated Motion and Facial Expression in Dyadic Conversation."
The National Science Foundation
9/1/05 to 8/31/08
Description: We consider the interplay between symmetry formation and symmetry breaking in facial expression and head movement to be integral to the process of communication. The spatiotemporal structure of the formation and breaking of symmetry is likely to be diagnostic of a variety of social and cognitive aspects of the dyadic relationship.
We hypothesize the following: (1) Perception and production of interpersonal coordination of head movements, smiles, eyebrow movements, and vocal prosody involve a process of symmetry formation and symmetry breaking between participants in dyadic conversation. (2) Coordination between individuals in conversation has contributions from a social perception-action loop [3] and from top-down cognitive control. (3) Both of these contributing elements are influenced by visual and auditory sources. (4) Vocal prosody, i.e., the cadence, amplitude contour, and frequency components of vocalizations, is a significant contributor to the social perception-action loop. (5) Motor activity, i.e., head motions, smiles, and eyebrow movements, is a significant contributor to the social perception-action loop. We propose a model in which low-level contributions from audition, vision, and proprioception are combined in a mirror system [2] that helps segment the continuous stream of auditory and visual input as one source of information available to grammatical and semantic recognition.
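One way to make symmetry formation and breaking concrete (a sketch under assumed data, not the project's actual pipeline) is to compute a windowed cross-correlation between the two partners' movement series: a high peak correlation within a window indexes coordination, and a drop in that peak marks symmetry breaking. The head-velocity arrays below are simulated stand-ins:

import numpy as np

def windowed_xcorr(x, y, win=120, step=30, max_lag=15):
    """Peak Pearson correlation and its lag between x and y in successive windows.
    Units are frames; window, step, and lag values are arbitrary illustrative choices."""
    results = []
    for start in range(max_lag, len(x) - win - max_lag, step):
        xs = x[start:start + win]
        best_r, best_lag = -2.0, 0
        for lag in range(-max_lag, max_lag + 1):
            ys = y[start + lag:start + lag + win]
            r = np.corrcoef(xs, ys)[0, 1]
            if r > best_r:
                best_r, best_lag = r, lag
        results.append((start, best_r, best_lag))   # high r = symmetry; drops mark symmetry breaking
    return results

# Simulated head-velocity tracks for two conversational partners at 30 fps.
rng = np.random.default_rng(0)
a = rng.standard_normal(900)
b = np.roll(a, 5) + 0.5 * rng.standard_normal(900)   # partner B loosely follows A with a 5-frame lag
for start, r, lag in windowed_xcorr(a, b)[:5]:
    print(f"frames {start}-{start + 119}: r={r:.2f} at lag {lag}")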

"Automated Measurement of Facial Expression in Autism: Deficits in Facial Nerve Function."
Autism Speaks
10/1/07 to 9/30/10

"Reinforcing Effects of Alcohol during Group Formation."
National Institute on Alcohol Abuse and Alcoholism
Description: Social factors play a major role in alcohol use and dependence, but researchers rarely study these effects in group settings. This project will systematically measure the effects of alcohol on positive and negative affect and social bonding during initial group formation, and determine whether persons with personality traits related to risk for alcoholism are more sensitive to these effects.

"Computer Assisted System to Increase Speed and Reliability of Manual FACS Coding."
RealleaR, LLC. & Naval Research Laboratory
9/1/07 to 8/31/09

"Mobile Examination of Physiological Cues for Automatic Behavior Recognition."
General Electric Corporation & Department of Homeland Security
1/1/08 to 12/31/09

Recently Completed Projects (last five years)

"Psychophysiology of Risk for Depression."
National Institute of Mental Health
8/1/07 to 6/30/07
Description: Recent work in the area of behavioral inhibition and behavioral genetics suggests that psychophysiologic and genetic factors contribute to children's risk for depression. As part of a program project headed by Dr. Maria Kovacs at Western Psychiatric Institute and Clinic, the Affect Analysis Group, together with Dr. Nathan Fox, conducts a longitudinal study of the psychophysiology of risk for depression. A central hypothesis is that a key mechanism in the etiology and course of affective disorders is a primary deficit in the psychophysiology of emotion regulation. Participants are adults who have a history of childhood-onset depression, their siblings, comparison participants who have no history of affective disorder, and their children. Early findings suggest that the course of affective disorder into adulthood maps onto distinct profiles of baseline EEG asymmetry and affective startle modulation. A major goal is to identify deficits in emotion regulation that predict the development of psychopathology in children. A unique feature of the program project is the integration of psychophysiology, socialization, family history, and molecular genetic risk factors for the development of affective disorder.
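For readers unfamiliar with the baseline EEG asymmetry measure mentioned above, a commonly used index is the difference in log alpha-band power between homologous right and left frontal electrodes. The following is a generic Python sketch of that computation, not the program project's analysis code; the synthetic signals and channel names are placeholders:

import numpy as np
from scipy.signal import welch

def alpha_asymmetry(left, right, fs=256.0, band=(8.0, 13.0)):
    """Frontal alpha asymmetry: ln(right alpha power) - ln(left alpha power).
    Because higher alpha power is read as lower cortical activity, a positive score
    is conventionally interpreted as relatively greater left-frontal activity."""
    def band_power(sig):
        freqs, psd = welch(sig, fs=fs, nperseg=int(fs * 2))   # 2-second Welch segments
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return np.trapz(psd[mask], freqs[mask])
    return np.log(band_power(right)) - np.log(band_power(left))

# Synthetic 60-second resting recordings standing in for F3 (left) and F4 (right).
fs = 256.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
f3 = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)   # stronger 10 Hz alpha on the left
f4 = 1.0 * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)
print(f"alpha asymmetry (F4 - F3): {alpha_asymmetry(f3, f4):.2f}")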

"Multimodal Analysis of Face and Body Gesture Indicators of Communicative Intent."
Naval Research Laboratory
5/1/05 to 4/30/06
Description: Cameras have recently been placed at INS passport control booths at major airports to capture images of foreign visitors. The primary goal of these cameras is to spot terrorists on international watch-lists. While these precautions may help identify known terrorists, they provide no help in spotting those not yet known; we must also be able to detect future terrorists. One possible way to perform this task is to develop deception detection systems that help INS agents and other security personnel spot individuals who are lying about their intentions.
Interviews are a primary tool in immigration control, suspect interrogation, and intelligence gathering. The ability to detect deception and suspicious behavior is critical. While many interviewers are excellent at their job, large individual differences in interviewer performance are common. The work we propose is designed to improve interviewer performance by serving as an interactive aid. The interactive algorithms would augment existing interview capabilities, not replace them.
We will develop, evaluate, and test algorithms to extract facial-expression and body-gesture indicators of intention and deception for a remote sensor system using video sources. We also will work on integration tasks, including automatic and semi-automatic measurement of facial and body indicators of intention and deception, and partial integration of face and body features from video for the above purposes.
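As a toy illustration of feature-level integration of face and body cues (not the algorithms proposed here), the sketch below concatenates hypothetical per-clip face and body feature vectors and cross-validates a simple classifier; the feature sets, labels, and data are invented for demonstration, and scikit-learn is assumed to be available:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_clips = 200
face_feats = rng.standard_normal((n_clips, 12))   # e.g., AU intensity, gaze, head-pose statistics (hypothetical)
body_feats = rng.standard_normal((n_clips, 8))    # e.g., gesture rate, posture-shift counts (hypothetical)
labels = rng.integers(0, 2, n_clips)              # stand-in labels; real labels would come from ground truth

# Feature-level fusion: concatenate the two modalities before classification.
fused = np.hstack([face_feats, body_feats])
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, fused, labels, cv=5)
print(f"cross-validated accuracy on random data: {scores.mean():.2f}")   # ~0.5, since the features are noise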

"Facial Expression Analysis by Computer Processing."
National Institute of Mental Health
5/1/01 to 4/30/06

"Space-Time Face and Body Biometric for Human Identification from Video."
Defense Advanced Research Projects Agency
8/1/00 to 7/31/04
Description: In addition to the research examining automated analysis of facial expression, we are also investigating the potential of facial expression to serve as a biometric. Traditionally, biometrics have been thought of as static features of an individual (e.g., fingerprint, iris composition). We are interested in extending the findings of previous research suggesting that there are stable individual differences in the base rate, timing, and morphology of dynamic facial expressions. These measures may allow us to develop a new biometric for human identification.
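To make the idea concrete, a minimal sketch (illustrative only, with made-up feature values) could summarize each person by a few expression-dynamics statistics, such as smile base rate, mean onset duration, and mean peak amplitude, and identify a new recording by nearest-neighbor matching against enrolled templates:

import numpy as np

# Hypothetical per-session dynamics features: [smile rate per min, mean onset duration (s), mean peak amplitude].
rng = np.random.default_rng(3)
enrolled = {                                   # one averaged template per known person (placeholder values)
    "person_A": np.array([2.1, 0.45, 0.80]),
    "person_B": np.array([0.6, 0.90, 0.35]),
    "person_C": np.array([1.4, 0.30, 0.60]),
}

def identify(probe, templates):
    """Nearest-neighbor identification on expression-dynamics features (illustration only)."""
    return min(templates, key=lambda name: np.linalg.norm(probe - templates[name]))

probe = np.array([2.0, 0.50, 0.75]) + 0.05 * rng.standard_normal(3)   # new session, closest to person_A
print(identify(probe, enrolled))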

"Collaborative Proposal: Automated Measurement of Infant Facial Expressions and Human Ratings of Their Emotional Intensity."
National Science Foundation
8/1/04 to 7/31/05

"Consortium on Nonverbal Communication for Human-Computer Interaction."
Advanced Telecommunications Research Media Integration Center, Kyoto, Japan
2/1/00 to 1/31/03

"Parental Depression and Infant Development."
National Institute of Mental Health
4/1/97 to 3/31/02
Description: Parent-infant interaction is believed to play a central role in infant social and emotional development, including individual differences in attachment security. With Dr. Peter Lewinsohn at the Oregon Research Institute and Dr. Nick Allen at the University of Melbourne, we conduct a longitudinal study of parent-infant interaction and infant development. Probands are from a population-based cohort of young adults in western Oregon and have a history of adolescent-onset depression, other adolescent-onset disorders, or no history of adolescent-onset disorder. We are testing hypotheses about the relations between parent gender, history of depression, current symptoms, parent-infant affect, synchrony, and responsiveness, and infant socio-emotional development.