Emotion Recognition of Dynamic Faces in Children with Autism Spectrum Disorder

Date
2012-05-10
Publisher
Virginia Tech
Abstract

Studies examining impaired emotion recognition and perceptual processing in autism spectrum disorders (ASD) show inconsistent results (Harms, Martin, & Wallace, 2010; Jemel, Mottron, & Dawson, 2006), and many of these studies include eye tracking data. The current study used a novel task, emotion recognition of a dynamic talking face with sound, to compare children with ASD (n=8; aged 6-10, 7 male) with mental age (MA)- and gender-matched controls (n=8; aged 4-10, 7 male) on an emotion identification and eye tracking task. Children watched several short video clips (2.5-5 seconds), each portraying one of five emotions (happy, sad, excited, scared, or angry), and identified the emotion shown in each clip. A mixed factorial ANOVA was conducted to examine group differences in attention while viewing the stimuli, and differences in emotion identification ability were examined using a t-test and Fisher's exact tests of independence. Children with ASD spent less time looking at faces and the mouth region than controls. Additionally, the amount of time children with ASD spent looking at the mouth region predicted better performance on the emotion identification task. The study was underpowered, however, so these results are preliminary and require replication. Results are discussed in relation to natural processing of emotion and social stimuli.
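
As an illustrative sketch only (the study's data and analysis code are not reproduced here), group comparisons of the kind described above, an independent-samples t-test and a Fisher's exact test of independence, could be run in Python with SciPy. All scores, counts, and variable names below (asd_scores, control_scores, table) are hypothetical placeholders, not the study's results.

# Illustrative sketch only: hypothetical placeholder data, not the study's results.
from scipy import stats

# Hypothetical proportion-correct scores on the emotion identification task.
asd_scores     = [0.6, 0.8, 0.4, 0.6, 0.8, 0.6, 0.4, 0.8]  # children with ASD (n=8)
control_scores = [0.8, 1.0, 0.8, 0.6, 1.0, 0.8, 0.8, 1.0]  # matched controls (n=8)

# Independent-samples t-test comparing identification accuracy across groups.
t_stat, t_p = stats.ttest_ind(asd_scores, control_scores)
print(f"t-test: t = {t_stat:.2f}, p = {t_p:.3f}")

# Fisher's exact test of independence on a hypothetical 2x2 table
# (rows: ASD vs. control group; columns: trials identified correctly vs. incorrectly).
table = [[30, 10],
         [36, 4]]
odds_ratio, fisher_p = stats.fisher_exact(table)
print(f"Fisher's exact test: OR = {odds_ratio:.2f}, p = {fisher_p:.3f}")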


Keywords
eye tracking, autism, emotion recognition