Browsing by Author "Clark, Elizabeth A."
- Application of Automated Facial Expression Analysis and Facial Action Coding System to Assess Affective Response to Consumer Products
  Clark, Elizabeth A. (Virginia Tech, 2020-03-17)
  Sensory and consumer sciences seek to comprehend the influences of sensory perception on consumer behaviors such as product liking and purchase. The food industry assesses product liking through hedonic testing but often does not capture affectual response as it pertains to product-generated (PG) and product-associated (PA) emotions. This research sought to assess the application of PA and PG emotion methodology to better understand consumer experiences. A systematic review of the existing literature was performed that focused on the Facial Action Coding System (FACS) and its use to investigate consumer affect and characterize human emotional response to product-based stimuli, which revealed inconsistencies in how FACS is carried out as well as how emotional response is inferred from Action Unit (AU) activation. Automatic Facial Expression Analysis (AFEA), which automates FACS and translates the facial muscular positioning into the basic universal emotions, was then used in a two-part study. In the first study (n=50 participants), AFEA, a Check-All-That-Apply (CATA) emotions questionnaire, and a Single-Target Implicit Association Test (ST-IAT) were used to characterize the relationship between PA as well as PG emotions and consumer behavior (acceptability, purchase intent) towards milk in various types of packaging (k=6). The ST-IAT did not yield significant PA emotions for packaged milk (p>0.05), but correspondence analysis of CATA data produced PA emotion insights including term selection based on arousal and underlying approach/withdrawal motivation related to packaging pigmentation.
  Time series statistical analysis of AFEA data provided increased insight into significant emotion expression, but the lack of difference (p>0.05) between certain expressed emotions that share no related AUs, such as happy and disgust, indicates that AFEA software may not be identifying AUs and determining emotion-based inferences in agreement with FACS. In the second study, AFEA data from the sensory evaluation (n=48 participants) of light-exposed milk stimuli (k=4) stored in packaging with various light-blocking properties underwent time series statistical analysis to determine whether the sensory-engaging nature of control stimuli could impact time series statistical analysis of AFEA data. When compared against the limited sensory-engaging control (blank screen), contempt, happy, and angry were expressed more intensely (p<0.025) and with greater incidence for the light-exposed milk stimuli; only neutral was expressed in the same manner for the blank screen. Comparatively, intense neutral expression (p<0.025) was brief, fragmented, and often accompanied by intense (albeit fleeting) expressions of happy, sad, or contempt for the sensory-engaging control (water); emotions such as surprised, scared, and sad were expressed similarly for the light-exposed milk stimuli. As such, care should be taken when comparing control and experimental stimuli in time series analysis, as facial activation of muscles/AUs related to sensory perception (e.g., chewing, smelling) can impact the resulting interpretation. Collectively, the use of PA and PG emotion methodology provided additional insight into consumer-product-related behaviors. However, it is difficult to conclude whether AFEA yields emotional interpretations based on true facial expression of emotion or on facial actions related to sensory perception for consumer products such as foods and beverages.
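  The incidence-and-intensity summaries described in this abstract can be illustrated with a small sketch. This is hypothetical code, not the study's actual pipeline: the trace values, the 0.5 threshold, and the function name are assumptions standing in for the per-frame emotion-intensity traces that AFEA software typically exports.

```python
# Sketch: reducing a frame-level AFEA emotion-intensity trace to
# two summary values, incidence and supra-threshold mean intensity.
# Hypothetical data layout and threshold; the study's actual
# statistical tests (time series analysis at p<0.025) are not shown.

def summarize_trace(intensities, threshold=0.5):
    """Return (incidence, mean intensity) for frames at/above threshold."""
    hits = [x for x in intensities if x >= threshold]
    incidence = len(hits) / len(intensities) if intensities else 0.0
    mean_intensity = sum(hits) / len(hits) if hits else 0.0
    return incidence, mean_intensity

# Example: a short hypothetical "happy" trace for one participant/stimulus
happy = [0.1, 0.2, 0.7, 0.9, 0.6, 0.1, 0.0, 0.8]
inc, mean_i = summarize_trace(happy)
print(round(inc, 3), round(mean_i, 3))  # → 0.5 0.75
```

  Summaries like these would be computed per emotion, per stimulus, and per time window before any between-stimulus comparison is made.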
- Evaluation of Quality Parameters in Gluten-Free Bread Formulated with Breadfruit (Artocarpus altilis) Flour
  Clark, Elizabeth A.; Aramouni, Fadi M. (Hindawi, 2018-09-24)
  Flour from the fruit of breadfruit trees (Artocarpus altilis) holds the potential to serve as a wheat flour replacement in gluten-free product formulations. This study evaluated the impact of breadfruit flour and leavening agent on gluten-free bread quality. Breadfruit flour was first milled and characterized by the researchers prior to being used in this study. Experimental formulas were mixed with varying breadfruit flour inclusion (0%, 20%, 35%, and 50%) and leavening agent (yeast and baking powder). Quality parameters including density, specific volume, pH, water activity, color, and texture were assessed, and proximate analysis was performed to characterize the nutritional value of the bread. Significant differences were found in loaf density, specific volume, color (crust and crumb), pH, water activity, and crumb firmness. Additionally, a consumer sensory study was performed on the most well-liked formulations. Consumer testing yielded significant differences between the yeast-leavened control (0% breadfruit flour) and yeast-leavened breadfruit bread (20% breadfruit flour). Nonceliac consumers rated the breadfruit treatment as significantly less acceptable than the control for all sensory characteristics assessed. These results indicate that breadfruit flour can be used at ≤20%, when leavened with yeast, to produce quality gluten-free bread. Future studies should assess the impact of breadfruit variety and milling practices on breadfruit flour properties before further investigating how breadfruit flour impacts gluten-free bread quality.
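  Two of the quality parameters named in this abstract, loaf density and specific volume, are simple mass/volume ratios; a minimal sketch with illustrative numbers (not the paper's data):

```python
def loaf_density(mass_g: float, volume_ml: float) -> float:
    """Loaf density in g/mL (mass divided by volume)."""
    return mass_g / volume_ml

def specific_volume(mass_g: float, volume_ml: float) -> float:
    """Specific volume in mL/g (the reciprocal of density)."""
    return volume_ml / mass_g

# Illustrative loaf: 450 g mass, 1200 mL volume (values assumed for the example)
print(round(loaf_density(450, 1200), 3))     # → 0.375 g/mL
print(round(specific_volume(450, 1200), 3))  # → 2.667 mL/g
```

  A denser, less aerated crumb shows up directly as a lower specific volume, which is why both measures are standard in bread-quality work.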
- The Facial Action Coding System for Characterization of Human Affective Response to Consumer Product-Based Stimuli: A Systematic Review
  Clark, Elizabeth A.; Kessinger, J'Nai; Duncan, Susan E.; Bell, Martha Ann; Lahne, Jacob; Gallagher, Daniel L.; O'Keefe, Sean F. (Frontiers, 2020-05-26)
  To characterize human emotions, researchers have increasingly utilized Automatic Facial Expression Analysis (AFEA), which automates the Facial Action Coding System (FACS) and translates the facial muscular positioning into the basic universal emotions. There is broad interest in the application of FACS for assessing consumer expressions as an indication of emotional response to consumer product-based stimuli. However, the translation of FACS to characterization of emotions is elusive in the literature. The aim of this systematic review is to give an overview of how FACS has been used to investigate human emotional behavior to consumer product-based stimuli. The search was limited to studies published in English after 1978, conducted on humans, using FACS or its action units to investigate affect, where emotional response is elicited by consumer product-based stimuli evoking at least one of the five senses. The search resulted in an initial total of 1,935 records, of which 55 studies were extracted and categorized based on the outcomes of interest including (i) method of FACS implementation; (ii) purpose of study; (iii) consumer product-based stimuli used; and (iv) measures of affect validation. Most studies implemented FACS manually (73%) to develop products and/or software (20%) and used consumer product-based stimuli that had known and/or defined capacity to evoke a particular affective response, such as films and/or movie clips (20%); minimal attention was paid to consumer products with low levels of emotional competence or with unknown affective impact.
  More than half of the studies (53%) did not validate FACS-determined affect, and of the validation measures that were used, most tended to be discontinuous in nature and only captured affect as it holistically related to an experience. This review illuminated some inconsistencies in how FACS is carried out as well as how emotional response is inferred from facial muscle activation. This may prompt researchers to consider measuring the total consumer experience by employing a variety of methodologies in addition to FACS and its emotion-based interpretation guide. Such strategies may better conceptualize consumers’ experience with products of low, unknown, and/or undefined capacity to evoke an affective response, such as product prototypes, line extensions, etc.