Psychological Monitoring: Detecting Real-Time Emotional Changes
| dc.contributor.author | McClafferty, Shane R. | en |
| dc.contributor.committeechair | Scarpa-Friedman, Bruce H. | en |
| dc.contributor.committeemember | Lee, Tae-Ho | en |
| dc.contributor.committeemember | Panneton, Robin K. | en |
| dc.contributor.department | Psychology | en |
| dc.date.accessioned | 2025-11-11T15:54:53Z | en |
| dc.date.available | 2025-11-11T15:54:53Z | en |
| dc.date.issued | 2025-08-26 | en |
| dc.description.abstract | The present study introduces a novel, real-time emotion (or generalized behavior) tracking model capable of independently estimating valence, motivation (approach-avoidance), and activation (arousal) through estimates of parasympathetic (PNS), α-adrenergic sympathetic (α-SNS), and β-adrenergic sympathetic (β-SNS) nervous system activity. These separate autonomic systems correspond to distinct emotional dimensions: valence to the PNS, motivation (approach-avoidance) to the α-SNS, and activation to the β-SNS. This framework enables continuous, real-time, and interpretable estimation of emotion from a single, wearable-compatible signal: photoplethysmography (PPG). The tested model uses inter-beat interval (IBI) and pulse wave (or pulse volume) amplitude (PVA), tracked with an extended Kalman filter (EKF), to extract frequency-band features (VLF-UHF) and standard heart rate variability (HRV) metrics. These features were mapped onto the emotional dimensions using supervised partial least squares (PLS) regressions trained on behavioral validation measures: facial electromyography (EMG) for valence, a joystick task for motivation, and eye tracking for activation. The dimensional predictions reached within-subject accuracy comparable to that of traditional physiological models. Additionally, these emotional dimensions can be used to produce reasonable discrete-emotion probabilities based solely on theory, without requiring training data. These findings support a new model of emotion based on separate autonomic systems and dimensions that functionally define emotions in real time. Such an approach enables dynamic inference of emotion (or generalized behavior) across various contexts, including experimental design, clinical monitoring, and ambulatory assessment, using low-cost wearable technology. (Illustrative code sketches of this pipeline follow the record below.) | en |
| dc.description.abstractgeneral | Emotions are the primary means by which we understand and predict the behavior of others. However, computer and statistical models struggle to identify emotions, making behavior difficult to explain and predict. These difficulties stem partly from the categorical nature of emotional terms, which are rooted in language (categories such as fear, anger, sadness, and happiness), whereas computers and mathematical models work best with quantitative estimates. Additionally, most computerized emotion estimation requires dedicated sensors, such as a camera, to predict emotion. Camera-based trackers must capture the face or body, which is both uncommon in everyday settings and invasive. Alternatively, as smartwatches and smart rings enter the market, simpler and more accessible sensors are becoming available for estimating emotion. These devices, equipped with heart-monitoring sensors, may enable real-time prediction of emotion. The present study uses these devices to predict dimensions of emotion such as feeling good versus bad, wanting to approach versus avoid, and feeling more active versus relaxed. These dimensions can then be combined to estimate categorical emotions such as fear, anger, sadness, or joy. These findings are a step toward emotion tracking with basic devices and may support mental health, emotional awareness, and self-regulation. | en |
| dc.description.degree | Master of Science | en |
| dc.format.medium | ETD | en |
| dc.format.mimetype | application/pdf | en |
| dc.identifier.uri | https://hdl.handle.net/10919/138961 | en |
| dc.language.iso | en | en |
| dc.publisher | Virginia Tech | en |
| dc.rights | In Copyright | en |
| dc.rights.uri | http://rightsstatements.org/vocab/InC/1.0/ | en |
| dc.subject | photoplethysmography (PPG) | en |
| dc.subject | heart rate variability (HRV) | en |
| dc.subject | pulse volume amplitude (PVA) | en |
| dc.subject | autonomic nervous system (ANS) | en |
| dc.subject | emotion dimensions | en |
| dc.subject | discrete emotions | en |
| dc.subject | motivation | en |
| dc.subject | approach-avoidance | en |
| dc.subject | activation | en |
| dc.subject | real-time emotion tracking | en |
| dc.subject | wearable affective computing | en |
| dc.subject | eye aspect ratio (EAR) | en |
| dc.title | Psychological Monitoring: Detecting Real-Time Emotional Changes | en |
| dc.title.alternative | Real-Time Emotional Monitoring: A Constant Dimension Approach | en |
| dc.type | Thesis | en |
| dc.type.dcmitype | Text | en |
| thesis.degree.discipline | Psychology | en |
| thesis.degree.grantor | Virginia Polytechnic Institute and State University | en |
| thesis.degree.level | masters | en |
| thesis.degree.name | Master of Science | en |
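The abstract above describes tracking IBI and PVA with an extended Kalman filter before feature extraction. Below is a minimal sketch of that idea, assuming a 1-D random-walk state model as a simplified stand-in for the thesis's full EKF; the process- and measurement-noise values (`q`, `r`) and the synthetic IBI series are illustrative placeholders, not the author's parameters.

```python
import numpy as np

def kalman_smooth(observations, q=1e-4, r=1e-2):
    """Smooth a beat-by-beat series (e.g., IBI in seconds or PVA in a.u.)
    with a 1-D random-walk Kalman filter.

    The thesis describes an extended Kalman filter over IBI and PVA; this
    linear random-walk version is a simplified illustration, with q
    (process noise) and r (measurement noise) chosen arbitrarily.
    """
    x = float(observations[0])   # state estimate
    p = 1.0                      # state variance
    smoothed = np.empty(len(observations), dtype=float)
    for i, z in enumerate(observations):
        # predict: random-walk dynamics (state unchanged, variance grows)
        p = p + q
        # update: blend the prediction with the new beat-level observation
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)
        p = (1.0 - k) * p
        smoothed[i] = x
    return smoothed

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic IBI series: ~0.8 s baseline, slow oscillation, beat-level noise
    ibi = 0.8 + 0.05 * np.sin(np.linspace(0, 6, 300)) + rng.normal(0, 0.02, 300)
    print(kalman_smooth(ibi)[:5])
```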
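The abstract also describes mapping PPG-derived features onto valence, motivation, and activation with supervised partial least squares (PLS) regression. The sketch below uses scikit-learn's `PLSRegression` on synthetic arrays standing in for the real feature matrix and the behavioral targets (facial EMG, joystick, eye tracking); the feature count, component count, and data are assumptions for illustration only.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Synthetic stand-ins: rows are time windows, columns are PPG-derived features
# (e.g., HRV band powers, PVA statistics). The real features and behavioral
# targets come from the thesis; these arrays are placeholders built from an
# arbitrary linear relationship plus noise.
X = rng.normal(size=(200, 12))                            # physiological features
W = rng.normal(size=(12, 3))
Y = X @ W + rng.normal(scale=0.5, size=(200, 3))          # valence, motivation, activation

pls = PLSRegression(n_components=4)
pls.fit(X[:150], Y[:150])          # supervised fit on "training" windows
pred = pls.predict(X[150:])        # held-out predictions, one column per dimension
print(pred[:3])                    # rows of [valence, motivation, activation]
```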
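Finally, the abstract notes that the three dimensions can be converted to discrete-emotion probabilities from theory alone, without training data. One plausible reading of that step is a softmax over distances to theory-defined prototype points in (valence, motivation, activation) space; the prototype coordinates, emotion labels, and temperature below are hypothetical assumptions, not the thesis's actual mapping.

```python
import numpy as np

# Hypothetical prototype coordinates in (valence, motivation, activation),
# each on a -1..1 scale. These values are illustrative only.
PROTOTYPES = {
    "joy":     ( 1.0,  1.0,  0.5),
    "anger":   (-1.0,  1.0,  1.0),
    "fear":    (-1.0, -1.0,  1.0),
    "sadness": (-1.0, -0.5, -0.5),
    "calm":    ( 0.5,  0.0, -1.0),
}

def emotion_probabilities(valence, motivation, activation, temperature=1.0):
    """Convert a point in dimension space to discrete-emotion probabilities
    via a softmax over negative distances to theory-defined prototypes."""
    point = np.array([valence, motivation, activation])
    names = list(PROTOTYPES)
    dists = np.array([np.linalg.norm(point - np.array(PROTOTYPES[n])) for n in names])
    logits = -dists / temperature
    probs = np.exp(logits - logits.max())   # numerically stable softmax
    probs /= probs.sum()
    return dict(zip(names, probs))

if __name__ == "__main__":
    # A negative, avoidant, highly activated point should lean toward "fear"
    print(emotion_probabilities(valence=-0.8, motivation=-0.7, activation=0.9))
```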