Deciphering Emotional Responses to Music: A Fusion of Psychophysiological Data Analysis and Bi-LSTM Predictive Modeling

dc.contributor.authorMahat, Maheepen
dc.contributor.committeechairGracanin, Denisen
dc.contributor.committeememberKnapp, Richard Benjaminen
dc.contributor.committeememberWenskovitch, John Edwarden
dc.contributor.departmentComputer Science & Applicationsen
dc.date.accessioned2024-06-11T08:00:29Zen
dc.date.available2024-06-11T08:00:29Zen
dc.date.issued2024-06-10en
dc.description.abstractThis research explores the temporal patterns of psychophysiological responses to musical excerpts by analyzing the expansive Emotion in Motion dataset, the most comprehensive of its kind. Using Dynamic Time Warping and t-test analyses, we examined data from participants across seven countries who listened to three distinct musical pieces. During these listening sessions, Electrodermal Activity (EDA) and Pulse Oximetry (POX) readings were collected, complemented by qualitative feedback from the participants. Our analysis focused on detecting recurring patterns and extracting meaningful insights from the data. In addition, we compared several Deep Neural Networks to find the one best suited for predicting emotional attributes from EDA and POX signals. To further facilitate comprehensive visualization and analysis of the EDA, POX, and audio signals, we developed a dedicated platform, featuring a coordinated multiple-view interface, as an integral part of this work.en
dc.description.abstractgeneralWe explored how people's bodies react over time when they listen to music. We used a large collection of data called the "Emotion in Motion" dataset, which has information from people in seven countries who listened to three different music pieces. To understand this data, we used special tools that help detect patterns and changes in how the body responds. During the music sessions, the participants' skin electrical activity and blood oxygen levels were recorded, which can give clues about emotional reactions. People also shared their feelings about the music. To make it easier to see and understand all this information together, we created a new web platform that simulates the experiment in real time. This work aims to help us better understand the deep connection between music and human emotions.en
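The Dynamic Time Warping technique named in the abstract aligns two time series (e.g. two participants' EDA traces) that unfold at different speeds. A minimal illustrative sketch is below; `dtw_distance` is a hypothetical helper name, not the author's code, and the thesis itself may use a library implementation with windowing or other refinements.

```python
def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D sequences.

    Fills a cumulative-cost matrix where cell (i, j) holds the cheapest
    alignment of a[:i] with b[:j]; each step may advance either sequence
    or both, which is what lets DTW "warp" time.
    """
    n, m = len(a), len(b)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])  # local distance between samples
            cost[i][j] = d + min(
                cost[i - 1][j],      # advance sequence a
                cost[i][j - 1],      # advance sequence b
                cost[i - 1][j - 1],  # advance both
            )
    return cost[n][m]


# A sequence aligns to a time-stretched copy of itself at zero cost,
# which Euclidean point-by-point distance would not report.
print(dtw_distance([0.0, 0.0, 1.0, 2.0], [0.0, 1.0, 2.0]))
```

Because the repeated leading sample can be matched twice against the same point, the warped distance here is 0.0, illustrating why DTW suits physiological signals whose responses are similar in shape but shifted or stretched in time.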
dc.description.degreeMaster of Scienceen
dc.format.mediumETDen
dc.identifier.othervt_gsexam:41056en
dc.identifier.urihttps://hdl.handle.net/10919/119377en
dc.language.isoenen
dc.publisherVirginia Techen
dc.rightsIn Copyrighten
dc.rights.urihttp://rightsstatements.org/vocab/InC/1.0/en
dc.subjectPsychophysiological Dataen
dc.subjectEDAen
dc.subjectPOXen
dc.subjectDynamic Time Warpingen
dc.subjectLSTMen
dc.subjectBi-LSTMen
dc.titleDeciphering Emotional Responses to Music: A Fusion of Psychophysiological Data Analysis and Bi-LSTM Predictive Modelingen
dc.typeThesisen
thesis.degree.disciplineComputer Science & Applicationsen
thesis.degree.grantorVirginia Polytechnic Institute and State Universityen
thesis.degree.levelmastersen
thesis.degree.nameMaster of Scienceen

Files

Original bundle
Name: Mahat_M_T_2024.pdf
Size: 1.54 MB
Format: Adobe Portable Document Format
Name: Mahat_M_T_2024_support_3.pdf
Size: 50.97 KB
Format: Adobe Portable Document Format
Description: Supporting documents