
Automatic Processing of Musical and Phonemic Sounds: Differences Between Musicians and Nonmusicians

dc.contributor.author: Alfaro, Jennifer Nicole
dc.contributor.committeechair: Crawford, Helen J.
dc.contributor.committeemember: Harrison, David W.
dc.contributor.committeemember: Bell, Martha Ann
dc.contributor.department: Psychology
dc.date.accessioned: 2014-03-14T20:39:33Z
dc.date.adate: 2002-06-26
dc.date.available: 2014-03-14T20:39:33Z
dc.date.issued: 2002-04-11
dc.date.rdate: 2006-06-26
dc.date.sdate: 2002-06-07
dc.description.abstract: The purpose of the present study was to examine the ability of musicians to preattentively process musical and phonemic information, as assessed by event-related potentials (ERPs), compared with nonmusicians. Participants were musicians (N=22; at least 10 years of formal training) and nonmusicians (N=22; no musical training) from the Virginia Tech community. Participants focused on a video and were instructed to ignore auditory stimuli. Simultaneous to the video presentation, auditory stimuli (60 dB) in an oddball paradigm (80% standard, 20% deviant) were presented in 4 conditions (500 stimuli each): chord, phoneme, chord interval, and tone interval. EEG was recorded during each condition. The mismatch negativity (MMN) was identified by subtracting ERPs to standard auditory stimuli from ERPs to deviant auditory stimuli for each of the four qualitatively different conditions. Superior preattentive auditory processing in musicians was found most obviously during the presentation of chords, with no evidence of such superiority during phonemic processing and interval processing. As predicted, during the tone interval condition, musicians had a greater MMN peak amplitude in the central region, and had a greater MMN mean amplitude in the anterior frontal, frontal, frontocentral, and central regions. Contrary to the hypothesis, this did not emerge in the chord, phoneme, or chord interval conditions. As predicted, the MMN latency was shorter for musicians than nonmusicians in the frontocentral region during the phoneme condition. Contrary to the hypothesis, this did not emerge in the chord, chord interval, or tone interval conditions. Differential hemisphere effects were found between groups for MMN latency in the phoneme condition but not the others. Contrary to the hypotheses, no differences were found for MMN amplitude. As predicted, and consistent with Koelsch et al. (1999), musicians were more likely to exhibit an MMN than nonmusicians in the chord condition. Finally, there was the expected stronger preattentive processing in the right hemisphere MMN for the musical stimuli. Contrary to the literature, there was an unexpected stronger right hemisphere bias for phonemic stimuli.
dc.description.degree: Master of Science
dc.identifier.other: etd-06072002-111516
dc.identifier.sourceurl: http://scholar.lib.vt.edu/theses/available/etd-06072002-111516/
dc.identifier.uri: http://hdl.handle.net/10919/33464
dc.publisher: Virginia Tech
dc.relation.haspart: FINALTHESIS.PDF
dc.rights: In Copyright
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject: musicians
dc.subject: preattention
dc.subject: MMN
dc.title: Automatic Processing of Musical and Phonemic Sounds: Differences Between Musicians and Nonmusicians
dc.type: Thesis
thesis.degree.discipline: Psychology
thesis.degree.grantor: Virginia Polytechnic Institute and State University
thesis.degree.level: masters
thesis.degree.name: Master of Science

Files

Original bundle
Name: FINALTHESIS.PDF
Size: 698.22 KB
Format: Adobe Portable Document Format