Show simple item record

dc.contributor.author: Bukvic, Ivica Ico
dc.contributor.author: Earle, Gregory D.
dc.date.accessioned: 2020-01-03T15:07:47Z
dc.date.available: 2020-01-03T15:07:47Z
dc.date.issued: 2018-06
dc.identifier.uri: http://hdl.handle.net/10919/96269
dc.description.abstract: The following paper presents a cross-disciplinary snapshot of 21st century research in sonification and leverages the review to identify a new immersive exocentric approach to studying human capacity to perceive spatial aural cues. The paper further defines immersive exocentric sonification, highlights its unique affordances, and presents an argument for its potential to fundamentally change the way we understand and study the human capacity for location-aware audio pattern recognition. Finally, the paper describes an example of an externally funded research project that aims to tackle this newfound research whitespace.
dc.description.sponsorship: National Science Foundation
dc.description.sponsorship: NSF: 1748667
dc.language.iso: en_US
dc.publisher: Georgia Institute of Technology
dc.relation.ispartof: International Conference on Auditory Display (ICAD 2018)
dc.rights: Creative Commons Attribution-NonCommercial 3.0 United States
dc.rights.uri: http://creativecommons.org/licenses/by-nc/3.0/us/
dc.title: Reimagining Human Capacity For Location-Aware Aural Pattern Recognition: A Case For Immersive Exocentric Sonification
dc.type: Presentation
dc.type: Conference proceeding
dc.contributor.department: Electrical and Computer Engineering
dc.contributor.department: Institute for Creativity, Arts, and Technology (ICAT)
dc.contributor.department: School of Performing Arts
dc.identifier.doi: https://doi.org/10.21785/icad2018.021

