Enhanced auditory menu cues improve dual task performance and are preferred with in-vehicle technologies
dc.contributor.author | Jeon, Myounghoon | en |
dc.contributor.author | Davison, B. K. | en |
dc.contributor.author | Nees, M. A. | en |
dc.contributor.author | Wilson, J. | en |
dc.contributor.author | Walker, B. N. | en |
dc.date.accessioned | 2025-01-08T13:24:06Z | en |
dc.date.available | 2025-01-08T13:24:06Z | en |
dc.date.issued | 2009-09-21 | en |
dc.description.abstract | Auditory display research for driving has mainly focused on collision warning signals, and recent studies on auditory in-vehicle information presentation have examined only a limited range of tasks (e.g., cell phone operation tasks or verbal tasks such as reading digit strings). The present study used a dual task paradigm to evaluate a plausible scenario in which users navigated a song list. We applied enhanced auditory menu navigation cues, including spearcons (i.e., compressed speech) and a spindex (i.e., a speech index that used brief audio cues to communicate the user's position in a long menu list). Twenty-four undergraduates navigated through an alphabetized list of 150 song titles, rendered as an auditory menu, while they concurrently played a simple, perceptual-motor, ball-catching game. The menu was presented with text-to-speech (TTS) alone, TTS plus one of three types of enhanced auditory cues, or no sound at all. Performance on both the primary task (success rate in the game) and the secondary task (menu search time) was better with the auditory menus than with no sound. Subjective workload scores (NASA TLX) and user preferences favored the enhanced auditory cue types. Results are discussed in terms of multiple resources theory and practical in-vehicle technology (IVT) design applications. | en |
dc.description.version | Published version | en |
dc.format.extent | Pages 91-98 | en |
dc.format.mimetype | application/pdf | en |
dc.identifier.doi | https://doi.org/10.1145/1620509.1620528 | en |
dc.identifier.orcid | Jeon, Myounghoon [0000-0003-2908-671X] | en |
dc.identifier.uri | https://hdl.handle.net/10919/123923 | en |
dc.language.iso | en | en |
dc.publisher | ACM | en |
dc.relation.uri | https://doi.org/10.1145/1620509.1620528 | en |
dc.rights | In Copyright | en |
dc.rights.uri | http://rightsstatements.org/vocab/InC/1.0/ | en |
dc.title | Enhanced auditory menu cues improve dual task performance and are preferred with in-vehicle technologies | en |
dc.title.serial | Proceedings of the 1st International Conference on Automotive User Interfaces and Interactive Vehicular Applications, AutomotiveUI 2009 | en |
dc.type | Conference proceeding | en |
dc.type.dcmitype | Text | en |
dc.type.other | Conference Proceeding | en |
pubs.organisational-group | Virginia Tech | en |
pubs.organisational-group | Virginia Tech/Engineering | en |
pubs.organisational-group | Virginia Tech/Engineering/Industrial and Systems Engineering | en |
pubs.organisational-group | Virginia Tech/All T&R Faculty | en |
pubs.organisational-group | Virginia Tech/Engineering/COE T&R Faculty | en |