Browsing by Author "Davison, B. K."
Now showing 1 - 2 of 2
- Auditory menus are not just spoken visual menus: A case study of "unavailable" menu items
  Jeon, Myounghoon; Gupta, S.; Davison, B. K.; Walker, B. N. (ACM, 2010-06-09)
  Auditory menus can supplement or replace visual menus to enhance usability and accessibility. Despite the rapid increase of research on auditory displays, more work is still needed to optimize the auditory-specific aspects of these implementations. In particular, several menu attributes and features that are often displayed visually are conveyed poorly, or not at all, in the auditory version of the menu. Here, we report on two studies aimed at determining how best to render the important concept of an unavailable menu item. In Study 1, 23 undergraduates navigated a Microsoft Word-like auditory menu with a mix of available and unavailable items. For unavailable items, a whispered voice was favored over an attenuated voice or saying "unavailable". In Study 2, 26 undergraduates navigated a novel auditory menu. With practice, whispering unavailable items was more effective than skipping them. Results are discussed in terms of acoustic theory and cognitive menu selection theory. © 2010 Copyright is held by the author/owner(s).
- Enhanced auditory menu cues improve dual task performance and are preferred with in-vehicle technologies
  Jeon, Myounghoon; Davison, B. K.; Nees, M. A.; Wilson, J.; Walker, B. N. (ACM, 2009-11-09)
  Auditory display research for driving has mainly focused on collision warning signals, and recent studies on auditory in-vehicle information presentation have examined only a limited range of tasks (e.g., cell phone operation tasks or verbal tasks such as reading digit strings). The present study used a dual task paradigm to evaluate a plausible scenario in which users navigated a song list. We applied enhanced auditory menu navigation cues, including spearcons (i.e., compressed speech) and a spindex (i.e., a speech index that used brief audio cues to communicate the user's position in a long menu list). Twenty-four undergraduates navigated through an alphabetized list of 150 song titles, rendered as an auditory menu, while they concurrently played a simple, perceptual-motor, ball-catching game. The menu was presented with text-to-speech (TTS) alone, TTS plus one of three types of enhanced auditory cues, or no sound at all. Performance on both the primary task (success rate in the game) and the secondary task (menu search time) was better with the auditory menus than with no sound. Subjective workload scores (NASA TLX) and user preferences favored the enhanced auditory cue types. Results are discussed in terms of multiple resources theory and practical IVT design applications.