Novel In-Vehicle Gesture Interactions: Design and Evaluation of Auditory Displays and Menu Generation Interfaces
Using in-vehicle infotainment systems degrades driving performance and increases crash risk. To address this, we developed air-gesture interfaces paired with various auditory displays. Thirty-two participants drove a simulator while performing air-gesture menu navigation tasks. A 4 × 2 mixed-model design was used to explore the effects of auditory displays as a within-subjects variable (earcons, auditory icons, spearcons, and no sound) and menu-generation interfaces as a between-subjects variable (fixed and adaptive) on driving performance, secondary task performance, eye glance behavior, and user experience. The adaptive condition centered the menu on the user's hand position at the moment of activation, whereas the fixed condition always displayed the menu at the same position. Results demonstrated that spearcons produced the least visual distraction, the lowest workload, and the highest system usability, and were favored by participants, and that fixed menu generation outperformed adaptive menu generation in driving safety and secondary task performance. These findings will inform design guidelines for in-vehicle air-gesture interaction systems.