Authors: Cao, Yusheng; Li, Lingyu; Yuan, Jiehao; Jeon, Myounghoon
Date accessioned: 2025-01-10
Date available: 2025-01-10
Date issued: 2024-08-08 (copyright 2024-01-01)
ISSN: 1044-7318 (print); 1532-7590 (online)
URI: https://hdl.handle.net/10919/124138

Abstract: In-vehicle infotainment systems can cause various distractions, increasing the risk of car accidents. To address this problem, mid-air gesture systems have been introduced. This study investigated the potential of a novel interface that integrates a Head-Up Display (HUD) with auditory displays (spearcons: compressed speech) in a gesture-based menu navigation system to minimize visual distraction and improve driving and secondary task performance. The experiment involved 24 participants who navigated through 12 menu items using mid-air gestures while driving on a simulated road under four conditions: HUD (with and without spearcons) and Head-Down Display (HDD) (with and without spearcons). Results showed that the HUD condition significantly outperformed the HDD condition in participants' level 1 situation awareness, perceived workload, menu navigation performance, and system usability. However, there were trade-offs in visual fixation duration on the menu and in lane deviation. These findings will guide future research in developing safer and more effective HUD-supported in-vehicle gesture interaction systems.

Extent: 15 page(s)
Format: application/pdf
Language: en
Rights: In Copyright
Keywords: Gesture interaction; in-vehicle infotainment systems; heads-up display; auditory display; spearcons; visual distraction
Title: Head-up Displays Improve Drivers' Performance and Subjective Perceptions with the In-Vehicle Gesture Interaction System
Type: Article - Refereed
Journal: International Journal of Human-Computer Interaction
DOI: https://doi.org/10.1080/10447318.2024.2387397
Volume/Issue: ahead-of-print
ORCID: Jeon, Myounghoon [0000-0003-2908-671X]