
Increase Driving Situation Awareness and In-vehicle Gesture-based Menu Navigation Accuracy with Heads-Up Display

dc.contributor.author: Cao, Yusheng
dc.contributor.committeechair: Jeon, Myounghoon
dc.contributor.committeemember: Mccrickard, Scott
dc.contributor.committeemember: Lee, Sang Won
dc.contributor.department: Computer Science
dc.date.accessioned: 2023-05-19T16:44:13Z
dc.date.available: 2023-05-19T16:44:13Z
dc.date.issued: 2023-04
dc.description.abstract: More and more novel functions are being integrated into vehicle infotainment systems to allow drivers to perform secondary tasks with high accuracy and low accident risk. Mid-air gesture interaction is one of them. This thesis designed and tested a novel interface to address a specific issue caused by this interaction method: visual distraction inside the car. In this study, a Heads-Up Display (HUD) was integrated with a gesture-based menu navigation system so that drivers could see menu selections without looking away from the road. An experiment with 24 recruited participants investigated the system's potential to improve drivers' driving performance, situation awareness, and gesture interaction. Participants provided subjective feedback about using the system as well as objective performance data. This thesis found that the HUD significantly outperformed the Heads-Down Display (HDD) in participants' preference, perceived workload, level 1 situation awareness, and secondary-task performance. However, these gains came at the cost of poorer driving performance and relatively longer visual distraction. This thesis provides directions for future research on improving the overall user experience while drivers interact with in-vehicle gesture interaction systems.
dc.description.abstractgeneral: Driving is one of the most common daily activities. Until fully autonomous vehicles arrive, driving will remain the primary task when operating a vehicle. To improve the overall travel experience, however, drivers also perform secondary tasks such as adjusting the AC, switching music, navigating a map, and using other functions. Car accidents can happen while drivers perform secondary tasks, because those tasks distract from the primary task of driving safely. Many novel interaction methods have been implemented in modern cars, such as touch-screen interaction and voice interaction. This thesis introduces a new gesture interaction system that lets the user navigate secondary-task menus with mid-air gestures. To further reduce the visual distraction caused by the system, the gesture interaction system integrates a head-up display (HUD) that presents visual feedback on the front windshield, letting the driver use the system without looking in other directions and keeping peripheral vision on the road. The experiment recruited 24 participants to test the system. Each participant provided subjective feedback about workload, experience, and preference. A driving simulator was used to collect driving performance, eye-tracker glasses collected eye-gaze data, and the gesture menu system collected gesture-system performance. The experiment compared four conditions formed by two factors expected to affect the user experience: visual feedback type (HUD vs. Heads-Down Display) and sound feedback (with vs. without). Results showed that the HUD helped drivers perform the secondary task faster, understand the current situation better, and experience lower workload. Most participants preferred the HUD over the HDD. However, drivers made trade-offs when using the HUD: they focused on the HUD for more time while performing secondary tasks and showed poorer driving performance. By analyzing the resulting data, this thesis provides directions for conducting HUD and in-vehicle gesture interaction research and for improving users' performance and overall experience.
dc.description.degree: M.S.
dc.format.medium: ETD
dc.format.mimetype: application/pdf
dc.identifier.uri: http://hdl.handle.net/10919/115121
dc.language.iso: en
dc.publisher: Virginia Tech
dc.rights: CC0 1.0 Universal
dc.rights.uri: http://creativecommons.org/publicdomain/zero/1.0/
dc.subject: Gesture interaction
dc.subject: In vehicle
dc.subject: Auditory Display
dc.subject: Heads up display
dc.title: Increase Driving Situation Awareness and In-vehicle Gesture-based Menu Navigation Accuracy with Heads-Up Display
dc.type: Thesis
thesis.degree.discipline: Computer Science
thesis.degree.grantor: Virginia Polytechnic Institute and State University
thesis.degree.level: masters
thesis.degree.name: M.S.

Files

Original bundle
- Name: Cao_Y_T_2023.pdf
  Size: 15.85 MB
  Format: Adobe Portable Document Format
- Name: Cao_Y_VT-IRB-19-923 Approval Letter_2023.pdf
  Size: 73.69 KB
  Format: Adobe Portable Document Format

License bundle
- Name: license.txt
  Size: 1.5 KB
  Format: Item-specific license agreed upon to submission