National Surface Transportation Safety Center for Excellence Reports (NSTSCE, VTTI)
Permanent URI for this collection
http://www.vtti.vt.edu/national/nstsce/
Browsing National Surface Transportation Safety Center for Excellence Reports (NSTSCE, VTTI) by Subject "advanced driver assistance systems (ADAS)"
Now showing 1 - 2 of 2
- Driver Visual Behavior While Using Adaptive Cruise Control on Commercial Motor Vehicles. Grove, Kevin; Soccolich, Susan A.; Hanowski, Richard J. (National Surface Transportation Safety Center for Excellence, 2019-03-25). This study examined whether commercial motor vehicle drivers spent less time looking at the roadway while cruise control was engaged. The trucks in the study were equipped with commercially available adaptive cruise control (ACC) systems, which use radar to regulate headway in addition to speed when following a lead vehicle. Three metrics were analyzed to assess drivers’ eye-glance behavior during periods of traditional cruise control usage, full ACC usage, and manual car-following: total eyes-off-road time (TEORT), durations of glances off-road, and number of glances off-road. Drivers were observed to spend less time looking at the forward roadway when cruise control was engaged, and less time looking at the roadway when ACC was engaged than when manually following a lead vehicle. This difference appears to be due to the truck drivers taking longer glances away from the roadway rather than more frequent ones. These differences are important for system designers to consider, as drivers are expected to maintain their attention on the roadway while using driver assistance technologies.
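The three glance metrics named in the abstract can be illustrated with a minimal sketch. This is not the study's actual data-reduction code; the glance-record format (a `(location, start_s, end_s)` tuple) and the function name are assumptions made here for illustration.

```python
# Illustrative sketch only: computing the three eye-glance metrics
# (TEORT, off-road glance durations, off-road glance count) from a
# hypothetical list of glance records, each (location, start_s, end_s).

def eye_glance_metrics(glances):
    """Summarize off-road glance behavior for one observation window."""
    # Any glance not directed at the forward roadway counts as off-road.
    off_road = [end - start for loc, start, end in glances if loc != "forward"]
    return {
        "teort_s": sum(off_road),              # total eyes-off-road time (s)
        "off_road_durations_s": off_road,      # individual glance durations
        "off_road_glance_count": len(off_road),
    }

# Example: four glances over a 10-second window, two away from the road.
sample = [
    ("forward", 0.0, 4.0),
    ("mirror",  4.0, 5.5),   # 1.5 s off-road
    ("forward", 5.5, 8.0),
    ("cluster", 8.0, 10.0),  # 2.0 s off-road
]
print(eye_glance_metrics(sample))
```

The abstract's finding maps onto these quantities directly: similar glance counts but larger individual durations would raise TEORT, which is the pattern reported for ACC versus manual car-following.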
- Traffic Sign Characteristics for Machine Vision Safety Benefits. Kassing, Andrew; Gibbons, Ronald B.; Li, Eric; Palmer, Matthew; Hamen, Johann; Medina, Alejandra (National Surface Transportation Safety Center for Excellence, 2024-07-03). Machine vision has become a central technology for the development of automated driving systems and advanced driver assistance systems. To support safe navigation, machine vision must be able to read and interpret roadway signs, which provide regulatory, warning, and guidance information for all road users. Complicating this task, transportation agencies use a large variety of signs, which can have significantly different shapes, sizes, contents, installation methods, and retroreflectivity levels. Additionally, many environmental factors, such as precipitation, fog, dew, and lighting, also affect the visibility and legibility of roadway signs. Understanding how environmental factors and sign conditions affect machine vision performance will be important for transportation agencies seeking to maximize the technology’s safety benefits. Research began with a literature review cataloguing current research concerning roadway signs and visual performance, vehicle vision systems, and sign significance for automated driving. Information and insight gained during the literature review informed the design and development of the data collection systems. Field data collection was then performed over the course of 3 months in late spring to early summer 2021. Simultaneously, sign data were harvested using Google Street View and mapped using ArcGIS. Data collected during the experimental trips were then reduced and prepared for analysis. Researchers conducted a thorough data analysis, examining sign location, viewing distance, sign color, font size, sun position, and illumination, to assess the impact of environmental and infrastructure factors on the legibility of sign characters.
Results showed that: blue and brown signage with white legend text provided the best chance of sign character legibility during the daytime; sign characters were easy to read during the day at all three experimental distances (200, 400, and 500 ft), with small characters becoming less legible as viewing distance increased; daytime legibility decreased as light levels decreased; sign images captured at nighttime illumination levels had poor legibility results; sign characters on overhead signage were found to be more legible and are expected to be identified at a higher rate by vehicle vision systems; and vehicle vision systems should use a high-quality camera capable of taking pictures at night without motion blur.