Deep Transfer Learning for Vulnerable Road Users Detection using Smartphone Sensors Data

dc.contributor.author: Elhenawy, Mohammed
dc.contributor.author: Ashqar, Huthaifa I.
dc.contributor.author: Masoud, Mahmoud
dc.contributor.author: Almannaa, Mohammed H.
dc.contributor.author: Rakotonirainy, Andry
dc.contributor.author: Rakha, Hesham A.
dc.contributor.department: Civil and Environmental Engineering
dc.date.accessioned: 2020-10-27T13:18:15Z
dc.date.available: 2020-10-27T13:18:15Z
dc.date.issued: 2020-10-25
dc.date.updated: 2020-10-26T14:25:37Z
dc.description.abstract: As the Autonomous Vehicle (AV) industry rapidly advances, the classification of non-motorized (vulnerable) road users (VRUs) becomes essential to ensure their safety and the smooth operation of road applications. The typical practice of non-motorized road user classification usually requires significant training time and ignores the temporal evolution and behavior of the signal. In this research effort, we attempt to detect VRUs with high accuracy by proposing a novel framework that uses Deep Transfer Learning, which saves training time and cost, to classify images constructed from Recurrence Quantification Analysis (RQA) that reflect the temporal dynamics and behavior of the signal. Recurrence Plots (RPs) were constructed from low-power smartphone sensors without using GPS data. The resulting RPs were used as inputs to different pre-trained Convolutional Neural Network (CNN) classifiers: 227 × 227 images for AlexNet and SqueezeNet, and 224 × 224 images for VGG16 and VGG19. Results show that the classification accuracy of Convolutional Neural Network Transfer Learning (CNN-TL) reaches 98.70%, 98.62%, 98.71%, and 98.71% for AlexNet, SqueezeNet, VGG16, and VGG19, respectively. Moreover, we trained ResNet101 and ShuffleNet for a very short time using one epoch of data and then used them as weak learners, which yielded 98.49% classification accuracy. The results of the proposed framework outperform other results in the literature (to the best of our knowledge) and show that using CNN-TL is promising for VRU classification. Because of its relative straightforwardness, ability to be generalized and transferred, and potentially high accuracy, we anticipate that this framework can help solve various problems related to signal classification.
dc.description.version: Published version
dc.format.mimetype: application/pdf
dc.identifier.citation: Elhenawy, M.; Ashqar, H.I.; Masoud, M.; Almannaa, M.H.; Rakotonirainy, A.; Rakha, H.A. Deep Transfer Learning for Vulnerable Road Users Detection using Smartphone Sensors Data. Remote Sens. 2020, 12, 3508.
dc.identifier.doi: https://doi.org/10.3390/rs12213508
dc.identifier.uri: http://hdl.handle.net/10919/100718
dc.language.iso: en
dc.publisher: MDPI
dc.rights: Creative Commons Attribution 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject: transportation mode classification
dc.subject: vulnerable road users
dc.subject: recurrence plots
dc.subject: computer vision
dc.subject: image classification system
dc.title: Deep Transfer Learning for Vulnerable Road Users Detection using Smartphone Sensors Data
dc.title.serial: Remote Sensing
dc.type: Article - Refereed
dc.type.dcmitype: Text
dc.type.dcmitype: StillImage

Files

Original bundle
Name: remotesensing-12-03508-v2.pdf
Size: 2.3 MB
Format: Adobe Portable Document Format

License bundle
Name: license.txt
Size: 0 B
Format: Item-specific license agreed upon submission
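The recurrence-plot construction described in the abstract can be sketched in a few lines: a 1-D sensor signal is embedded into state vectors via time delays, and a binary matrix marks pairs of states that fall within a threshold distance. This is a minimal illustration under assumed defaults, not the paper's implementation; the embedding dimension (`dim`), delay, and threshold (`eps`) used here are hypothetical, since the record does not give the paper's RQA parameters.

```python
import numpy as np

def recurrence_plot(signal, eps=None, dim=3, delay=1):
    """Binary recurrence plot of a 1-D signal via time-delay embedding.

    eps defaults to 10% of the maximum pairwise distance; dim and delay
    are illustrative values, not the paper's tuned parameters.
    """
    n = len(signal) - (dim - 1) * delay
    # Time-delay embedding: each row is a reconstructed state vector.
    states = np.column_stack(
        [signal[i * delay : i * delay + n] for i in range(dim)]
    )
    # Pairwise Euclidean distances between state vectors.
    d = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
    if eps is None:
        eps = 0.1 * d.max()
    return (d <= eps).astype(np.uint8)

# Example: a synthetic accelerometer-like trace (stand-in for real data).
rng = np.random.default_rng(0)
sig = np.sin(np.linspace(0, 8 * np.pi, 200)) + 0.05 * rng.standard_normal(200)
rp = recurrence_plot(sig)
```

The resulting square binary matrix would then be rendered and rescaled to the input size of the chosen pretrained network, 227 × 227 for AlexNet and SqueezeNet or 224 × 224 for VGG16 and VGG19, before fine-tuning via transfer learning.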