
Beyond LiDAR for Unmanned Aerial Event-Based Localization in GPS Denied Environments

dc.contributor.author: Mayalu Jr, Alfred Kulua
dc.contributor.committeechair: Kochersberger, Kevin B.
dc.contributor.committeemember: Williams, Ryan K.
dc.contributor.committeemember: Christie, Gordon A.
dc.contributor.committeemember: Asbeck, Alan T.
dc.contributor.committeemember: Ben-Tzvi, Pinhas
dc.contributor.committeemember: Abbott, A. Lynn
dc.contributor.department: Electrical and Computer Engineering
dc.date.accessioned: 2021-06-25T08:01:22Z
dc.date.available: 2021-06-25T08:01:22Z
dc.date.issued: 2021-06-23
dc.description.abstract: Finding lost persons, collecting information in disturbed communities, and efficiently traversing urban areas after a blast or similar catastrophic event have motivated researchers to develop intelligent sensor frameworks that aid law enforcement, first responders, and military personnel with situational awareness. This dissertation consists of a two-part framework for providing situational awareness using both acoustic ground sensors and aerial sensing modalities. Data-driven detection and classification approaches for ground sensors typically rely on computationally expensive inputs such as image- or video-based methods [6, 91]. An acoustic signal, however, offers several advantages, such as low computational cost and the ability to classify occluded events, including gunshots or explosions. Once an event is identified, responding to it in real time in an urban area is difficult for an Unmanned Aerial Vehicle (UAV), especially when GPS is unreliable due to coverage blackouts and/or GPS degradation [10]. Furthermore, if multiple in-situ static intelligent acoustic autonomous sensors can be deployed to identify anomalous sounds given context, then those sensors can communicate with an autonomous UAV that navigates a GPS-denied urban environment to investigate the event; this offers time-critical, precise, localized response information necessary for life-saving decision-making. Thus, implementing a complete intelligent sensor framework requires both intelligent static ground acoustic autonomous unattended sensors (AAUS) and improvements to GPS-degraded localization for applications such as anomaly detection, public safety, and intelligence, surveillance, and reconnaissance (ISR) operations. Distributed AAUS networks could provide end users with near real-time actionable information for large urban environments with limited resources. Complete ISR mission profiles require a UAV to fly in GPS-challenged or GPS-denied environments, such as natural or urban canyons, for at least part of a mission. This dissertation addresses 1) the development of an intelligent sensor framework through a static ground AAUS capable of machine learning for audio feature classification, and 2) GPS-impaired localization through a formal framework for trajectory-based flight navigation for unmanned aircraft systems (UAS) operating beyond visual line of sight (BVLOS) in low-altitude urban airspace. Our AAUS method uses monophonic sound event detection, in which the sensor detects, records, and classifies each event using supervised machine learning techniques [90]. We propose a simulated framework to enhance localization performance in GPS-denied environments by introducing a new representation of 3D geospatial data based on planar features, which efficiently captures the information required for sensor-based navigation in obstacle-rich environments. The results of this dissertation would impact both military and civilian areas of research by enabling systems that react to events and navigate in an urban environment.
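The abstract above describes monophonic sound event detection in which each recorded event is classified with supervised machine learning. As a rough illustration only, a minimal Python sketch might summarize each clip with MFCC features and train a standard classifier; the feature set, window lengths, file names, and choice of classifier here are assumptions for illustration, not the pipeline used in the dissertation.

```python
# Minimal illustrative sketch: supervised classification of acoustic events
# using MFCC summaries and an SVM. Hypothetical example, not the AAUS pipeline.
import numpy as np
import librosa
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def extract_features(wav_path, sr=16000, n_mfcc=20):
    """Load a mono clip and summarize it as the mean and std of its MFCCs."""
    y, sr = librosa.load(wav_path, sr=sr, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

def train_event_classifier(wav_paths, labels):
    """Fit a classifier on labeled clips (e.g., 'gunshot' vs. 'background')."""
    X = np.stack([extract_features(p) for p in wav_paths])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
    clf.fit(X, labels)
    return clf

# Usage with hypothetical file names:
# clf = train_event_classifier(["gunshot_01.wav", "ambient_01.wav"],
#                              ["gunshot", "background"])
# print(clf.predict([extract_features("unknown_event.wav")]))
```

A fixed-length feature summary per clip keeps the per-event computation small, which is the kind of low computational footprint the abstract attributes to acoustic sensing.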
dc.description.abstractgeneral: Emergency scenarios such as missing persons or catastrophic events in urban areas require first responders to gain situational awareness, motivating researchers to investigate intelligent sensor frameworks that use drones for observation. This prompts two questions: How can responders detect and classify acoustic anomalies using unattended sensors? And how can they remotely navigate drones in GPS-denied urban environments to investigate such events? This dissertation addresses the first question through the development of intelligent wireless sensor network (WSN) systems that can provide time-critical, precise, localized environmental information necessary for decision-making. At Virginia Tech, we have developed a static ground Acoustic Autonomous Unattended Sensor (AAUS) capable of machine learning for audio feature classification. Prior work on intelligent AAUS and network architectures does not account for network failure, jamming, or remote scenarios in which cellular data or WiFi coverage is unavailable [78, 90]. The lack of a framework for such scenarios exposes a vulnerability in the operational integrity of proposed homeland security solutions. We address this through data ferrying, a communication method in which a mobile node, such as a drone, physically carries data as it moves through the environment to communicate with sensor nodes on the ground. For the second question of navigation and investigation, safety concerns arise for drones in urban areas because GPS signal loss is one of the first problems a drone encounters when flying into a city (such as New York City). When this happens, crashes, injury, and damage to property become likely because the drone no longer knows where it is in space. In these GPS-denied situations, traditional methods use point clouds (sets of data points in space (X, Y, Z) representing a 3D object [107]) constructed from laser radar (LiDAR) scanners, conceptually similar to the depth sensor in a Microsoft Xbox Kinect, to localize the drone. The main drawbacks of such methods are the accumulation of error and the computational complexity of large datasets such as a model of New York City. An advantage of cities is that their surfaces are largely planar; if a building can be represented by a plane instead of 10,000 points, the data volume shrinks dramatically and algorithm performance improves. This dissertation addresses both needs of an intelligent sensor framework: the development of a static ground AAUS capable of machine learning for audio feature classification, and GPS-impaired localization through a formal framework for trajectory-based flight navigation for UAS operating BVLOS in low-altitude urban and suburban environments.
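The general abstract argues that replacing thousands of point-cloud samples on a building facade with a single plane greatly reduces data and improves algorithm performance. A minimal sketch of that idea, assuming a generic RANSAC plane fit over a NumPy point cloud rather than the dissertation's specific planar-feature representation, might look like this:

```python
# Illustrative sketch: extract the dominant plane from a noisy point cloud
# so that one (normal, offset) pair summarizes thousands of (X, Y, Z) points.
# Generic RANSAC example, not the dissertation's actual method.
import numpy as np

def fit_plane(points):
    """Least-squares plane through points: unit normal n and offset d, with n.p + d = 0."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]                      # direction of least variance = plane normal
    return n, -n.dot(centroid)

def ransac_plane(points, n_iters=200, threshold=0.05, seed=0):
    """Repeatedly fit a plane to 3 random points, keep the one with most inliers, refit."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n, d = fit_plane(sample)
        inliers = np.abs(points @ n + d) < threshold
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return fit_plane(points[best_inliers]), best_inliers

# Usage with a synthetic "wall" of 10,000 noisy points near the plane z = 0:
# pts = np.c_[np.random.rand(10000, 2) * 20, np.random.randn(10000) * 0.02]
# (n, d), inliers = ransac_plane(pts)   # 4 numbers now stand in for ~10,000 points
```

The payoff is exactly the data reduction the abstract describes: a plane is four numbers, so downstream matching and localization work on a handful of planar features instead of the full point cloud.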
dc.description.degree: Doctor of Philosophy
dc.format.medium: ETD
dc.identifier.other: vt_gsexam:31759
dc.identifier.uri: http://hdl.handle.net/10919/104024
dc.publisher: Virginia Tech
dc.rights: In Copyright
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject: Acoustic Classification
dc.subject: Event Monitoring
dc.subject: GPS-Denied Navigation
dc.subject: Drone aircraft
dc.subject: autonomous navigation
dc.subject: Localization
dc.subject: Planar Features
dc.subject: 3D Maps
dc.subject: LIDAR
dc.title: Beyond LiDAR for Unmanned Aerial Event-Based Localization in GPS Denied Environments
dc.type: Dissertation
thesis.degree.discipline: Computer Engineering
thesis.degree.grantor: Virginia Polytechnic Institute and State University
thesis.degree.level: doctoral
thesis.degree.name: Doctor of Philosophy

Files

Original bundle
Name: Mayalu_AK_D_2021.pdf
Size: 13.24 MB
Format: Adobe Portable Document Format