Beyond LiDAR for Unmanned Aerial Event-Based Localization in GPS Denied Environments

Date
2021-06-23
Publisher
Virginia Tech
Abstract

Finding lost persons, collecting information in disrupted communities, and efficiently traversing urban areas after a blast or similar catastrophic event have motivated researchers to develop intelligent sensor frameworks that aid law enforcement, first responders, and military personnel with situational awareness. This dissertation consists of a two-part framework for providing situational awareness using both acoustic ground sensors and aerial sensing modalities. Data-driven detection and classification approaches for ground sensors typically rely on computationally expensive inputs such as images or video [6, 91]. An acoustic signal, however, offers several advantages, such as low computational cost and the ability to classify occluded events, including gunshots or explosions. Once an event is identified, responding to it in real time in an urban area is difficult for an Unmanned Aerial Vehicle (UAV), especially when GPS is unreliable due to coverage blackouts and/or signal degradation [10].

Furthermore, if multiple static, in-situ intelligent acoustic autonomous sensors can be deployed to identify anomalous sounds in context, those sensors can cue an autonomous UAV that navigates a GPS-denied urban environment to investigate the event. This pairing could provide the time-critical, precisely localized information needed for life-saving decision-making.

Thus, implementing a complete intelligent sensor framework requires both intelligent static ground acoustic autonomous unattended sensors (AAUS) and improved localization under GPS degradation, for applications such as anomaly detection, public safety, and intelligence, surveillance, and reconnaissance (ISR) operations. Distributed AAUS networks could provide end users with near-real-time actionable information across large urban environments with limited resources. Complete ISR mission profiles require a UAV to fly, at least for part of a mission, in GPS-challenged or GPS-denied environments such as natural or urban canyons.

This dissertation addresses 1) the development of an intelligent sensor framework through a static ground AAUS that applies machine learning to audio feature classification, and 2) GPS-impaired localization through a formal framework for trajectory-based flight navigation for unmanned aircraft systems (UAS) operating beyond visual line of sight (BVLOS) in low-altitude urban airspace. Our AAUS method performs monophonic sound event detection, in which the sensor detects, records, and classifies each event using supervised machine learning techniques [90]. We propose a simulation framework to improve localization performance in GPS-denied environments, built on a new representation of 3D geospatial data as planar features that efficiently captures the information required for sensor-based navigation in obstacle-rich environments. The results of this dissertation would impact both military and civilian research by enabling systems that react to events and navigate in urban environments.
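To make the two methods above concrete, the following minimal Python sketches illustrate (a) supervised classification of a monophonic sound event and (b) extraction of planar features from a 3D point cloud. The specific features (MFCCs), classifier (an SVM), libraries (librosa, scikit-learn, Open3D), class labels, and thresholds are illustrative assumptions, not the exact pipeline developed in this dissertation.

    # Hedged sketch (assumed pipeline): time-averaged MFCC embeddings + an SVM classifier.
    import numpy as np
    import librosa
    from sklearn.svm import SVC

    CLASSES = ["gunshot", "explosion", "background"]   # hypothetical event labels

    def embed(clip, sr, n_mfcc=20):
        # Summarize a mono clip as its time-averaged MFCC vector.
        return librosa.feature.mfcc(y=clip, sr=sr, n_mfcc=n_mfcc).mean(axis=1)

    # Random clips stand in for labeled field recordings from the AAUS.
    sr, rng = 16000, np.random.default_rng(0)
    X = np.stack([embed(rng.standard_normal(sr).astype(np.float32), sr)
                  for _ in range(30)])
    y = rng.integers(0, len(CLASSES), size=30)

    clf = SVC(kernel="rbf").fit(X, y)                  # supervised training
    print(CLASSES[clf.predict(X[:1])[0]])              # classify a detected event

    # Hedged sketch (assumed representation): RANSAC plane segmentation with Open3D.
    import numpy as np
    import open3d as o3d

    # Synthetic cloud: a noisy ground plane standing in for LiDAR-style map data.
    pts = np.column_stack([np.random.uniform(-5, 5, (2000, 2)),
                           0.02 * np.random.randn(2000)])
    cloud = o3d.geometry.PointCloud()
    cloud.points = o3d.utility.Vector3dVector(pts)

    planes, remaining = [], cloud
    for _ in range(3):                                 # extract dominant planes
        if len(remaining.points) < 100:
            break
        model, inliers = remaining.segment_plane(distance_threshold=0.05,
                                                 ransac_n=3, num_iterations=500)
        planes.append(model)                           # [a, b, c, d] per plane
        remaining = remaining.select_by_index(inliers, invert=True)
    print(f"{len(planes)} planar features extracted")

In a framework like the one described above, such plane parameters could serve as a compact 3D map against which onboard sensor measurements are matched; the distance threshold and loop bound here are placeholders.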

Keywords
Acoustic Classification, Event Monitoring, GPS-Denied Navigation, Drone Aircraft, Autonomous Navigation, Localization, Planar Features, 3D Maps, LiDAR