Vision Based Localization of Drones in a GPS Denied Environment

dc.contributor.author: Chadha, Abhimanyu
dc.contributor.committeechair: Williams, Ryan K.
dc.contributor.committeemember: Abbott, A. Lynn
dc.contributor.committeemember: Huang, Jia-Bin
dc.contributor.department: Electrical and Computer Engineering
dc.date.accessioned: 2020-09-02T08:00:35Z
dc.date.available: 2020-09-02T08:00:35Z
dc.date.issued: 2020-09-01
dc.description.abstract: In this thesis, we build a robust end-to-end pipeline for the localization of multiple drones in a GPS-denied environment. Such a pipeline supports cooperative formation control, autonomous delivery, and search and rescue operations. To achieve this, we integrate a custom-trained YOLO (You Only Look Once) object detection network for drones with the ZED2 stereo camera system. From this sensor we obtain a relative vector from the left camera to each detected drone, which we calibrate to that drone's center of mass. We then estimate the location of all the drones in the leader drone's frame of reference by solving the localization problem with least squares estimation, thereby acquiring the follower drones' positions in the leader drone's frame. We present results with the stereo camera system, followed by simulations in AirSim to verify the precision of our pipeline.
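The least-squares localization step the abstract describes can be sketched as follows. This is a minimal illustration, not the thesis's actual implementation: the function name, the direct-position measurement model (identity observation matrix), and the calibration offset are all assumptions for the sake of the example.

```python
import numpy as np

def localize_follower(measurements, cam_to_com_offset):
    """Estimate one follower drone's position in the leader's frame by
    least squares over several noisy relative-position measurements.

    measurements: (N, 3) relative vectors from the leader's left camera
        to the detected drone (e.g. from stereo depth on a YOLO box).
    cam_to_com_offset: (3,) calibration offset from the detection point
        to the drone's center of mass.
    """
    z = np.asarray(measurements) + cam_to_com_offset  # calibrated measurements
    # Each measurement gives I @ x ~= z_i; stack them into A x = b
    # and solve the overdetermined system in the least-squares sense.
    n = len(z)
    A = np.tile(np.eye(3), (n, 1))   # (3N, 3) stacked identity blocks
    b = z.reshape(-1)                # (3N,) stacked measurements
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Hypothetical noisy measurements of one follower ~2 m ahead of the leader
meas = np.array([[2.0,  0.1, -0.05],
                 [1.9,  0.0,  0.00],
                 [2.1, -0.1,  0.05]])
est = localize_follower(meas, cam_to_com_offset=np.zeros(3))
# est is approximately [2.0, 0.0, 0.0]
```

With this direct-position model the least-squares solution reduces to the mean of the calibrated measurements; the value of the formulation is that the same `A x = b` structure extends to richer observation models (e.g. per-camera extrinsics or multiple observers) without changing the solver.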
dc.description.abstractgeneral: In recent years, technologies such as deep learning and machine learning have developed rapidly. This has led to the rise of autonomous drones and their application in areas such as bridge inspection, search and rescue, disaster relief, agriculture, and real estate. Since GPS can be unreliable, we need an alternate method to localize drones in various environments in real time. In this thesis, we integrate a robust drone-detection neural network with a camera that estimates each drone's location. We then use this data to obtain the relative locations of all follower drones with respect to the leader drone. We run experiments with the camera and in a simulator to show the accuracy of our results.
dc.description.degree: Master of Science
dc.format.medium: ETD
dc.identifier.other: vt_gsexam:27395
dc.identifier.uri: http://hdl.handle.net/10919/99887
dc.publisher: Virginia Tech
dc.rights: In Copyright
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject: Autonomous UAVs
dc.subject: Stereovision
dc.subject: Localization
dc.title: Vision Based Localization of Drones in a GPS Denied Environment
dc.type: Thesis
thesis.degree.discipline: Computer Engineering
thesis.degree.grantor: Virginia Polytechnic Institute and State University
thesis.degree.level: masters
thesis.degree.name: Master of Science
Files
Original bundle (1 of 1)
Name: Chadha_A_T_2020.pdf
Size: 14.92 MB
Format: Adobe Portable Document Format