Vision Based Localization of Drones in a GPS Denied Environment
dc.contributor.author | Chadha, Abhimanyu | en |
dc.contributor.committeechair | Williams, Ryan K. | en |
dc.contributor.committeemember | Abbott, A. Lynn | en |
dc.contributor.committeemember | Huang, Jia-Bin | en |
dc.contributor.department | Electrical and Computer Engineering | en |
dc.date.accessioned | 2020-09-02T08:00:35Z | en |
dc.date.available | 2020-09-02T08:00:35Z | en |
dc.date.issued | 2020-09-01 | en |
dc.description.abstract | In this thesis, we build a robust end-to-end pipeline for localizing multiple drones in a GPS-denied environment. Such a pipeline supports cooperative formation control, autonomous delivery, and search and rescue operations. To achieve this, we integrate a YOLO (You Only Look Once) object detection network, custom-trained to detect drones, with the ZED2 stereo camera system. From this sensor we obtain a relative vector from the left camera to each detected drone, which we then calibrate to point at that drone's center of mass. We estimate the locations of all the drones in the leader drone's frame of reference by solving the localization problem with least-squares estimation, thereby acquiring the follower drones' positions in the leader's frame. We present results with the stereo camera system, followed by simulations run in AirSim to verify the precision of our pipeline. | en |
dc.description.abstractgeneral | In recent years, technologies such as deep learning and machine learning have developed rapidly. This has led to the rise of autonomous drones and their application in areas such as bridge inspection, search and rescue operations, disaster relief, agriculture, and real estate. Since GPS can be unreliable, an alternative method is needed to localize drones in various environments in real time. In this thesis, we integrate a robust drone-detection neural network with a camera that estimates each detected drone's location. We then use this data to obtain the relative locations of all follower drones with respect to the leader drone. We run experiments with the camera and in a simulator to show the accuracy of our results. | en |
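The abstract describes estimating each follower's position in the leader's frame by least-squares over relative-vector measurements. A minimal sketch of that idea, assuming (hypothetically) that several noisy relative vectors to the same follower are available and that each measurement directly observes the follower's position:

```python
import numpy as np

# Hypothetical sketch: estimate one follower drone's position in the
# leader's frame from several noisy relative-vector measurements.
# Each measurement m_i satisfies I * p ≈ m_i, so stacking identities
# gives an overdetermined linear system solved by least squares.

rng = np.random.default_rng(0)
p_true = np.array([2.0, -1.0, 3.0])                           # ground truth (metres), made up
measurements = p_true + 0.05 * rng.standard_normal((10, 3))   # noisy relative vectors

# Stack the system A p = b: each measurement contributes one 3x3 identity block
A = np.tile(np.eye(3), (len(measurements), 1))
b = measurements.reshape(-1)

p_est, *_ = np.linalg.lstsq(A, b, rcond=None)
print(p_est)  # close to p_true
```

For this simple observation model the least-squares solution reduces to the mean of the measurements; the same stacked-system formulation extends naturally to richer models (e.g. per-measurement weights or calibration offsets), which is presumably closer to what the thesis implements.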
dc.description.degree | Master of Science | en |
dc.format.medium | ETD | en |
dc.identifier.other | vt_gsexam:27395 | en |
dc.identifier.uri | http://hdl.handle.net/10919/99887 | en |
dc.publisher | Virginia Tech | en |
dc.rights | In Copyright | en |
dc.rights.uri | http://rightsstatements.org/vocab/InC/1.0/ | en |
dc.subject | Autonomous UAVs | en |
dc.subject | Stereovision | en |
dc.subject | Localization | en |
dc.title | Vision Based Localization of Drones in a GPS Denied Environment | en |
dc.type | Thesis | en |
thesis.degree.discipline | Computer Engineering | en |
thesis.degree.grantor | Virginia Polytechnic Institute and State University | en |
thesis.degree.level | masters | en |
thesis.degree.name | Master of Science | en |