Vision-Based Localization of Drones in a GPS-Denied Environment
In this thesis, we build a robust end-to-end pipeline for localizing multiple drones in a GPS-denied environment. Such a pipeline supports cooperative formation control, autonomous delivery, search-and-rescue operations, and similar applications. To achieve this, we integrate a YOLO (You Only Look Once) object-detection network, custom-trained to detect drones, with the ZED 2 stereo camera system. For each detection, this sensor yields a relative vector from the left camera to the detected drone. After calibrating this vector so that it points to the drone's center of mass, we estimate the positions of all drones in the leader drone's frame of reference by solving the localization problem with least-squares estimation, thereby obtaining the follower drones' locations in the leader's frame. We present results from the stereo camera system, followed by simulations in AirSim that verify the precision of our pipeline.
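One way the least-squares localization step can be sketched is as ray triangulation: each observation gives a bearing from a known camera position toward the drone, and the estimated position minimizes the summed squared perpendicular distance to all rays. This is a minimal illustrative sketch, not the thesis's exact formulation; the function name `triangulate_least_squares` and the bearing-vector setup are assumptions.

```python
import numpy as np

def triangulate_least_squares(cam_positions, bearings):
    """Estimate a drone's 3D position from several bearing observations.

    Each observation i is a ray from cam_positions[i] along unit vector
    bearings[i]. The least-squares point x minimizes
        sum_i || (I - d_i d_i^T)(x - p_i) ||^2,
    i.e. the total squared perpendicular distance to the rays.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(cam_positions, bearings):
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)            # normalize the bearing
        M = np.eye(3) - np.outer(d, d)       # projector orthogonal to the ray
        A += M
        b += M @ np.asarray(p, dtype=float)
    # A is invertible as long as the rays are not all parallel
    return np.linalg.solve(A, b)

# Two camera poses observing the same drone at [1, 2, 3]:
p1, p2 = np.array([0.0, 0.0, 0.0]), np.array([5.0, 0.0, 0.0])
est = triangulate_least_squares([p1, p2],
                                [np.array([1.0, 2.0, 3.0]) - p1,
                                 np.array([1.0, 2.0, 3.0]) - p2])
```

With noise-free, intersecting rays the solver recovers the true position exactly; with noisy stereo measurements it returns the least-squares compromise point.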