Cooperative Perception in Autonomous Ground Vehicles using a Mobile Robot Testbed

Date

2017-10-03

Publisher

Virginia Tech

Abstract

For connected and autonomous vehicles, no standard or framework currently exists that defines the appropriate level of information sharing for cooperative autonomous driving. This thesis proposes Cooperative Perception among vehicles, in which every vehicle becomes a moving sensor platform capable of sharing information collected by its on-board sensors. Such sharing extends the line of sight and field of view of autonomous vehicles, which otherwise suffer from blind spots and occlusions. The resulting increase in situational awareness promotes safe driving at short range and improves traffic flow efficiency at long range.

This thesis proposes a methodology for short-range cooperative perception among autonomous vehicles. The problem is decomposed into the sub-tasks of cooperative relative localization and map merging. Cooperative relative localization is achieved using visual and inertial sensors: a computer-vision-based camera relative pose estimation technique, augmented with position information, provides a pose fix that is subsequently updated by dead reckoning with an inertial sensor. Prior to map merging, a technique for object localization with a monocular camera is proposed, based on Inverse Perspective Mapping. A mobile multi-robot testbed was developed to emulate autonomous vehicles, and the proposed method was implemented on the testbed to detect pedestrians and respond to the perceived hazard. Potential traffic scenarios in which cooperative perception could prove crucial were tested, and the results are presented in this thesis.
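The dead-reckoning update of a pose fix can be sketched as a planar inertial integration step. This is a minimal illustration, not the thesis's implementation: the state layout, the planar (2D) motion model, and all numeric values are assumptions for the example.

```python
import numpy as np

def dead_reckon(pose, accel, gyro_z, dt):
    """One planar dead-reckoning step from inertial measurements.

    pose = (x, y, theta, vx, vy): position, heading, and velocity in the
    world frame. accel is body-frame (ax, ay) in m/s^2, gyro_z the yaw
    rate in rad/s. The planar model and state layout are illustrative.
    """
    x, y, th, vx, vy = pose
    # Rotate body-frame acceleration into the world frame.
    c, s = np.cos(th), np.sin(th)
    ax_w = c * accel[0] - s * accel[1]
    ay_w = s * accel[0] + c * accel[1]
    # Integrate acceleration twice (position, then velocity);
    # integrate the gyro once for heading.
    x += vx * dt + 0.5 * ax_w * dt**2
    y += vy * dt + 0.5 * ay_w * dt**2
    vx += ax_w * dt
    vy += ay_w * dt
    th += gyro_z * dt
    return (x, y, th, vx, vy)

# Starting from a pose fix at the origin, heading along +x at 1 m/s,
# integrate 1 s of quiet IMU data at 100 Hz: the robot should have
# advanced about 1 m along x.
pose = (0.0, 0.0, 0.0, 1.0, 0.0)
for _ in range(100):
    pose = dead_reckon(pose, (0.0, 0.0), 0.0, 0.01)
```

Because inertial integration drifts, the periodic camera-based pose fix described above is what keeps this estimate bounded.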
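The Inverse Perspective Mapping step named above can be sketched as back-projecting a pixel onto a flat ground plane. This is a minimal sketch under assumptions of mine, not the thesis's method: a calibrated pinhole camera at known height, pitched down toward flat ground, with illustrative values for the intrinsic matrix `K`, `cam_height`, and `pitch`.

```python
import numpy as np

def ipm_ground_point(u, v, K, cam_height, pitch):
    """Project image pixel (u, v) onto the ground plane z = 0.

    Assumes a pinhole camera with intrinsics K, mounted cam_height
    metres above flat ground and pitched down by `pitch` radians,
    optical axis otherwise along the vehicle's forward (x) axis.
    Returns (x, y) in the vehicle frame, or None if the pixel's ray
    does not intersect the ground.
    """
    # Back-project the pixel to a ray in the camera (optical) frame:
    # x right, y down, z forward.
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    # Axis permutation from the optical frame to the vehicle frame
    # (x forward, y left, z up).
    R0 = np.array([[0.0, 0.0, 1.0],
                   [-1.0, 0.0, 0.0],
                   [0.0, -1.0, 0.0]])
    # Downward pitch: rotation about the vehicle y-axis.
    c, s = np.cos(pitch), np.sin(pitch)
    Rp = np.array([[c, 0.0, s],
                   [0.0, 1.0, 0.0],
                   [-s, 0.0, c]])
    d = Rp @ R0 @ ray_cam
    if d[2] >= 0:            # ray points at or above the horizon
        return None
    t = cam_height / -d[2]   # scale factor to reach z = 0
    return (t * d[0], t * d[1])

# Illustrative calibration: 0.2 m camera height, 0.3 rad downward pitch.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
far = ipm_ground_point(320, 240, K, 0.2, 0.3)   # principal point
near = ipm_ground_point(320, 400, K, 0.2, 0.3)  # lower in the image
```

Pixels lower in the image map to ground points closer to the robot, which is what lets a single monocular detection (e.g. a pedestrian's foot point) be placed on the shared map.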

Keywords

Autonomous Vehicles, Connected Vehicles, Cooperative Perception, Intelligent Vehicles
