Enhancing Perception Systems using V2V Sensor Fusion
Abstract
With the surge in popularity of autonomous vehicles, which depend on complex perception systems to make safety-critical judgments, it is necessary to test and improve these systems. One way to do so is through vehicle-to-vehicle (V2V) communication. The concept has existed for decades but was first standardized in 2010. Since then, there have been many hurdles to applying this technology: security, reliability, latency, and cost are the main reasons for slow growth in this space. Another major problem is the lack of compelling applications that make overcoming these limitations worthwhile for industry. Autonomous vehicles rely on a number of sensor types, the most common being cameras, radars, and LiDARs. The detections from these three sensors are fused into a track list that is used to plan and control the vehicle's movements. This thesis proposes a system that introduces data from vehicle-to-vehicle messages into this fused track list. The extra information can be beneficial when the onboard sensors are occluded or have low visibility. City and highway driving scenarios and software-in-the-loop testing are used to evaluate the proposed fused track list.
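The core idea, merging V2V-reported objects into the onboard fused track list, can be sketched roughly as follows. This is an illustrative simplification, not the thesis's actual implementation: the function names, the dictionary-based track representation, and the 2 m association gate are all assumptions made for the example. It transforms each V2V object from global coordinates into the ego vehicle's frame, gates it against existing tracks by distance, and appends unmatched objects as new V2V-only tracks, which is how objects occluded from the onboard sensors become visible to the planner.

```python
import math

# Illustrative sketch only: names, track format, and threshold are assumptions.
ASSOC_RADIUS_M = 2.0  # gate: a V2V object within 2 m of a track is "the same"

def to_ego_frame(obj, ego_pose):
    """Transform a V2V object's global (x, y) into the ego vehicle's frame."""
    ex, ey, eyaw = ego_pose
    dx, dy = obj["x"] - ex, obj["y"] - ey
    c, s = math.cos(-eyaw), math.sin(-eyaw)
    return (dx * c - dy * s, dx * s + dy * c)

def fuse_v2v(tracks, v2v_objects, ego_pose):
    """Append V2V objects that do not match any existing onboard track.

    tracks: list of dicts with ego-frame "x", "y" and a "source" tag.
    Matched V2V objects are treated as reinforcing an existing track
    (a no-op here); unmatched ones become new V2V-only tracks.
    """
    fused = list(tracks)
    for obj in v2v_objects:
        px, py = to_ego_frame(obj, ego_pose)
        matched = any(
            math.hypot(t["x"] - px, t["y"] - py) <= ASSOC_RADIUS_M
            for t in tracks
        )
        if not matched:
            fused.append({"x": px, "y": py, "source": "v2v"})
    return fused

# Example: one onboard track; one V2V object occluded from onboard sensors.
tracks = [{"x": 10.0, "y": 0.0, "source": "onboard"}]
v2v = [{"x": 30.0, "y": 0.0}]   # global coordinates
ego = (0.0, 0.0, 0.0)           # ego at origin, heading 0
result = fuse_v2v(tracks, v2v, ego)
print(len(result))  # 2: the occluded V2V object was added as a new track
```

In a full pipeline the matched case would update the existing track's state (for example via a Kalman filter) rather than being dropped, and the gate would account for sensor and GPS uncertainty instead of a fixed radius.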