Rangefinding in Fire Smoke Environments

Date

2016-01-07

Publisher

Virginia Tech

Abstract

The field of robotics has advanced to the point where robots are being developed for use in fire environments to perform firefighting tasks. These environments contain varying levels of fire and smoke, both of which obstruct robotic perception sensors. To use robots effectively in fire environments, perception in the presence of smoke and fire must be addressed. The goal of this research was to address that problem, specifically rangefinding, in fire smoke environments.

A series of tests was performed in fire-smoke-filled environments to evaluate the performance of different commercial rangefinders and cameras, as well as a long-wavelength infrared (LWIR) stereo vision system developed in this research. The smoke was varied from dense, low-temperature smoke to light, high-temperature smoke to cover a range of conditions. In small-scale experiments on eleven different sensors, radar and LWIR cameras outperformed the other perception sensors in both smoke environments. An LWIR stereo vision system was developed for rangefinding and compared to radar, LIDAR, and visual stereo vision in large-scale testing, demonstrating the ability of LWIR stereo vision to rangefind in dense smoke where LIDAR and visual stereo vision fail.
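
The abstract does not give the stereo geometry used, but the rangefinding principle behind both the visual and LWIR stereo systems is triangulation of range from disparity, Z = f * B / d. A minimal Python sketch follows; the focal length and baseline values are assumed for illustration only and are not taken from the dissertation.

import numpy as np

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    # Triangulate range from stereo disparity: Z = f * B / d.
    # focal_length_px and baseline_m are assumed values, not from the abstract.
    disparity = np.asarray(disparity_px, dtype=float)
    depth = np.full(disparity.shape, np.nan)   # NaN where no range is available
    valid = disparity > 0                      # zero disparity means no stereo match
    depth[valid] = focal_length_px * baseline_m / disparity[valid]
    return depth

# Hypothetical example: 640x480 LWIR image pair, f = 500 px, baseline = 0.30 m
disparity = np.random.uniform(1.0, 64.0, size=(480, 640))
ranges_m = depth_from_disparity(disparity, focal_length_px=500.0, baseline_m=0.30)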

LWIR stereo vision was further developed for improved rangefinding in fire environments. Intensity misalignment between cameras and stereo image filtering were addressed quantitatively. Tests were performed with approximately isothermal scenes and thermally diverse scenes to select subsystem methods. In addition, the effects of image filtering on feature distortion were assessed. Rangefinding improvements were quantified with comparisons to ground truth data.
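
The specific intensity-correction method is not stated in the abstract; one common way to compensate for intensity misalignment between two cameras before stereo matching is to remap one image's intensity histogram onto the other's, so that correspondence costs compare comparable values. A hedged Python sketch of that idea:

import numpy as np

def match_histogram(source, reference):
    # Remap 'source' intensities so their distribution matches 'reference'.
    # This is a generic radiometric alignment step, not necessarily the
    # correction the dissertation actually used.
    src = np.asarray(source).ravel()
    ref = np.asarray(reference).ravel()

    src_vals, src_idx, src_counts = np.unique(src, return_inverse=True, return_counts=True)
    ref_vals, ref_counts = np.unique(ref, return_counts=True)

    # Cumulative distributions of both images
    src_cdf = np.cumsum(src_counts).astype(float) / src.size
    ref_cdf = np.cumsum(ref_counts).astype(float) / ref.size

    # For each source intensity, find the reference intensity at the same CDF value
    mapped_vals = np.interp(src_cdf, ref_cdf, ref_vals)
    return mapped_vals[src_idx].reshape(np.asarray(source).shape)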

Improved perception across varying levels of smoke, from clear conditions to dense smoke, was developed through sensor fusion of LWIR stereo vision and a spinning LIDAR. The data were fused in a multi-resolution 3D voxel domain using evidential theory to model occupied and free space states. A heuristic method was presented to separate significantly attenuated LIDAR returns from low-attenuation returns. Sensor models were developed for both return types and for LWIR stereo vision. The fusion system was tested in a range of conditions to demonstrate improved performance over the use of either sensor alone in fire environments.
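
Evidential (Dempster-Shafer) fusion of occupied and free space evidence can be sketched per voxel as below; the mass assignments, key names, and example numbers are illustrative and do not reproduce the dissertation's actual sensor models or multi-resolution bookkeeping.

def combine_masses(m1, m2):
    # Dempster's rule of combination over the frame {occupied, free}.
    # Each mass function is a dict with keys 'occ', 'free', 'unknown'
    # (mass on {O}, {F}, and {O, F}) that sums to 1.
    conflict = m1['occ'] * m2['free'] + m1['free'] * m2['occ']
    k = 1.0 - conflict
    if k <= 0.0:
        raise ValueError("Total conflict; masses cannot be combined")

    occ = (m1['occ'] * m2['occ'] + m1['occ'] * m2['unknown'] + m1['unknown'] * m2['occ']) / k
    free = (m1['free'] * m2['free'] + m1['free'] * m2['unknown'] + m1['unknown'] * m2['free']) / k
    unknown = (m1['unknown'] * m2['unknown']) / k
    return {'occ': occ, 'free': free, 'unknown': unknown}

# Hypothetical example: a heavily attenuated LIDAR return (weak evidence)
# fused with an LWIR stereo measurement of the same voxel.
lidar = {'occ': 0.2, 'free': 0.1, 'unknown': 0.7}
stereo = {'occ': 0.6, 'free': 0.1, 'unknown': 0.3}
print(combine_masses(lidar, stereo))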

Keywords

Long-wavelength infrared, thermal infrared, stereo vision, fire, smoke, sensor fusion, LIDAR
