Rangefinding in Fire Smoke Environments

dc.contributor.author: Starr, Joseph Wesley
dc.contributor.committeechair: Leonessa, Alexander
dc.contributor.committeechair: Lattimer, Brian Y.
dc.contributor.committeemember: Wicks, Alfred L.
dc.contributor.committeemember: Stilwell, Daniel J.
dc.contributor.committeemember: Lee, Daniel D.
dc.contributor.department: Mechanical Engineering
dc.date.accessioned: 2016-12-21T09:02:14Z
dc.date.available: 2016-12-21T09:02:14Z
dc.date.issued: 2016-01-07
dc.description.abstract: The field of robotics has advanced to the point where robots are being developed for use in fire environments to perform firefighting tasks. These environments contain varying levels of fire and smoke, both of which obstruct robotic perception sensors. To use robots effectively in fire environments, the problem of perception in the presence of smoke and fire must be addressed. The goal of this research was to address that perception problem, specifically rangefinding, in fire smoke environments. A series of tests was performed in smoke-filled fire environments to evaluate the performance of different commercial rangefinders and cameras as well as a long-wavelength infrared (LWIR) stereo vision system developed in this research. The smoke was varied from dense, low-temperature smoke to light, high-temperature smoke to cover a range of conditions. In small-scale experiments on eleven different sensors, radar and LWIR cameras outperformed the other perception sensors in both smoke environments. An LWIR stereo vision system was developed for rangefinding and compared to radar, LIDAR, and visual stereo vision in large-scale testing, demonstrating the ability of LWIR stereo vision to rangefind in dense smoke where LIDAR and visual stereo vision fail. LWIR stereo vision was further developed for improved rangefinding in fire environments. Intensity misalignment between cameras and stereo image filtering were addressed quantitatively. Tests were performed with approximately isothermal scenes and thermally diverse scenes to select subsystem methods, and the effects of image filtering on feature distortion were assessed. Rangefinding improvements were quantified through comparisons with ground truth data. Improved perception across varying levels of clear and smoke conditions was achieved through sensor fusion of LWIR stereo vision and a spinning LIDAR. The data were fused in a multi-resolution 3D voxel domain using evidential theory to model occupied and free space states. A heuristic method was presented to separate significantly attenuated LIDAR returns from low-attenuation returns, and sensor models were developed for both return types and for LWIR stereo vision. The fusion system was tested in a range of conditions to demonstrate improved performance over individual sensor use in fire environments.
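As a minimal illustration of two techniques named in the abstract, the Python sketch below shows the standard pinhole stereo disparity-to-range relation and a generic Dempster-Shafer (evidential) combination of occupied/free/unknown masses for a single voxel. The function names, mass values, and example numbers are hypothetical assumptions for illustration only; they are not the dissertation's implementation or its sensor models.

# Illustrative sketch only (hypothetical names and values).

def stereo_range(focal_px, baseline_m, disparity_px):
    """Standard pinhole stereo relation: range = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

def ds_combine(m1, m2):
    """Combine two mass assignments over {occ, free, unknown} with
    Dempster's rule; 'unk' is the ignorance mass on {occ, free}."""
    occ = m1["occ"] * m2["occ"] + m1["occ"] * m2["unk"] + m1["unk"] * m2["occ"]
    free = m1["free"] * m2["free"] + m1["free"] * m2["unk"] + m1["unk"] * m2["free"]
    unk = m1["unk"] * m2["unk"]
    conflict = m1["occ"] * m2["free"] + m1["free"] * m2["occ"]
    k = 1.0 - conflict  # normalization constant
    if k <= 0:
        # Total conflict between the sources: fall back to full ignorance.
        return {"occ": 0.0, "free": 0.0, "unk": 1.0}
    return {"occ": occ / k, "free": free / k, "unk": unk / k}

# Hypothetical example: a LIDAR return and an LWIR-stereo measurement voting on one voxel.
lidar_mass = {"occ": 0.6, "free": 0.1, "unk": 0.3}
stereo_mass = {"occ": 0.4, "free": 0.1, "unk": 0.5}
print(ds_combine(lidar_mass, stereo_mass))  # -> occ ~0.73, free ~0.10, unk ~0.17

In this reading, each sensor contributes a mass assignment per voxel, the unknown mass carries attenuation-related uncertainty, and repeated combination accumulates evidence for occupied versus free space.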
dc.description.degree: Ph. D.
dc.format.medium: ETD
dc.identifier.other: vt_gsexam:7005
dc.identifier.uri: http://hdl.handle.net/10919/73780
dc.publisher: Virginia Tech
dc.rights: In Copyright
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject: Long-wavelength infrared
dc.subject: thermal infrared
dc.subject: stereo vision
dc.subject: fire
dc.subject: smoke
dc.subject: sensor fusion
dc.subject: LIDAR
dc.title: Rangefinding in Fire Smoke Environments
dc.type: Dissertation
thesis.degree.discipline: Mechanical Engineering
thesis.degree.grantor: Virginia Polytechnic Institute and State University
thesis.degree.level: doctoral
thesis.degree.name: Ph. D.

Files

Original bundle
Name: Starr_JW_D_2016.pdf
Size: 9.58 MB
Format: Adobe Portable Document Format