Browsing by Author "Lee, Daniel D."
Now showing 1 - 2 of 2
- Autonomous Fire Suppression Using Feedback Control for Robotic Firefighting
  McNeil, Joshua G. (Virginia Tech, 2016-02-04)
  There is an increasing demand for robotics in dangerous and extreme conditions to limit human exposure and risk. An area in which robots are being considered as a support tool is in firefighting operations, to reduce the number of firefighter injuries and deaths. One such application is to increase firefighting performance through localized fire suppression. This research focused on developing an autonomous suppression system for use on a mobile robotic platform. This included a real-time close proximity fire suppression approach, appropriate feature selection and probabilistic classification of water leaks and sprays, real-time trajectory estimation, and a feedback controller for error correction in longer-range firefighting. The close proximity suppression algorithm uses IR fire detection and IR stereo processing to localize a fire. Feedback of the fire size and fire target was used to manipulate the nozzle for effective placement of the suppressant onto the fire, and the approach was experimentally validated with tests in high and low visibility environments. Identification of water sprays and leaks is a critical component both for improving autonomous suppression performance and for inspection tasks. Bayesian classification was used to identify the features associated with water leaks and sprays in thermal images. Appropriate first and second order features were selected using a multi-objective genetic algorithm optimization. Four textural features were selected as a means of discriminating water sprays and leaks from other non-water, high motion objects. Water classification was implemented in a real-time suppression system as a method of determining the yaw and pitch angle of a water nozzle. Estimation of the angle orientation provided an error estimate between the current path and the desired nozzle orientation.
  A proportional-integral (PI) controller was used to correct for forced errors in fire targeting, and performance and response were shown through indoor and outdoor suppression tests with wood-crib fires. The autonomous suppression algorithm was demonstrated through fire testing to be at least three times faster than suppression by an operator using tele-operation.
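The nozzle-correction loop described in this abstract can be sketched as a simple discrete PI controller. The gains, time step, and first-order nozzle response model below are illustrative assumptions, not values from the thesis:

```python
# Sketch of a discrete PI controller correcting nozzle yaw error toward a
# fire target. Gains, time step, and the first-order nozzle model are
# illustrative assumptions, not parameters from McNeil's system.

def make_pi_controller(kp, ki, dt):
    """Return a stateful PI controller mapping angle error -> command."""
    integral = 0.0
    def step(error):
        nonlocal integral
        integral += error * dt              # accumulate integral term
        return kp * error + ki * integral   # PI control law
    return step

def simulate(target_deg=10.0, steps=200, dt=0.05):
    """Drive a first-order nozzle model toward the target yaw angle."""
    pi = make_pi_controller(kp=1.5, ki=0.8, dt=dt)
    angle = 0.0
    for _ in range(steps):
        command = pi(target_deg - angle)    # feedback on targeting error
        angle += command * dt               # simple first-order response
    return angle

final = simulate()  # settles near the 10-degree target
```

In the actual system the error signal would come from the thermal-image classification of the water spray versus the fire target, rather than from a simulated plant.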
- Rangefinding in Fire Smoke Environments
  Starr, Joseph Wesley (Virginia Tech, 2016-01-07)
  The field of robotics has advanced to the point where robots are being developed for use in fire environments to perform firefighting tasks. These environments contain varying levels of fire and smoke, both of which obstruct robotic perception sensors. In order to effectively use robots in fire environments, the issue of perception in the presence of smoke and fire needs to be addressed. The goal of this research was to address the problem of perception, specifically rangefinding, in fire smoke environments. A series of tests was performed in smoke-filled fire environments to evaluate the performance of different commercial rangefinders and cameras, as well as a long-wavelength infrared (LWIR) stereo vision system developed in this research. The smoke was varied from dense, low temperature smoke to light, high temperature smoke for evaluation in a range of conditions. In small-scale experiments on eleven different sensors, radar and LWIR cameras outperformed the other perception sensors in both smoke conditions. An LWIR stereo vision system was developed for rangefinding and compared to radar, LIDAR, and visual stereo vision in large-scale testing, demonstrating the ability of LWIR stereo vision to rangefind in dense smoke where LIDAR and visual stereo vision fail. LWIR stereo vision was further developed for improved rangefinding in fire environments. Intensity misalignment between cameras and stereo image filtering were addressed quantitatively. Tests were performed with approximately isothermal scenes and thermally diverse scenes to select subsystem methods. In addition, the effects of image filtering on feature distortion were assessed. Rangefinding improvements were quantified with comparisons to ground truth data. Improved perception in varying levels of clear and smoke conditions was developed through sensor fusion of LWIR stereo vision and a spinning LIDAR.
The data were fused in a multi-resolution 3D voxel domain using evidential theory to model occupied and free space states. A heuristic method was presented to separate significantly attenuated LIDAR returns from low-attenuation returns. Sensor models were developed for both return types and LWIR stereo vision. The fusion system was tested in a range of conditions to demonstrate its ability for improved performance over individual sensor use in fire environments.
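The evidential fusion step described in this abstract can be sketched with Dempster's rule of combination over a frame of {occupied, free}, with residual mass assigned to the unknown (full) set. The sensor mass values below are illustrative assumptions, not numbers from the thesis:

```python
# Sketch of evidential (Dempster-Shafer) fusion of per-voxel occupancy
# evidence from two sensors, in the spirit of the LIDAR + LWIR stereo
# fusion described above. Each mass assignment is (occupied, free, unknown),
# where 'unknown' is mass on the full set {occupied, free}.
# All mass values here are illustrative, not from the thesis.

def dempster_combine(m1, m2):
    """Combine two mass assignments with Dempster's rule of combination."""
    o1, f1, u1 = m1
    o2, f2, u2 = m2
    conflict = o1 * f2 + f1 * o2      # mass assigned to contradictory pairs
    k = 1.0 - conflict                # normalization factor
    occ = (o1 * o2 + o1 * u2 + u1 * o2) / k
    free = (f1 * f2 + f1 * u2 + u1 * f2) / k
    unk = (u1 * u2) / k
    return occ, free, unk

# Hypothetical example: an attenuated LIDAR return gives weak occupied
# evidence; LWIR stereo at the same voxel gives strong occupied evidence.
lidar = (0.4, 0.1, 0.5)
stereo = (0.7, 0.1, 0.2)
fused = dempster_combine(lidar, stereo)  # agreement strengthens 'occupied'
```

Because the two sources agree, the fused occupied mass exceeds either input on its own, which is the behavior that lets the fused system outperform the individual sensors.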