Autonomous Navigation, Perception and Probabilistic Fire Location for an Intelligent Firefighting Robot

Date

2014-10-09

Publisher

Virginia Tech

Abstract

Firefighting robots are actively being researched to reduce firefighter injuries and deaths and to increase firefighters' effectiveness in performing tasks. It has proven difficult to develop firefighting robots that autonomously locate a fire inside a structure when the fire is not in the robot's direct field of view. Sensors commonly used on robots cannot function properly in smoke-filled fire environments, where high temperatures and zero visibility prevail. Existing obstacle avoidance methods also have difficulty calculating safe trajectories and solving the local-minimum problem while avoiding obstacles in real time in cluttered, dynamic environments. In addition, research on characterizing fire environments to provide firefighting robots with headings that ultimately lead them to the fire is incomplete.

For use on intelligent firefighting robots, this research developed a real-time local obstacle avoidance method, local dynamic goal-based fire location, appropriate feature selection for fire environment assessment, and probabilistic classification of fire, smoke, and their thermal reflections. The real-time local obstacle avoidance method, called the weighted vector method, perceives the local environment through vectors, identifies a suitable obstacle avoidance mode by applying a decision tree, selects the necessary vectors using weighting functions, and geometrically computes a safe heading. The method also solves local-minimum problems by integrating global and local goals to reach the final goal. To locate a fire outside the robot's field of view, a local dynamic goal-based 'Seek-and-Find' fire algorithm was developed by fusing long-wave infrared camera images, ultraviolet radiation sensor readings, and lidar data. The weighted vector method was applied to avoid complex static and unexpected dynamic obstacles while moving toward the fire. This algorithm was successfully validated on a firefighting robot that autonomously navigated to find a fire outside its field of view.
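The abstract does not give the weighted vector method's exact formulation; the sketch below only illustrates the general idea of weighting obstacle vectors by proximity and geometrically combining them with a goal vector to obtain a safe heading. The function name, the quadratic weighting function, and the `influence` radius are illustrative assumptions, not the dissertation's actual method:

```python
import math

def safe_heading(goal_angle, obstacles, influence=2.0):
    """Blend a goal-direction vector with weighted obstacle-repulsion vectors.

    goal_angle: desired heading toward the (global or local) goal, in radians.
    obstacles:  list of (angle, distance) pairs from a range-sensor scan.
    influence:  hypothetical range (m) within which an obstacle contributes.
    """
    # Unit vector toward the goal.
    vx, vy = math.cos(goal_angle), math.sin(goal_angle)
    for ang, dist in obstacles:
        if dist >= influence or dist <= 0.0:
            continue  # outside the local window: not selected by the weighting
        # Illustrative weighting function: closer obstacles weigh more.
        w = ((influence - dist) / influence) ** 2
        # Repulsive component points away from the obstacle direction.
        vx -= w * math.cos(ang)
        vy -= w * math.sin(ang)
    return math.atan2(vy, vx)  # geometrically computed safe heading
```

With no nearby obstacles the heading is simply the goal direction; a close obstacle to the robot's left pushes the computed heading to the right.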

An improved 'Seek-and-Find' fire algorithm was then developed using Bayesian classifiers to identify fire features in thermal images. This algorithm discriminates fire and smoke from thermal reflections and other hot objects, allowing a more robust heading to be predicted for the robot. To develop it, motion and texture features that accurately distinguish fire and smoke from their reflections were analyzed and selected using multi-objective genetic algorithm optimization. As a result, the mean and variance of intensity, entropy, and inverse difference moment were selected from the first- and second-order statistical texture features to probabilistically classify fire, smoke, their thermal reflections, and other hot objects simultaneously. Classification accuracy was measured at 93.2% on a test dataset not included in the original training dataset. In addition, the precision, recall, F-measure, and G-measure for classifying fire and smoke on the same test dataset were 93.5–99.9%.
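The abstract does not specify the exact form of the Bayesian classifier; as one illustration of how per-region texture features could drive a probabilistic decision, here is a self-contained Gaussian naive Bayes sketch. The feature values, function names, and the naive-independence assumption are all illustrative, not taken from the dissertation:

```python
import math

def train(samples):
    """Fit per-class Gaussian parameters for each texture feature.

    samples: dict mapping a class label (e.g. 'fire', 'smoke') to a list of
    feature vectors, such as (mean intensity, variance); values are made up.
    """
    params = {}
    for label, vecs in samples.items():
        n, dims = len(vecs), len(vecs[0])
        mu = [sum(v[d] for v in vecs) / n for d in range(dims)]
        var = [max(sum((v[d] - mu[d]) ** 2 for v in vecs) / n, 1e-6)
               for d in range(dims)]
        params[label] = (mu, var, n)
    total = sum(p[2] for p in params.values())
    return params, total

def classify(x, params, total):
    """Return the class with the highest posterior (naive Bayes, log space)."""
    best, best_lp = None, -math.inf
    for label, (mu, var, n) in params.items():
        lp = math.log(n / total)  # class prior from training counts
        for d in range(len(x)):
            lp += (-0.5 * math.log(2 * math.pi * var[d])
                   - (x[d] - mu[d]) ** 2 / (2 * var[d]))
        if lp > best_lp:
            best, best_lp = label, lp
    return best
```

A region whose features fall near the 'fire' training cluster is assigned to 'fire'; the same machinery extends to smoke, thermal reflections, and other hot objects by adding those classes to the training dictionary.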

Keywords

Firefighting robot, Bayesian Classification
