A general-purpose reduction-intensive feature selector for pattern classification
Abstract
Feature selection is a critical part of any pattern classification problem, and many methods exist for selecting a good set of features. However, for problems where features must be selected from a massive set, most of these methods either achieve very low accuracy rates or incur very high computational complexity. While for some pattern classification problems it may be reasonable to reduce a massive feature set using application-specific information, in problems such as dynamic signature verification this is not possible.
Several existing feature selectors are evaluated, including the Karhunen-Loève, SELECT, exhaustive, accelerated, "n best features", sequential forward search, sequential backward search, and "plus q - take away r" methods. Each of these methods has particular problems that make it a poor candidate for selecting features from a massive set.
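To make the contrast among these selectors concrete, the sketch below shows sequential forward search, one of the subset-building methods listed above. It is an illustrative, generic implementation (the scoring function, feature count, and target size are placeholders, not values from the thesis); its cost grows with the product of the feature count and the subset size, which is why such methods become impractical when the candidate set is massive.

    # Generic sketch of sequential forward search (SFS). The score
    # function is a placeholder; in practice it would be classification
    # accuracy measured on a validation set.
    from typing import Callable, Set

    def sequential_forward_search(
        n_features: int,
        score: Callable[[Set[int]], float],
        target_size: int,
    ) -> Set[int]:
        """Greedily grow a subset, adding the feature that most improves the score."""
        selected: Set[int] = set()
        while len(selected) < target_size:
            best_feature, best_score = None, float("-inf")
            for f in range(n_features):
                if f in selected:
                    continue
                candidate = score(selected | {f})
                if candidate > best_score:
                    best_feature, best_score = f, candidate
            selected.add(best_feature)
        return selected

By comparison, the "n best features" method simply ranks each feature in isolation and keeps the top n, which is fast but ignores interactions between features.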
A General-Purpose Reduction-Intensive (GPRI) feature selector is proposed in this thesis. The GPRI feature selector reduces a large set of features to a small final feature set. The time complexity of the GPRI method is close to that of the "n best features" method; however, the accuracy rates obtained with the selected features far exceed those achieved with the "n best features" selector. Thus, the GPRI feature selector is a viable candidate for selecting features in general environments where little application-specific information is available.
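The abstract does not give the GPRI algorithm itself, so the following is only a hypothetical sketch of the general "reduce heavily, then refine" idea it describes: an inexpensive individual ranking pass (comparable in cost to "n best features") discards most of the massive set, and only the survivors are examined as a subset. All names and parameters here are illustrative assumptions, not the thesis method.

    # Hypothetical illustration only -- NOT the GPRI algorithm from the
    # thesis. It shows a generic two-stage reduce-then-refine selector:
    # stage 1 ranks features individually (cheap), stage 2 greedily
    # refines a small subset from the survivors.
    from typing import Callable, List, Set

    def reduce_then_refine(
        n_features: int,
        individual_score: Callable[[int], float],
        subset_score: Callable[[Set[int]], float],
        intermediate_size: int,
        final_size: int,
    ) -> Set[int]:
        # Stage 1: individual ranking, as in "n best features".
        ranked: List[int] = sorted(range(n_features), key=individual_score, reverse=True)
        survivors = ranked[:intermediate_size]

        # Stage 2: greedy subset refinement over the much smaller pool.
        selected: Set[int] = set()
        while len(selected) < final_size:
            best = max(
                (f for f in survivors if f not in selected),
                key=lambda f: subset_score(selected | {f}),
            )
            selected.add(best)
        return selected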