Features with good predictive power for the class labels or output variables are useful, and most feature selection methods try to find them. However, because such good features may be highly correlated or nonlinearly dependent on one another, comparable performance can often be obtained using only a few of them. A feature selection method should therefore select useful features while controlling the redundancy among them. In this paper, we propose a novel learning method that performs feature selection during system identification by imposing a penalty on the use of dependent/correlated features. This scheme can choose useful features, discard indifferent and derogatory features, and control the level of redundancy in the set of selected features. This is probably the first attempt at feature selection with redundancy control using a fuzzy rule-based framework. We demonstrate the effectiveness of the method using a tenfold cross-validation setup on a synthetic dataset as well as on several commonly used classification datasets, and we compare our results with those of some state-of-the-art methods.
- Feature dependence
- Feature selection with controlled redundancy
- Fuzzy rule-based system (FRBS)
- Useful features