Feature Selection with Controlled Redundancy in a Fuzzy Rule Based Framework

I. Fang Chung, Yi Cheng Chen, Nikhil R. Pal*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review



Features that have good predictive power for classes or output variables are useful features, and hence most feature selection methods try to find them. However, because such good features may be highly correlated or nonlinearly dependent on one another, comparable performance can often be obtained using only a few of them. A feature selection method should therefore select useful features with controlled redundancy. In this paper, we propose a novel learning method that imposes a penalty on the use of dependent/correlated features during system identification along with feature selection. This feature selection scheme can choose good features, discard indifferent and derogatory features, and control the level of redundancy in the set of selected features. This is probably the first attempt at feature selection with redundancy control using a fuzzy rule based framework. We demonstrate the effectiveness of the method using a tenfold cross-validation setup on a synthetic dataset as well as on several commonly used classification datasets, and we compare our results with some state-of-the-art methods.
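The core idea — rewarding predictive power while penalizing dependence on already-selected features — can be sketched generically. The following is a minimal illustration of redundancy-penalized greedy selection (an mRMR-style sketch using linear correlation, not the paper's fuzzy rule based learning scheme); the function name, the correlation-based scores, and the penalty weight `lam` are illustrative assumptions:

```python
import numpy as np

def select_features(X, y, k, lam=1.0):
    """Greedily pick k features, scoring each candidate as
    relevance - lam * (average redundancy with already-selected features).

    Relevance  = |corr(feature, y)|        (proxy for predictive power)
    Redundancy = |corr(feature, selected)| (proxy for dependence)
    lam controls the permitted level of redundancy: lam = 0 ignores
    redundancy entirely; larger lam suppresses correlated picks.
    This is a generic illustration, NOT the paper's method.
    """
    n, d = X.shape
    relevance = np.array(
        [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(d)]
    )
    selected = [int(np.argmax(relevance))]  # start with the best single feature
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(d):
            if j in selected:
                continue
            redundancy = np.mean(
                [abs(np.corrcoef(X[:, j], X[:, s])[0, 1]) for s in selected]
            )
            score = relevance[j] - lam * redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected
```

For example, if feature 1 is a near-copy of feature 0, running this with `lam=1.0` skips the duplicate in favor of a less redundant informative feature, while `lam=0.0` happily selects both copies — mirroring the role of the redundancy-control parameter described in the abstract.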

Original language: English
Pages (from-to): 734-748
Number of pages: 15
Journal: IEEE Transactions on Fuzzy Systems
Issue number: 2
State: Published - Apr 2018


Keywords:

  • Feature dependence
  • feature selection with controlled redundancy
  • fuzzy rule based system (FRBS)
  • penalty
  • useful features


