A Pareto Front Approach for Feature Selection

Enguerran Grandchamp, Mohamed Abadi, Olivier Alata


This article deals with the multi-objective aspect of a hybrid algorithm that we propose to solve the feature subset selection problem. The algorithm is hybrid because it chains a filter method and a wrapper method. The filter method reduces the exploration space by keeping subsets with good internal properties, and the wrapper method chooses among the remaining subsets using a classification-performance criterion. In the filter step, the subsets are evaluated in a multi-objective way to ensure diversity within the subsets. The evaluation is based on mutual information, which estimates both the dependency between features and classes and the redundancy between features within the same subset. The non-dominated (Pareto-optimal) subsets are kept for the second step. In the wrapper step, subsets are selected according to the stability of their classification performance during the learning stage across a set of classifiers, so that the selected subsets are not specialized to a given classifier. The proposed hybrid approach is tested on a variety of reference data sets and compared to the classical feature selection methods FSDD and mRMR, both of which it outperforms.
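The filter step described above keeps only the non-dominated subsets under two objectives: maximize relevance (mutual information between features and classes) and minimize redundancy (mutual information between features of the same subset). The following is an illustrative sketch of that Pareto-dominance filter, not the authors' code; the subset names and scores are hypothetical.

```python
def dominates(a, b):
    """a dominates b if a is no worse on both objectives and strictly better
    on at least one. Scores are (relevance, redundancy) pairs: relevance is
    to be maximized, redundancy minimized."""
    rel_a, red_a = a
    rel_b, red_b = b
    no_worse = rel_a >= rel_b and red_a <= red_b
    strictly_better = rel_a > rel_b or red_a < red_b
    return no_worse and strictly_better

def pareto_front(scored_subsets):
    """Keep only the non-dominated subsets.
    scored_subsets: list of (subset, (relevance, redundancy)) pairs."""
    return [
        (subset, score)
        for subset, score in scored_subsets
        if not any(dominates(other, score) for _, other in scored_subsets)
    ]

# Hypothetical (relevance, redundancy) scores for four candidate subsets:
candidates = [
    (("f1", "f2"), (0.80, 0.30)),
    (("f1", "f3"), (0.75, 0.10)),  # less relevant but less redundant: kept
    (("f2", "f3"), (0.60, 0.40)),  # dominated by ("f1", "f2"): discarded
    (("f1", "f4"), (0.85, 0.50)),
]
front = pareto_front(candidates)
```

All Pareto-optimal subsets (here three of the four candidates) would then be passed to the wrapper step, which evaluates their classification-performance stability over several classifiers.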


  1. A. Al-Ani, M. Deriche, and J. Chebil, "A new mutual information based measure for feature selection," Intelligent Data Analysis, vol. 7, no. 1, pp. 43-57, 2003.
  2. E. Cantu-Paz, "Feature Subset Selection, Class Separability, and Genetic Algorithms," in Genetic and Evolutionary Computation, 2004, pp. 959-970.
  3. K. Deb, Multi-Objective Optimization Using Evolutionary Algorithms. Chichester: John Wiley and Sons, 2001.
  4. C. Emmanouilidis, A. Hunter, and J. MacIntyre, "A Multiobjective Evolutionary Setting for Feature Selection and a Commonality-Based Crossover Operator," in Congress on Evolutionary Computation, California, July 2000, pp. 309-316.
  5. B.A.S. Hasan, J.Q. Gan, and Z. Qingfu, "Multi-objective evolutionary methods for channel selection in Brain-Computer Interfaces: Some preliminary experimental results," in IEEE Congress on Evolutionary Computation, Barcelona, Spain, July 2010, pp. 1-6.
  6. M. Hilario and A. Kalousis, "Approaches to dimensionality reduction in proteomic biomarker studies," Briefings in Bioinformatics, vol. 9, no. 2, pp. 102-118, 2008.
  7. A. Kalousis, J. Prados, and M. Hilario, "Stability of feature selection algorithms: A study on high-dimensional spaces," Knowledge and Information Systems, vol. 12, no. 1, pp. 95-116, 2007.
  8. I. Kononenko, "Estimating attributes: Analysis and extensions of RELIEF," in ECML-94, 1994, pp. 171-182.
  9. L.I. Kuncheva, "A stability index for feature selection," in Proceedings of the 25th IASTED International Multi-Conference: Artificial Intelligence and Applications, Innsbruck, Austria, February 2007, pp. 390-395.
  10. N. Kwak and C.H. Choi, "Input Feature Selection by Mutual Information Based on Parzen Window," IEEE Trans. Pattern Anal. Mach. Intell, vol. 24, no. 12, pp. 1667-1671, 2002.
  11. J. Liang, S. Yang, and A-C. Winstanley, "Invariant optimal feature selection: A distance discriminant and feature ranking based solution," Pattern Recognition, vol. 41, no. 5, pp. 1429-1439, 2008.
  12. E. Parzen, "On Estimation of a Probability Density Function and Mode," The Annals of Mathematical Statistics, vol. 33, no. 3, pp. 1065-1076, 1962.
  13. H. Peng, F. Long, and C. Ding, "Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 8, pp. 1226-1238, 2005.
  14. P. Somol, J. Novovicova, and P. Pudil, "Efficient Feature Subset Selection and Subset Size Optimization," InTech Open Access Publisher, vol. 56, 2010, pp. 1-24.
  15. Y. Sun, S. Todorovic, and S. Goodison, "Local-Learning-Based Feature Selection for High-Dimensional Data Analysis," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 9, pp. 1610-1626, 2010.
  16. L. Zhuo et al., "A Genetic Algorithm based Wrapper Feature selection method for Classification of Hyperspectral Images using Support Vector Machine," in The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Part B7, vol. XXXVII, Beijing, China, 2008, pp. 397-402.

Paper Citation

in Harvard Style

Grandchamp E., Abadi M. and Alata O. (2016). A Pareto Front Approach for Feature Selection. In Proceedings of the 5th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM, ISBN 978-989-758-173-1, pages 334-342. DOI: 10.5220/0005752603340342

in Bibtex Style

@conference{icpram16,
author={Enguerran Grandchamp and Mohamed Abadi and Olivier Alata},
title={A Pareto Front Approach for Feature Selection},
booktitle={Proceedings of the 5th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM},
year={2016},
pages={334-342},
doi={10.5220/0005752603340342},
isbn={978-989-758-173-1},
}

in EndNote Style

JO - Proceedings of the 5th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM
TI - A Pareto Front Approach for Feature Selection
SN - 978-989-758-173-1
AU - Grandchamp E.
AU - Abadi M.
AU - Alata O.
PY - 2016
SP - 334
EP - 342
DO - 10.5220/0005752603340342