Bagging KNN Classifiers using Different Expert Fusion Strategies

Amer. J. AlBaghdadi, Fuad M. Alkoot

2005

Abstract

An experimental evaluation of bagging K-nearest neighbor (KNN) classifiers is performed. The goal is to investigate whether varying the soft aggregation method yields better results than the standard Sum and Vote rules. We evaluate the performance of Sum, Product, MProduct, Minimum, Maximum, Median and Vote under varying parameters. Across different training set sizes, combining with Sum and MProduct yields a minor improvement. At very small sample sizes, bagging KNN classifiers achieves no improvement. While Minimum and Maximum improve at almost no training set size, Vote and Median show an improvement at larger training set sizes. At large training set sizes, reducing the number of features improves the performance of the leading fusion strategies.
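For readers who want to reproduce the basic setup, the following is a minimal sketch of bagging KNN experts and fusing their a posteriori outputs, assuming scikit-learn and the iris data; the estimator count, neighborhood size, and helper names are illustrative choices rather than the authors' configuration, and the authors' MProduct (modified product) rule is not shown.

    # Sketch only: bootstrap-bagged KNN experts fused with soft combination rules.
    # Assumes scikit-learn; N_ESTIMATORS and n_neighbors are illustrative choices.
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    RNG = np.random.default_rng(0)
    N_ESTIMATORS = 10  # number of bootstrap replicates (arbitrary for the sketch)

    X, y = load_iris(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

    # Train one KNN expert per bootstrap sample of the training set.
    experts = []
    for _ in range(N_ESTIMATORS):
        idx = RNG.integers(0, len(X_tr), size=len(X_tr))  # sample with replacement
        experts.append(KNeighborsClassifier(n_neighbors=5).fit(X_tr[idx], y_tr[idx]))

    # Stack each expert's class probabilities: shape (experts, samples, classes).
    P = np.stack([e.predict_proba(X_te) for e in experts])

    fusion = {
        "Sum":     P.sum(axis=0),
        "Product": P.prod(axis=0),
        "Min":     P.min(axis=0),
        "Max":     P.max(axis=0),
        "Median":  np.median(P, axis=0),
    }
    for name, scores in fusion.items():
        acc = (scores.argmax(axis=1) == y_te).mean()
        print(f"{name:8s} accuracy: {acc:.3f}")

    # Vote fuses hard decisions instead of probabilities (majority wins).
    votes = np.stack([e.predict(X_te) for e in experts])
    vote_pred = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
    print(f"{'Vote':8s} accuracy: {(vote_pred == y_te).mean():.3f}")

Note that the plain Product rule is sensitive to any single expert assigning a near-zero probability, which is one motivation for the modified product (MProduct) strategy studied in the paper.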



Paper Citation


in Harvard Style

AlBaghdadi, A. J. and Alkoot, F. M. (2005). Bagging KNN Classifiers using Different Expert Fusion Strategies. In Proceedings of the 5th International Workshop on Pattern Recognition in Information Systems - Volume 1: PRIS, (ICEIS 2005) ISBN 972-8865-28-7, pages 219-224. DOI: 10.5220/0002572002190224


in Bibtex Style

@conference{pris05,
author={Amer. J. AlBaghdadi and Fuad M. Alkoot},
title={Bagging KNN Classifiers using Different Expert Fusion Strategies},
booktitle={Proceedings of the 5th International Workshop on Pattern Recognition in Information Systems - Volume 1: PRIS, (ICEIS 2005)},
year={2005},
pages={219-224},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0002572002190224},
isbn={972-8865-28-7},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 5th International Workshop on Pattern Recognition in Information Systems - Volume 1: PRIS, (ICEIS 2005)
TI - Bagging KNN Classifiers using Different Expert Fusion Strategies
SN - 972-8865-28-7
AU - AlBaghdadi, A. J.
AU - Alkoot, F. M.
PY - 2005
SP - 219
EP - 224
DO - 10.5220/0002572002190224
ER -