Evolving Classifier Ensembles using Dynamic Multi-objective Swarm Intelligence

Authors: Jean-François Connolly; Eric Granger and Robert Sabourin

Affiliation: Université du Québec, Canada

Keyword(s): Multi-classifier Systems, Incremental Learning, Dynamic and Multi-objective Optimization, ARTMAP Neural Networks, Video Face Recognition, Adaptive Biometrics.

Related Ontology Subjects/Areas/Topics: Applications; Classification; Computer Vision, Visualization and Computer Graphics; Ensemble Methods; Evolutionary Computation; Image and Video Analysis; Incremental Learning; Pattern Recognition; Software Engineering; Theory and Methods; Video Analysis

Abstract: Classification systems are often designed using a limited amount of data from complex and changing pattern recognition environments. In applications where new reference samples become available over time, adaptive multi-classifier systems (AMCSs) are desirable for updating class models. In this paper, an incremental learning strategy based on an aggregated dynamical niching particle swarm optimization (ADNPSO) algorithm is proposed to efficiently evolve heterogeneous classifier ensembles in response to new reference data. This strategy is applied to an AMCS where all parameters of a pool of fuzzy ARTMAP (FAM) neural network classifiers, each one corresponding to a PSO particle, are co-optimized such that both error rate and network size are minimized. To sustain a high level of accuracy while minimizing computational complexity, the AMCS integrates information from multiple diverse classifiers, where learning is guided by the ADNPSO algorithm that optimizes networks according to both of these objectives. Moreover, FAM networks are evolved to maintain (1) genotype diversity of solutions around local optima in the optimization search space, and (2) phenotype diversity in the objective space. Using local Pareto optimality, networks are then stored in an archive to create a pool of base classifiers among which cost-effective ensembles are selected on the basis of accuracy and both genotype and phenotype diversity. Performance of the ADNPSO strategy is compared against AMCSs where learning of FAM networks is guided through mono- and multi-objective optimization, and assessed under different incremental learning scenarios, where new data is extracted from real-world video streams for face recognition. Simulation results indicate that the proposed strategy provides a level of accuracy that is comparable to that of using mono-objective optimization, yet requires only a fraction of its resources.
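The abstract outlines a general recipe: a swarm of candidate classifiers is evaluated on two objectives (error rate and network size), locally Pareto-optimal solutions are kept in an archive, and ensembles are later selected from that archive. The minimal Python sketch below illustrates only that recipe; it is not the authors' ADNPSO/fuzzy ARTMAP implementation. The toy evaluate() objective, the random choice of an archive leader (standing in for the aggregated dynamical niching guidance), and the greedy error-based ensemble selection (which omits the genotype/phenotype diversity criteria) are all illustrative assumptions.

import random

# Minimal sketch of a bi-objective PSO with a Pareto archive.
# NOTE: this is NOT the ADNPSO/FAM method of the paper; the toy objective
# `evaluate`, the particle encoding, and the selection rule are assumptions.

def evaluate(position):
    """Return (error_rate, network_size) for a hypothetical classifier
    whose hyperparameters are encoded in `position`."""
    x, y = position
    error_rate = (x - 0.3) ** 2 + 0.1 * abs(y)      # stand-in for validation error
    network_size = 1.0 + 10.0 * abs(y) + 0.5 * x    # stand-in for network size
    return error_rate, network_size

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (both objectives minimized)."""
    return all(ai <= bi for ai, bi in zip(a, b)) and any(ai < bi for ai, bi in zip(a, b))

def update_archive(archive, candidate):
    """Keep only non-dominated (position, objectives) pairs."""
    pos, obj = candidate
    if any(dominates(a_obj, obj) for _, a_obj in archive):
        return archive
    archive = [(p, o) for p, o in archive if not dominates(obj, o)]
    archive.append(candidate)
    return archive

def pso(n_particles=20, n_iter=50, w=0.7, c1=1.4, c2=1.4):
    particles = [[random.uniform(0, 1), random.uniform(-1, 1)] for _ in range(n_particles)]
    velocities = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [(p[:], evaluate(p)) for p in particles]
    archive = []
    for p, obj in pbest:
        archive = update_archive(archive, (p[:], obj))

    for _ in range(n_iter):
        for i, p in enumerate(particles):
            # Guide each particle by its personal best and a random archive member
            # (a simple stand-in for the aggregated dynamical niching guidance).
            leader = random.choice(archive)[0]
            for d in range(2):
                r1, r2 = random.random(), random.random()
                velocities[i][d] = (w * velocities[i][d]
                                    + c1 * r1 * (pbest[i][0][d] - p[d])
                                    + c2 * r2 * (leader[d] - p[d]))
                p[d] += velocities[i][d]
            obj = evaluate(p)
            if dominates(obj, pbest[i][1]):
                pbest[i] = (p[:], obj)
            archive = update_archive(archive, (p[:], obj))
    return archive

if __name__ == "__main__":
    archive = pso()
    # Greedy ensemble selection: take the archive members with the lowest
    # error, up to a fixed ensemble size (diversity criteria omitted here).
    ensemble = sorted(archive, key=lambda m: m[1][0])[:5]
    for pos, (err, size) in ensemble:
        print(f"error={err:.4f}  size={size:.2f}  params={pos}")

In the paper's strategy the archive would hold fuzzy ARTMAP networks (one per particle) and ensemble selection would also weigh genotype and phenotype diversity; the sketch keeps only the bi-objective archive mechanics.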

CC BY-NC-ND 4.0


Paper citation in several formats:
Connolly, J.; Granger, E. and Sabourin, R. (2013). Evolving Classifier Ensembles using Dynamic Multi-objective Swarm Intelligence. In Proceedings of the 2nd International Conference on Pattern Recognition Applications and Methods - ICPRAM; ISBN 978-989-8565-41-9; ISSN 2184-4313, SciTePress, pages 206-215. DOI: 10.5220/0004269302060215

@conference{icpram13,
author={Jean-Fran\c{c}ois Connolly and Eric Granger and Robert Sabourin},
title={Evolving Classifier Ensembles using Dynamic Multi-objective Swarm Intelligence},
booktitle={Proceedings of the 2nd International Conference on Pattern Recognition Applications and Methods - ICPRAM},
year={2013},
pages={206-215},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004269302060215},
isbn={978-989-8565-41-9},
issn={2184-4313},
}

TY - CONF
JO - Proceedings of the 2nd International Conference on Pattern Recognition Applications and Methods - ICPRAM
TI - Evolving Classifier Ensembles using Dynamic Multi-objective Swarm Intelligence
SN - 978-989-8565-41-9
IS - 2184-4313
AU - Connolly, J.
AU - Granger, E.
AU - Sabourin, R.
PY - 2013
SP - 206
EP - 215
DO - 10.5220/0004269302060215
PB - SciTePress
ER -