Classification Performance Boosting for Interpolation Kernel Machines by Training Set Pruning Using Genetic Algorithm

Jiaqi Zhang, Xiaoyi Jiang

2024

Abstract

Interpolation kernel machines belong to the class of interpolating classifiers that interpolate all the training data and thus achieve zero training error. Recent research shows that they nevertheless generalize well. Interpolation kernel machines have been demonstrated to be a good alternative to support vector machines and should therefore be generally considered in practice. In this work we study training set pruning as a means of performance boosting. Our work is motivated by different perspectives on the curse of dimensionality. We design a genetic algorithm to perform the training set pruning. The experimental results clearly demonstrate its potential for boosting classification performance.
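
The abstract combines two ingredients: an interpolation kernel machine, which fits kernel coefficients so that every retained training sample is reproduced exactly, and a genetic algorithm that searches for a pruned training subset. The Python sketch below illustrates one plausible realization of this combination; it is not the paper's implementation. The Gaussian kernel, the validation-accuracy fitness, the GA operators (truncation selection, one-point crossover, bit-flip mutation), and all names such as InterpolationKernelMachine and ga_prune are assumptions made purely for illustration.

# Illustrative sketch (not the authors' exact method): an interpolation kernel
# machine solves K alpha = Y exactly on the kept training subset, and a simple
# genetic algorithm searches over binary keep/drop masks of training samples,
# scoring each mask by validation accuracy.
import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel between row sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class InterpolationKernelMachine:
    # Hypothetical class name; interpolates the kept training data exactly,
    # assuming the kernel matrix is nonsingular (strictly positive definite
    # kernel, distinct training points).
    def __init__(self, gamma=1.0):
        self.gamma = gamma

    def fit(self, X, y):
        # y: integer class labels 0..C-1
        self.X_ = X
        Y = np.eye(y.max() + 1)[y]                  # one-hot targets
        K = gaussian_kernel(X, X, self.gamma)
        self.alpha_ = np.linalg.solve(K, Y)         # zero training error on X
        return self

    def predict(self, X):
        K = gaussian_kernel(X, self.X_, self.gamma)
        return (K @ self.alpha_).argmax(axis=1)

def ga_prune(X_tr, y_tr, X_val, y_val, pop=30, gens=40, p_mut=0.02, seed=0):
    # GA over binary keep/drop masks; fitness = validation accuracy of the
    # interpolation kernel machine trained on the kept samples only.
    rng = np.random.default_rng(seed)
    n = len(X_tr)
    masks = rng.random((pop, n)) < 0.8              # start near the full set

    def fitness(mask):
        if mask.sum() < 2:
            return 0.0
        clf = InterpolationKernelMachine().fit(X_tr[mask], y_tr[mask])
        return (clf.predict(X_val) == y_val).mean()

    for _ in range(gens):
        scores = np.array([fitness(m) for m in masks])
        elite = masks[np.argsort(scores)[::-1][: pop // 2]]   # truncation selection
        children = []
        for _ in range(pop - len(elite)):
            a, b = elite[rng.integers(len(elite), size=2)]
            cut = rng.integers(1, n)                # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child ^= rng.random(n) < p_mut          # bit-flip mutation
            children.append(child)
        masks = np.vstack([elite, children])
    best = masks[np.argmax([fitness(m) for m in masks])]
    return best                                     # boolean mask of samples to keep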

Paper Citation


in Harvard Style

Zhang J. and Jiang X. (2024). Classification Performance Boosting for Interpolation Kernel Machines by Training Set Pruning Using Genetic Algorithm. In Proceedings of the 13th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM; ISBN 978-989-758-684-2, SciTePress, pages 428-435. DOI: 10.5220/0012467200003654


in Bibtex Style

@conference{icpram24,
author={Jiaqi Zhang and Xiaoyi Jiang},
title={Classification Performance Boosting for Interpolation Kernel Machines by Training Set Pruning Using Genetic Algorithm},
booktitle={Proceedings of the 13th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM},
year={2024},
pages={428-435},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0012467200003654},
isbn={978-989-758-684-2},
}


in EndNote Style

TY - CONF

JO - Proceedings of the 13th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM
TI - Classification Performance Boosting for Interpolation Kernel Machines by Training Set Pruning Using Genetic Algorithm
SN - 978-989-758-684-2
AU - Zhang J.
AU - Jiang X.
PY - 2024
SP - 428
EP - 435
DO - 10.5220/0012467200003654
PB - SciTePress