Reduction of Variance-related Error through Ensembling: Deep Double Descent and Out-of-Distribution Generalization

Authors: Pavlos Rath-Manakidis; Hlynur Davíð Hlynsson and Laurenz Wiskott

Affiliation: Institut für Neuroinformatik, Ruhr-Universität Bochum, Bochum, Germany

Keyword(s): Deep Double Descent, Ensemble Model, Error Decomposition.

Abstract: Prediction variance on unseen data harms the generalization performance of deep neural network classifiers. We assess the utility of forming ensembles of deep neural networks in the context of double descent (DD) on image classification tasks to mitigate the effects of model variance. To that end, we propose a method for using geometric-mean-based ensembling as an approximate bias-variance decomposition of a training procedure's test error. In ensembling equivalent models we observe that ensemble formation is more beneficial the more the models are correlated with each other. Our results show that small models afford ensembles that outperform single large models while requiring considerably fewer parameters and computational steps. We offer an explanation for this phenomenon in terms of model-internal correlations. We also find that deep DD that depends on the presence of label noise can be mitigated almost as thoroughly by ensembles of models subject to identical label noise as by ensembles of networks each trained subject to i.i.d. noise. In the context of data drift, we find that the out-of-distribution performance of ensembles can be assessed by their in-distribution performance, which aids in ascertaining the utility of ensembling for generalization.
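For readers who want a concrete picture of the ensembling step mentioned in the abstract, the sketch below shows one plausible way to form a geometric-mean ensemble from the softmax outputs of several independently trained classifiers. The function name, epsilon, and example arrays are illustrative assumptions, not the authors' implementation, and the paper's use of this ensemble to approximate a bias-variance decomposition of test error is not reproduced here.

import numpy as np

def geometric_mean_ensemble(prob_list):
    """Combine per-model class probabilities via their geometric mean.

    prob_list: list of arrays, each of shape (n_samples, n_classes),
    holding the softmax outputs of individually trained models.
    Returns renormalized ensemble probabilities of the same shape.
    """
    # Averaging log-probabilities is equivalent to taking the geometric
    # mean; the small epsilon guards against log(0).
    log_probs = np.mean([np.log(p + 1e-12) for p in prob_list], axis=0)
    probs = np.exp(log_probs)
    # Renormalize so each row is again a proper probability distribution.
    return probs / probs.sum(axis=1, keepdims=True)

# Hypothetical softmax outputs of three models on two samples.
p1 = np.array([[0.7, 0.2, 0.1], [0.3, 0.4, 0.3]])
p2 = np.array([[0.6, 0.3, 0.1], [0.2, 0.5, 0.3]])
p3 = np.array([[0.8, 0.1, 0.1], [0.4, 0.4, 0.2]])
ensemble = geometric_mean_ensemble([p1, p2, p3])
print(ensemble.argmax(axis=1))  # ensemble class predictions

Averaging in log space rather than probability space is what makes this a geometric rather than arithmetic mean; under the assumptions above, it corresponds to multiplying the models' predicted probabilities and renormalizing.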

CC BY-NC-ND 4.0


Paper citation in several formats:
Rath-Manakidis, P.; Hlynsson, H. and Wiskott, L. (2022). Reduction of Variance-related Error through Ensembling: Deep Double Descent and Out-of-Distribution Generalization. In Proceedings of the 11th International Conference on Pattern Recognition Applications and Methods - ICPRAM; ISBN 978-989-758-549-4; ISSN 2184-4313, SciTePress, pages 31-40. DOI: 10.5220/0010821300003122

@conference{icpram22,
author={Pavlos Rath{-}Manakidis and Hlynur Davíð Hlynsson and Laurenz Wiskott},
title={Reduction of Variance-related Error through Ensembling: Deep Double Descent and Out-of-Distribution Generalization},
booktitle={Proceedings of the 11th International Conference on Pattern Recognition Applications and Methods - ICPRAM},
year={2022},
pages={31-40},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0010821300003122},
isbn={978-989-758-549-4},
issn={2184-4313},
}

TY - CONF

JO - Proceedings of the 11th International Conference on Pattern Recognition Applications and Methods - ICPRAM
TI - Reduction of Variance-related Error through Ensembling: Deep Double Descent and Out-of-Distribution Generalization
SN - 978-989-758-549-4
IS - 2184-4313
AU - Rath-Manakidis, P.
AU - Hlynsson, H.
AU - Wiskott, L.
PY - 2022
SP - 31
EP - 40
DO - 10.5220/0010821300003122
PB - SciTePress