
Paper: Mechanism of Overfitting Avoidance Techniques for Training Deep Neural Networks

Authors: Bihi Sabiri 1; Bouchra El Asri 1 and Maryem Rhanoui 2

Affiliations: 1 IMS Team, ADMIR Laboratory, Rabat IT Center, ENSIAS, Mohammed V University in Rabat, Morocco; 2 Meridian Team, LYRICA Laboratory, School of Information Sciences, Rabat, Morocco

Keyword(s): Data Overfitting, Machine Learning, Dropout, Deep Learning, Convolutional Neural Network, Max Pooling, Early Stopping.

Abstract: The objective of a deep learning neural network is to obtain a final model that performs well both on the data used to train it and on the new data on which the model will be used to make predictions. Overfitting refers to the fact that the predictive model produced by the machine learning algorithm adapts too closely to the training set: the model captures not only the generalizable correlations but also the noise in the data, so it gives very good predictions on the training set but predicts poorly on data it has not seen during its learning phase. This paper proposes two techniques, among many others, to reduce or prevent overfitting. Furthermore, by analyzing the dynamics during training, we propose a consensus classification algorithm that avoids overfitting, and we investigate the performance of these two types of techniques in a convolutional neural network. Early stopping saves the hyper-parameters of a model at the right time, and dropout makes the model's learning harder, yielding gains of more than 50% by decreasing the model's loss rate.
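As an illustration of the two techniques named in the abstract, the minimal sketch below combines early stopping and dropout in a small convolutional network using Keras. The dataset (MNIST), architecture, and hyper-parameter values (dropout rate 0.5, patience 3) are assumptions made for the example and are not taken from the paper.

# Minimal sketch (not the authors' code): early stopping + dropout in a small Keras CNN.
import tensorflow as tf
from tensorflow.keras import layers, models, callbacks

# Standard dataset used only for illustration; the paper's data may differ.
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0
x_test = x_test[..., None] / 255.0

model = models.Sequential([
    layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
    layers.MaxPooling2D(),                 # max pooling, as listed in the keywords
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dropout(0.5),                   # dropout: randomly deactivate units during training
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Early stopping: monitor validation loss, stop when it no longer improves,
# and restore the weights from the best epoch ("save the model at the right time").
early_stop = callbacks.EarlyStopping(monitor="val_loss",
                                     patience=3,
                                     restore_best_weights=True)

model.fit(x_train, y_train,
          validation_split=0.1,
          epochs=50,
          batch_size=128,
          callbacks=[early_stop])
print(model.evaluate(x_test, y_test, verbose=0))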

CC BY-NC-ND 4.0


Paper citation in several formats:
Sabiri, B.; El Asri, B. and Rhanoui, M. (2022). Mechanism of Overfitting Avoidance Techniques for Training Deep Neural Networks. In Proceedings of the 24th International Conference on Enterprise Information Systems - Volume 1: ICEIS; ISBN 978-989-758-569-2; ISSN 2184-4992, SciTePress, pages 418-427. DOI: 10.5220/0011114900003179

@conference{iceis22,
author={Bihi Sabiri and Bouchra {El Asri} and Maryem Rhanoui},
title={Mechanism of Overfitting Avoidance Techniques for Training Deep Neural Networks},
booktitle={Proceedings of the 24th International Conference on Enterprise Information Systems - Volume 1: ICEIS},
year={2022},
pages={418-427},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0011114900003179},
isbn={978-989-758-569-2},
issn={2184-4992},
}

TY - CONF
JO - Proceedings of the 24th International Conference on Enterprise Information Systems - Volume 1: ICEIS
TI - Mechanism of Overfitting Avoidance Techniques for Training Deep Neural Networks
SN - 978-989-758-569-2
IS - 2184-4992
AU - Sabiri, B.
AU - El Asri, B.
AU - Rhanoui, M.
PY - 2022
SP - 418
EP - 427
DO - 10.5220/0011114900003179
PB - SciTePress
ER -