
Paper: CNNs Sparsification and Expansion for Continual Learning

Authors: Basile Tousside (1); Jörg Frochte (1) and Tobias Meisen (2)

Affiliations: (1) Bochum University of Applied Sciences, Kettwiger Str. 20, 42579 Heiligenhaus, Germany; (2) Bergische Universität Wuppertal, Rainer-Gruenter-Straße 21, 42119 Wuppertal, Germany

Keyword(s): Deep Learning, Convolutional Neural Networks, Continual Learning, Catastrophic Forgetting.

Abstract: Learning multiple sequentially arriving tasks without forgetting previous knowledge, known as Continual Learning (CL), remains a long-standing challenge for neural networks. Most existing CL methods rely on data replay. However, they are not applicable when past data is unavailable or may not be synthetically generated. To address this challenge, we propose Sparsification and Expansion-based Continual Learning (SECL). SECL avoids forgetting previous tasks by ensuring the stability of the CNN via a stability regularization term, which prevents filters detected as important for past tasks from deviating too much when learning a new task. On top of that, SECL makes the network plastic via a plasticity regularization term that leverages the over-parameterization of CNNs to efficiently sparsify the network and tunes unimportant filters, making them relevant for future tasks. In addition, SECL enhances the plasticity of the network through a simple but effective heuristic mechanism that automatically decides when and where (at which layers) to expand the network. Experiments on popular CL vision benchmarks show that SECL leads to significant improvements over state-of-the-art methods in terms of overall CL performance, as measured by classification accuracy, as well as in terms of avoiding catastrophic forgetting.
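To make the objective described in the abstract more concrete, the following is a minimal, hypothetical PyTorch-style sketch of a loss combining a stability term (penalizing changes to filters marked important for past tasks) with a plasticity term (a group-lasso style penalty pushing unimportant filters toward sparsity). The function name, the weighting hyperparameters and the magnitude-based importance proxy are illustrative assumptions, not the formulation from the paper; the expansion heuristic mentioned in the abstract is not shown.

import torch
import torch.nn as nn

def secl_style_loss(model, task_loss, old_params, importance,
                    stability_weight=1.0, plasticity_weight=1e-3):
    # Hypothetical sketch, not the authors' implementation.
    # old_params: name -> parameter snapshot taken after the previous task
    # importance: name -> per-filter importance in [0, 1]
    #             (1 = important for past tasks, 0 = free to be reused)
    stability = 0.0   # keeps important filters close to their old values
    plasticity = 0.0  # pushes unimportant filters toward sparsity
    for name, param in model.named_parameters():
        if name not in importance or param.dim() < 2:
            continue
        omega = importance[name]                     # shape: (out_channels,)
        delta = param - old_params[name]
        omega_b = omega.view(-1, *([1] * (param.dim() - 1)))
        stability += (omega_b * delta.pow(2)).sum()  # weighted quadratic drift penalty
        filter_norms = param.flatten(1).norm(dim=1)  # L2 norm of each filter
        plasticity += ((1.0 - omega) * filter_norms).sum()
    return task_loss + stability_weight * stability + plasticity_weight * plasticity

# Toy usage with a single conv layer; importance here is a simple magnitude proxy.
conv = nn.Conv2d(3, 8, kernel_size=3)
old = {"weight": conv.weight.detach().clone()}
imp = {"weight": conv.weight.detach().abs().mean(dim=(1, 2, 3)).clamp(max=1.0)}
task_loss = conv(torch.randn(4, 3, 16, 16)).pow(2).mean()  # stand-in for a real task loss
loss = secl_style_loss(conv, task_loss, old, imp)
loss.backward()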

CC BY-NC-ND 4.0


Paper citation in several formats:
Tousside, B.; Frochte, J. and Meisen, T. (2024). CNNs Sparsification and Expansion for Continual Learning. In Proceedings of the 16th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART; ISBN 978-989-758-680-4; ISSN 2184-433X, SciTePress, pages 110-120. DOI: 10.5220/0012314000003636

@conference{icaart24,
author={Basile Tousside and Jörg Frochte and Tobias Meisen},
title={CNNs Sparsification and Expansion for Continual Learning},
booktitle={Proceedings of the 16th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART},
year={2024},
pages={110-120},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0012314000003636},
isbn={978-989-758-680-4},
issn={2184-433X},
}

TY - CONF

JO - Proceedings of the 16th International Conference on Agents and Artificial Intelligence - Volume 2: ICAART
TI - CNNs Sparsification and Expansion for Continual Learning
SN - 978-989-758-680-4
IS - 2184-433X
AU - Tousside, B.
AU - Frochte, J.
AU - Meisen, T.
PY - 2024
SP - 110
EP - 120
DO - 10.5220/0012314000003636
PB - SciTePress
ER -