Paper

Authors: Csanád Sándor 1,2; Szabolcs Pável 1,2 and Lehel Csató 1

Affiliations: 1 Faculty of Mathematics and Informatics, Babeş-Bolyai University, Kogălniceanu 1, Cluj-Napoca, Romania; 2 Robert Bosch SRL, Someşului 14, Cluj-Napoca, Romania

Keyword(s): Neural Network Pruning, Filter Pruning, Structured Pruning, Neural Network Acceleration.

Abstract: Neural network pruning is an effective way to reduce the memory and time requirements of most deep neural network architectures. Recently developed pruning techniques can remove individual neurons or entire filters from convolutional neural networks, making these “slim” architectures more robust and more resource-efficient. In this paper, we present a simple yet effective method that assigns probabilities to the network units – to filters in convolutional layers and to neurons in fully connected layers – and prunes them based on these values. The probabilities are learned by maximizing the expected value of a score function – calculated from the accuracy – that ranks the network when different units are turned off. Gradients of the probabilities are estimated using Monte Carlo gradient estimation. We conduct experiments on the CIFAR-10 dataset with a small VGG-like architecture as well as with the lightweight version of the ResNet architecture. The results show that our pruning method achieves results comparable to different state-of-the-art algorithms in terms of parameter and floating point operation reduction. For the ResNet-110 architecture, our pruning method removes 72.53% of the floating point operations and 68.89% of the parameters, marginally surpassing the results of existing pruning methods.
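The learning rule the abstract describes – Bernoulli keep-probabilities per unit, updated with a Monte Carlo (score-function) estimate of the gradient of an expected score – can be sketched on a toy problem. The score function, unit count, batch size, and learning rate below are illustrative assumptions, not the paper's actual setup, which scores sampled sub-networks by their accuracy:

```python
import numpy as np

rng = np.random.default_rng(0)

def keep_score(mask):
    # Hypothetical stand-in for the accuracy-based score used in the paper:
    # here it simply rewards masks that keep the first half of the units.
    target = np.array([1, 1, 1, 0, 0, 0])
    return float(np.sum(mask == target)) / len(target)

n_units = 6
logits = np.zeros(n_units)  # parameters behind the keep-probabilities
lr = 0.5

for step in range(300):
    p = 1.0 / (1.0 + np.exp(-logits))      # keep-probability per unit
    masks = rng.random((16, n_units)) < p  # sample Bernoulli on/off masks
    scores = np.array([keep_score(m) for m in masks])
    baseline = scores.mean()               # variance-reduction baseline
    # Score-function (REINFORCE) gradient of E[score] w.r.t. the logits:
    # d log P(m) / d logit = m - p for a sigmoid-parametrised Bernoulli.
    grad = ((scores - baseline)[:, None] * (masks - p)).mean(axis=0)
    logits += lr * grad

p = 1.0 / (1.0 + np.exp(-logits))
print(np.round(p, 2))  # units with low keep-probability are pruned
```

After training, units whose removal does not hurt the score end up with low keep-probability and can be pruned; the baseline subtraction is a standard variance-reduction choice for this class of estimators.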

CC BY-NC-ND 4.0


Paper citation in several formats:
Sándor, C.; Pável, S. and Csató, L. (2022). Neural Network Pruning based on Filter Importance Values Approximated with Monte Carlo Gradient Estimation. In Proceedings of the 17th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2022) - Volume 5: VISAPP; ISBN 978-989-758-555-5; ISSN 2184-4321, SciTePress, pages 315-322. DOI: 10.5220/0010786700003124

@conference{visapp22,
author={Csanád Sándor and Szabolcs Pável and Lehel Csató},
title={Neural Network Pruning based on Filter Importance Values Approximated with Monte Carlo Gradient Estimation},
booktitle={Proceedings of the 17th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2022) - Volume 5: VISAPP},
year={2022},
pages={315-322},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0010786700003124},
isbn={978-989-758-555-5},
issn={2184-4321},
}

TY - CONF

JO - Proceedings of the 17th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2022) - Volume 5: VISAPP
TI - Neural Network Pruning based on Filter Importance Values Approximated with Monte Carlo Gradient Estimation
SN - 978-989-758-555-5
IS - 2184-4321
AU - Sándor, C.
AU - Pável, S.
AU - Csató, L.
PY - 2022
SP - 315
EP - 322
DO - 10.5220/0010786700003124
PB - SciTePress