DNN Pruning and Its Effects on Robustness

Sven Mantowksy, Firas Mualla, Saqib Sayad Bukhari, Georg Schneider

2023

Abstract

The popularity of deep neural networks (DNNs) and their application on embedded systems and edge devices is increasing rapidly. Most embedded systems are limited in their computational capabilities and memory space. To meet these restrictions, DNNs need to be compressed while preserving their accuracy, for instance, by pruning the least important neurons or filters. However, pruning may introduce side effects on the model, such as influencing the robustness of its predictions. To analyze the impact of pruning on model robustness, we employ two metrics: the heatmap-based correlation coefficient (HCC) and the expected calibration error (ECE). On the one hand, the HCC makes it possible to gain insight into the extent to which a model and its compressed version tend to use the same input features. On the other hand, the difference in ECE between a model and its compressed version allows us to analyze the side effect of pruning on the model's decision reliability. The experiments were conducted for image classification and object detection problems. For both tasks, our results show that some off-the-shelf pruning methods considerably improve model calibration without being specifically designed for this purpose. For instance, the ECE of a VGG16 classifier is improved by 35% after being compressed by 50% using the HRank pruning method, with a negligible loss in accuracy. Larger compression ratios reduce the accuracy as expected but may improve the calibration drastically (e.g., the ECE is reduced by 77% at a compression ratio of 70%). Moreover, the HCC measures feature saliency under model compression and, as expected, tends to correlate positively with the model's accuracy. The proposed metrics can be employed to compare pruning methods from a perspective other than the commonly considered trade-off between accuracy and compression ratio.
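For readers who want to experiment with the two metrics, the sketch below shows one way to compute them. The ECE function follows the standard binned definition (weighted average gap between per-bin confidence and accuracy). The `heatmap_correlation` helper assumes the HCC is a Pearson correlation between saliency heatmaps of the original and the pruned model; this is our reading of the abstract, not the authors' reference implementation, and the function names and bin count are illustrative.

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Standard binned ECE: partition predictions by confidence and
    average |accuracy - mean confidence| per bin, weighted by bin size."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(correct[mask].mean() - confidences[mask].mean())
            ece += mask.mean() * gap  # weight by fraction of samples in bin
    return ece

def heatmap_correlation(heatmap_a, heatmap_b):
    """Pearson correlation between two saliency heatmaps (e.g., from the
    original and the pruned model), flattened to vectors. Assumed reading
    of the paper's HCC."""
    a = np.asarray(heatmap_a, dtype=float).ravel()
    b = np.asarray(heatmap_b, dtype=float).ravel()
    return np.corrcoef(a, b)[0, 1]
```

Under this reading, comparing a model with its pruned version amounts to evaluating both functions on the same test inputs: a lower ECE for the pruned model indicates better calibration, and an HCC close to 1 indicates that both models rely on largely the same input features.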


Paper Citation


in Harvard Style

Mantowksy S., Mualla F., Sayad Bukhari S. and Schneider G. (2023). DNN Pruning and Its Effects on Robustness. In Proceedings of the 12th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM, ISBN 978-989-758-626-2, pages 82-88. DOI: 10.5220/0011651000003411


in Bibtex Style

@conference{icpram23,
author={Sven Mantowksy and Firas Mualla and Saqib Sayad Bukhari and Georg Schneider},
title={DNN Pruning and Its Effects on Robustness},
booktitle={Proceedings of the 12th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM},
year={2023},
pages={82-88},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0011651000003411},
isbn={978-989-758-626-2},
}


in EndNote Style

TY - CONF

JO - Proceedings of the 12th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM
TI - DNN Pruning and Its Effects on Robustness
SN - 978-989-758-626-2
AU - Mantowksy S.
AU - Mualla F.
AU - Sayad Bukhari S.
AU - Schneider G.
PY - 2023
SP - 82
EP - 88
DO - 10.5220/0011651000003411