Improving the Sample-complexity of Deep Classification Networks with Invariant Integration

Matthias Rath, Alexandru Paul Condurache

2022

Abstract

Leveraging prior knowledge on intraclass variance due to transformations is a powerful method to improve the sample complexity of deep neural networks. This makes them applicable to practically important use cases where training data is scarce. Rather than being learned, this knowledge can be embedded by enforcing invariance to those transformations. Invariance can be imposed using group-equivariant convolutions followed by a pooling operation. For rotation-invariance, previous work investigated replacing the spatial pooling operation with invariant integration, which explicitly constructs invariant representations. Invariant integration uses monomials, which are selected using an iterative approach requiring expensive pre-training. We propose a novel monomial selection algorithm based on pruning methods to allow application to more complex problems. Additionally, we replace monomials with different functions such as weighted sums, multi-layer perceptrons and self-attention, thereby streamlining the training of invariant-integration-based architectures. We demonstrate the improved sample complexity on the Rotated-MNIST, SVHN and CIFAR-10 datasets, where rotation-invariant-integration-based Wide-ResNet architectures using monomials and weighted sums outperform the respective baselines in the limited-sample regime. We achieve state-of-the-art results using full data on Rotated-MNIST and SVHN, where rotation is a main source of intraclass variation. On STL-10 we outperform a standard and a rotation-equivariant convolutional neural network using pooling.
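
The core idea behind invariant integration is group averaging: a function of the input, such as a monomial of feature values, is averaged over all transformations in the group, which yields an exactly invariant descriptor. The sketch below is not the architecture from the paper; it is a minimal NumPy illustration of that principle, assuming the finite rotation group C4 (multiples of 90 degrees, applied exactly with np.rot90) and hypothetical monomial sample offsets and exponents chosen only for demonstration.

import numpy as np

def invariant_integration_monomial(feature_map, offsets, exponents):
    # Average a monomial m(x) = prod_i x[p_i] ** b_i over the rotation group C4.
    # Group averaging makes the result invariant to 90-degree rotations of x.
    h, w = feature_map.shape
    cy, cx = h // 2, w // 2                      # sample offsets are taken relative to the centre
    values = []
    for k in range(4):                           # the four elements of C4
        rotated = np.rot90(feature_map, k)       # exact group action on the pixel grid
        m = 1.0
        for (dy, dx), b in zip(offsets, exponents):
            m *= rotated[cy + dy, cx + dx] ** b  # evaluate the monomial on g . x
        values.append(m)
    return np.mean(values)                       # A[m](x) = 1/|G| * sum_g m(g . x)

# Usage: the descriptor is identical for an input and its rotated copy.
rng = np.random.default_rng(0)
x = rng.random((9, 9))
offsets, exponents = [(0, 2), (2, 0)], [1, 2]    # hypothetical monomial parameters
assert np.isclose(invariant_integration_monomial(x, offsets, exponents),
                  invariant_integration_monomial(np.rot90(x), offsets, exponents))

Rotating the input by any element of C4 only permutes the summands of the group average, so the resulting descriptor, and anything built on top of it, is unchanged.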



Paper Citation


in Harvard Style

Rath M. and Condurache A. (2022). Improving the Sample-complexity of Deep Classification Networks with Invariant Integration. In Proceedings of the 17th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2022) - Volume 5: VISAPP; ISBN 978-989-758-555-5, SciTePress, pages 214-225. DOI: 10.5220/0010872000003124


in BibTeX Style

@conference{visapp22,
author={Matthias Rath and Alexandru Paul Condurache},
title={Improving the Sample-complexity of Deep Classification Networks with Invariant Integration},
booktitle={Proceedings of the 17th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2022) - Volume 5: VISAPP},
year={2022},
pages={214-225},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0010872000003124},
isbn={978-989-758-555-5},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 17th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2022) - Volume 5: VISAPP
TI - Improving the Sample-complexity of Deep Classification Networks with Invariant Integration
SN - 978-989-758-555-5
AU - Rath M.
AU - Condurache A.
PY - 2022
SP - 214
EP - 225
DO - 10.5220/0010872000003124
PB - SciTePress