# Kolmogorov’s Gate Non-linearity as a Step toward Much Smaller Artificial Neural Networks

### Stanislav Selitskiy

#### 2022

#### Abstract

The deep architecture of today’s behemoth “foundation” Artificial Neural Network (ANN) models arose not only because the underlying hardware made such computation feasible. The direction of ANN architecture development was also set in the early stages of ANN research by algorithms and models that proved effective yet limiting. Reliance on a small set of simple non-linearity functions pushed ANN architectures toward stacking many layers to achieve a reasonable approximation when emulating complex processes. The narrow efficient input domain of these activation functions also introduced the computational complexity of adding normalization and regularization layers, together with their de-regularization and de-normalization counterparts. Such layers add no value to the process emulation and break the topology and memory integrity of the data. We propose to look back at the forgotten shallow and wide ANN architectures to learn what can be reused at the current state of technology. In particular, we point to the Kolmogorov-Arnold theorem, whose implications for ANN architectures are such that, given a wide choice of volatile activation functions, even a 2-layer ANN of O(n) parameter complexity and Ω(n²) relation complexity (where n is the input dimensionality) may approximate an arbitrary non-linear transformation. We investigate the emulation of such a volatile activation function using a gated architecture inspired by LSTM- and GRU-type cells, applied to a feed-forward fully connected ANN, on financial time series prediction.
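The gated “volatile” activation sketched in the abstract can be illustrated, for example, as a per-unit softmax gate that mixes several candidate nonlinearities inside a 2-layer fully connected network. This is a hypothetical sketch: the candidate set (`tanh`, ReLU, `sin`), the gate parametrization, and all variable names are assumptions for illustration, not the paper’s exact formulation.

```python
import numpy as np

def volatile_activation(z, A, B):
    """Per-unit gated mix of candidate nonlinearities (hypothetical sketch).

    z : (n_hidden,) pre-activations
    A, B : (n_hidden, K) gate parameters; the gate logits depend on z,
           so the effective activation shape varies with the input,
           loosely in the spirit of LSTM/GRU gating.
    """
    logits = z[:, None] * A + B                      # (n_hidden, K)
    g = np.exp(logits - logits.max(axis=1, keepdims=True))
    g /= g.sum(axis=1, keepdims=True)                # softmax gate, rows sum to 1
    # Candidate nonlinearities evaluated on the same pre-activation.
    cands = np.stack([np.tanh(z), np.maximum(z, 0.0), np.sin(z)], axis=1)
    return (g * cands).sum(axis=1)                   # gated combination, (n_hidden,)

# Forward pass of a shallow, wide 2-layer network with the gated activation.
rng = np.random.default_rng(0)
n_in, n_hidden, n_cand = 4, 8, 3
W1 = rng.normal(size=(n_hidden, n_in)); b1 = rng.normal(size=n_hidden)
W2 = rng.normal(size=(1, n_hidden));    b2 = rng.normal(size=1)
A = rng.normal(size=(n_hidden, n_cand)); B = rng.normal(size=(n_hidden, n_cand))

x = rng.normal(size=n_in)
h = volatile_activation(W1 @ x + b1, A, B)
y = W2 @ h + b2
```

Because the gate depends on the pre-activation itself, each hidden unit realizes a different effective nonlinearity for different inputs, which is one way to approximate the “wide choice of volatile activation functions” the Kolmogorov-Arnold argument calls for without adding depth.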

#### Paper Citation

#### in Harvard Style

Selitskiy S. (2022). **Kolmogorov’s Gate Non-linearity as a Step toward Much Smaller Artificial Neural Networks**. In *Proceedings of the 24th International Conference on Enterprise Information Systems - Volume 2: ICEIS,* ISBN 978-989-758-569-2, pages 492-499. DOI: 10.5220/0011060700003179

#### in Bibtex Style

```bibtex
@conference{iceis22,
  author={Stanislav Selitskiy},
  title={Kolmogorov's Gate Non-linearity as a Step toward Much Smaller Artificial Neural Networks},
  booktitle={Proceedings of the 24th International Conference on Enterprise Information Systems - Volume 2: ICEIS},
  year={2022},
  pages={492-499},
  publisher={SciTePress},
  organization={INSTICC},
  doi={10.5220/0011060700003179},
  isbn={978-989-758-569-2},
}
```

#### in EndNote Style

```
TY - CONF
JO - Proceedings of the 24th International Conference on Enterprise Information Systems - Volume 2: ICEIS
TI - Kolmogorov's Gate Non-linearity as a Step toward Much Smaller Artificial Neural Networks
SN - 978-989-758-569-2
AU - Selitskiy S.
PY - 2022
SP - 492
EP - 499
DO - 10.5220/0011060700003179
```