
Paper: Activation Adaptation in Neural Networks

Authors: Farnoush Farhadi 1; Vahid Nia 2 and Andrea Lodi 1

Affiliations: 1 Polytechnique Montreal, 2900 Edouard Montpetit Blvd, Montreal, Quebec H3T 1J4, Canada; 2 Polytechnique Montreal, 2900 Edouard Montpetit Blvd, Montreal, Quebec H3T 1J4, Canada; Huawei Noah’s Ark Lab, Montreal Research Centre, 7101 Park Avenue, Quebec H3N 1X9, Canada

ISBN: 978-989-758-397-1

ISSN: 2184-4313

Keyword(s): Activation Function, Convolution, Neural Networks.

Abstract: Many neural network architectures rely on the choice of the activation function for each hidden layer. Given the activation function, the neural network is trained over the bias and the weight parameters. The bias catches the center of the activation, and the weights capture the scale. Here we propose to train the network over a shape parameter as well. This view allows each neuron to tune its own activation function and adapt the neuron curvature towards a better prediction. This modification only adds one further equation to the back-propagation for each neuron. Re-formalizing activation functions as a cumulative distribution function (cdf) generalizes the class of activation functions extensively. We propose generalizing toward an extensive class of activation functions and study: i) skewness and ii) smoothness of activation functions. Here we introduce the adaptive Gumbel activation function as a bridge between the asymmetric Gumbel and the symmetric sigmoid. A similar approach is used to invent a smooth version of ReLU. Our comparison with common activation functions suggests a different data representation, especially in early neural network layers. This adaptation also provides prediction improvement.
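The abstract describes a cdf-based activation with a trainable shape parameter that interpolates between the asymmetric Gumbel cdf and the symmetric sigmoid. The sketch below is one plausible parameterization of such a family, not necessarily the exact formula used in the paper: at shape s = 1 it reduces to the logistic sigmoid, and as s approaches 0 it approaches the complementary Gumbel cdf 1 − exp(−exp(x)). The `smooth_relu` gate (identity times the cdf, analogous to SiLU) is likewise a hypothetical illustration of the "smooth ReLU" idea.

```python
import math

def adaptive_gumbel(x, s):
    """One plausible adaptive Gumbel activation (assumed parameterization).

    s = 1 recovers the logistic sigmoid 1/(1 + exp(-x));
    s -> 0 approaches the complementary Gumbel cdf 1 - exp(-exp(x)).
    """
    return 1.0 - (1.0 + s * math.exp(x)) ** (-1.0 / s)

def smooth_relu(x, s):
    """Hypothetical smooth ReLU: gate the identity with the adaptive cdf,
    the same construction SiLU uses with the sigmoid."""
    return x * adaptive_gumbel(x, s)
```

In training, s would be treated as a per-neuron parameter updated by back-propagation alongside the weights and bias, which is the single extra equation per neuron the abstract refers to.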

CC BY-NC-ND 4.0


Paper citation in several formats:
Farhadi, F.; Nia, V. and Lodi, A. (2020). Activation Adaptation in Neural Networks. In Proceedings of the 9th International Conference on Pattern Recognition Applications and Methods - ICPRAM, ISBN 978-989-758-397-1, ISSN 2184-4313, pages 249-257. DOI: 10.5220/0009175102490257

@conference{icpram20,
author={Farnoush Farhadi and Vahid Nia and Andrea Lodi},
title={Activation Adaptation in Neural Networks},
booktitle={Proceedings of the 9th International Conference on Pattern Recognition Applications and Methods - ICPRAM},
year={2020},
pages={249-257},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0009175102490257},
isbn={978-989-758-397-1},
issn={2184-4313},
}

TY - CONF

JO - Proceedings of the 9th International Conference on Pattern Recognition Applications and Methods - ICPRAM
TI - Activation Adaptation in Neural Networks
SN - 978-989-758-397-1
IS - 2184-4313
AU - Farhadi, F.
AU - Nia, V.
AU - Lodi, A.
PY - 2020
SP - 249
EP - 257
DO - 10.5220/0009175102490257
