Exploring Alternatives to Softmax Function

Authors: Kunal Banerjee 1 ; Vishak Prasad C. 2 ; Rishi Raj Gupta 2 ; Kartik Vyas 2 ; Anushree H. 2 and Biswajit Mishra 2

Affiliations: 1 Walmart Global Tech, Bangalore, India ; 2 Intel Corporation, Bangalore, India

Keyword(s): Softmax, Spherical Loss, Function Approximation, Classification.

Abstract: The softmax function is widely used in artificial neural networks for multiclass classification, multilabel classification, attention mechanisms, etc. However, its efficacy is often questioned in the literature. The log-softmax loss has been shown to belong to a more generic class of loss functions, called the spherical family, and its member log-Taylor softmax loss is arguably the best alternative in this class. In another approach, which tries to enhance the discriminative nature of the softmax function, soft-margin softmax (SM-softmax) has been proposed as the most suitable alternative. In this work, we investigate Taylor softmax, SM-softmax and our proposed SM-Taylor softmax, an amalgamation of the earlier two functions, as alternatives to the softmax function. Furthermore, we explore the effect of expanding Taylor softmax up to ten terms (the original work proposed expanding only to two terms) along with the ramifications of considering Taylor softmax to be a finite or infinite series during backpropagation. Our experiments on the image classification task on different datasets reveal that there is always a configuration of the SM-Taylor softmax function that outperforms the normal softmax function and its other alternatives.
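The three functions compared in the abstract can be sketched as follows. This is an illustrative reconstruction, not the paper's code: the Taylor order `n` and margin `m` are free parameters here, and the specific margin value is an assumption.

```python
import math
import numpy as np

def softmax(z):
    # Standard softmax, with max-subtraction for numerical stability.
    e = np.exp(z - np.max(z))
    return e / e.sum()

def taylor_softmax(z, n=2):
    # Replace exp(z) with its order-n Taylor expansion sum_{i=0}^{n} z^i / i!.
    # The original Taylor softmax uses n = 2 (1 + z + z^2/2), which is
    # strictly positive for all real z, so the output is a valid distribution;
    # the paper explores expansions up to ten terms.
    e = sum(z**i / math.factorial(i) for i in range(n + 1))
    return e / e.sum()

def sm_taylor_softmax(z, target, n=2, m=0.5):
    # Soft-margin idea applied to Taylor softmax (training time only):
    # subtract a margin m from the target-class logit before normalizing,
    # which forces the network to separate classes more aggressively.
    # The margin value 0.5 is illustrative.
    z = z.copy()
    z[target] -= m
    return taylor_softmax(z, n=n)
```

For example, on logits `[2.0, 1.0, 0.1]` the order-2 Taylor softmax yields a flatter distribution than the standard softmax, and the soft-margin variant further lowers the probability assigned to the target class during training.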

CC BY-NC-ND 4.0


Paper citation in several formats:
Banerjee, K.; C., V.; Gupta, R.; Vyas, K.; H., A. and Mishra, B. (2021). Exploring Alternatives to Softmax Function. In Proceedings of the 2nd International Conference on Deep Learning Theory and Applications - DeLTA; ISBN 978-989-758-526-5; ISSN 2184-9277, SciTePress, pages 81-86. DOI: 10.5220/0010502000810086

@conference{delta21,
author={Kunal Banerjee and Vishak Prasad C. and Rishi Raj Gupta and Kartik Vyas and Anushree H. and Biswajit Mishra},
title={Exploring Alternatives to Softmax Function},
booktitle={Proceedings of the 2nd International Conference on Deep Learning Theory and Applications - DeLTA},
year={2021},
pages={81-86},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0010502000810086},
isbn={978-989-758-526-5},
issn={2184-9277},
}

TY - CONF

JO - Proceedings of the 2nd International Conference on Deep Learning Theory and Applications - DeLTA
TI - Exploring Alternatives to Softmax Function
SN - 978-989-758-526-5
IS - 2184-9277
AU - Banerjee, K.
AU - C., V.
AU - Gupta, R.
AU - Vyas, K.
AU - H., A.
AU - Mishra, B.
PY - 2021
SP - 81
EP - 86
DO - 10.5220/0010502000810086
PB - SciTePress