
Author: Kart-Leong Lim

Affiliation: Nanyang Technological University, 50 Nanyang Ave 639788, Singapore

Keyword(s): Bayesian Nonparametrics, Dirichlet Process Mixture, Stochastic Gradient Ascent, Fisher Information.

Abstract: Variational inference for Bayesian mixture models such as the Dirichlet process mixture does not scale to very large datasets, since learning requires a pass over the entire dataset at each iteration. Recently, a scalable variant, stochastic variational inference, addressed this issue by performing local learning on minibatches randomly sampled from the full dataset at each iteration. The main problem with stochastic variational inference is that it still relies on the closed-form updates of variational inference. Stochastic gradient ascent is a modern approach to machine learning and is widely deployed in the training of deep neural networks. It has two attractive properties: it runs on minibatches, and it does not rely on closed-form updates. In this work, we explore stochastic gradient ascent as a baseline for learning Bayesian mixture models such as the Dirichlet process mixture. However, stochastic gradient ascent alone is not optimal in terms of convergence. Instead, we focus on stochastic gradient ascent techniques that use a decaying step size to improve convergence. We consider two methods: the commonly known momentum approach, and the natural gradient approach, which uses an adaptive step size obtained by computing the Fisher information. We also show that our stochastic gradient ascent approach for training the Dirichlet process mixture is compatible with deep ConvNet features and applicable to large-scale datasets such as Caltech256 and SUN397. Lastly, we justify our claims by comparing our method to an existing closed-form learner for the Dirichlet process mixture on these datasets.
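The abstract contrasts plain stochastic gradient ascent with two step-size-controlled variants: momentum, and natural gradient, which preconditions the gradient by the inverse Fisher information. The following is a minimal sketch of those two update rules on a toy problem (estimating a Gaussian mean from minibatches), not the paper's Dirichlet process mixture model; all variable names and hyperparameter values here are illustrative assumptions.

```python
import numpy as np

# Toy example: maximize the average Gaussian log-likelihood of a mean mu
# from minibatches, comparing momentum SGA with natural-gradient SGA.
# For this model the per-sample Fisher information of mu is 1 / sigma**2,
# so the natural-gradient step rescales the gradient by its inverse.

rng = np.random.default_rng(0)
sigma = 2.0
x = rng.normal(loc=3.0, scale=sigma, size=1000)  # synthetic data, true mean 3

def grad(mu, batch):
    # Gradient of the average Gaussian log-likelihood w.r.t. mu
    return np.mean(batch - mu) / sigma**2

def sga_momentum(steps=300, lr=0.5, beta=0.9, batch_size=50):
    mu, v = 0.0, 0.0
    for _ in range(steps):
        batch = rng.choice(x, size=batch_size)
        v = beta * v + lr * grad(mu, batch)  # accumulate a velocity term
        mu += v
    return mu

def sga_natural(steps=300, lr=0.5, batch_size=50):
    mu = 0.0
    fisher = 1.0 / sigma**2  # per-sample Fisher information of mu
    for _ in range(steps):
        batch = rng.choice(x, size=batch_size)
        mu += lr * grad(mu, batch) / fisher  # precondition by inverse Fisher
    return mu

print(sga_momentum())
print(sga_natural())
```

Both variants should approach the sample mean near 3.0; the natural-gradient step is invariant to the scale of sigma because the Fisher preconditioning cancels the 1/sigma**2 factor in the gradient.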

CC BY-NC-ND 4.0


Paper citation in several formats:
Lim, K. (2020). Variational Inference of Dirichlet Process Mixture using Stochastic Gradient Ascent. In Proceedings of the 9th International Conference on Pattern Recognition Applications and Methods - ICPRAM; ISBN 978-989-758-397-1; ISSN 2184-4313, SciTePress, pages 33-42. DOI: 10.5220/0008944300330042

@conference{icpram20,
author={Kart{-}Leong Lim},
title={Variational Inference of Dirichlet Process Mixture using Stochastic Gradient Ascent},
booktitle={Proceedings of the 9th International Conference on Pattern Recognition Applications and Methods - ICPRAM},
year={2020},
pages={33-42},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0008944300330042},
isbn={978-989-758-397-1},
issn={2184-4313},
}

TY - CONF

JO - Proceedings of the 9th International Conference on Pattern Recognition Applications and Methods - ICPRAM
TI - Variational Inference of Dirichlet Process Mixture using Stochastic Gradient Ascent
SN - 978-989-758-397-1
IS - 2184-4313
AU - Lim, K.
PY - 2020
SP - 33
EP - 42
DO - 10.5220/0008944300330042
PB - SciTePress