# Variational Inference of Dirichlet Process Mixture using Stochastic Gradient Ascent

### Kart-Leong Lim

#### 2020

#### Abstract

The variational inference of Bayesian mixture models such as the Dirichlet process mixture is not scalable to very large datasets, since each iteration of learning requires computation over the entire dataset. Recently, a scalable variant, stochastic variational inference, has addressed this issue by performing local learning on randomly sampled batches of the full dataset (minibatches) at each iteration. The main problem with stochastic variational inference is that it still relies on the closed-form updates of variational inference. Stochastic gradient ascent is a modern approach to machine learning and is widely deployed in the training of deep neural networks. It has two interesting properties: first, it runs on minibatches, and second, it does not rely on closed-form updates. In this work, we explore stochastic gradient ascent as a baseline for learning Bayesian mixture models such as the Dirichlet process mixture. However, stochastic gradient ascent alone is not optimal in terms of convergence. Instead, we turn our focus to stochastic gradient ascent techniques that use a decaying step size to improve convergence. We consider two methods: the well-known momentum approach and the natural gradient approach, which uses an adaptive step size computed from the Fisher information. We also show that our new stochastic gradient ascent approach for training the Dirichlet process mixture is compatible with deep ConvNet features and applicable to large-scale datasets such as Caltech256 and SUN397. Lastly, we justify our claims by comparing our method to an existing closed-form learner for the Dirichlet process mixture on these datasets.
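The momentum variant of stochastic gradient ascent mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; it is a generic momentum ascent update (function name, toy objective, and hyperparameter values are illustrative assumptions), shown here only to make the update rule concrete: a velocity accumulates an exponential average of past gradients, and the parameter moves *up* the gradient to maximize an objective such as the ELBO.

```python
import numpy as np

def sga_momentum_step(param, grad, velocity, lr=0.05, beta=0.9):
    """One stochastic gradient ascent step with momentum.

    Ascent, not descent: the parameter moves in the gradient
    direction to maximize the objective. `beta` controls how much
    past gradient history the velocity retains.
    """
    velocity = beta * velocity + (1.0 - beta) * grad
    param = param + lr * velocity
    return param, velocity

# Toy concave objective: f(x) = -(x - 3)^2, gradient f'(x) = -2(x - 3).
# Repeated momentum ascent steps should drive x toward the maximizer 3.
x, v = 0.0, 0.0
for _ in range(500):
    g = -2.0 * (x - 3.0)
    x, v = sga_momentum_step(x, g, v)
```

In the paper's setting the gradient would instead be a noisy minibatch estimate of the variational objective's gradient, and the natural gradient variant would additionally rescale `grad` by the inverse Fisher information rather than use a fixed `lr`.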

#### Paper Citation

#### in Harvard Style

Lim K. (2020). **Variational Inference of Dirichlet Process Mixture using Stochastic Gradient Ascent**. In *Proceedings of the 9th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM,* ISBN 978-989-758-397-1, pages 33-42. DOI: 10.5220/0008944300330042

#### in Bibtex Style

```bibtex
@conference{icpram20,
  author={Kart-Leong Lim},
  title={Variational Inference of Dirichlet Process Mixture using Stochastic Gradient Ascent},
  booktitle={Proceedings of the 9th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM},
  year={2020},
  pages={33-42},
  publisher={SciTePress},
  organization={INSTICC},
  doi={10.5220/0008944300330042},
  isbn={978-989-758-397-1},
}
```

#### in EndNote Style

```
TY  - CONF
JO  - Proceedings of the 9th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM
TI  - Variational Inference of Dirichlet Process Mixture using Stochastic Gradient Ascent
SN  - 978-989-758-397-1
AU  - Lim K.
PY  - 2020
SP  - 33
EP  - 42
DO  - 10.5220/0008944300330042
```