- Abrol, Farhan
- Senior thesis
- 42 pages
- Blei, David
- Princeton University. Department of Computer Science
- Class year
- Summary note
Stochastic Variational Inference has proven to be a fast and reliable framework for inferring
posterior distributions over large corpora. One of its many applications has been to topic modeling
with the Latent Dirichlet Allocation model. However, it is prone to getting stuck in local optima.
Deterministic annealing has traditionally been applied to Expectation-Maximization algorithms to
converge to better local optima by transforming the objective function with a temperature parameter.
In this paper, I apply the idea of Deterministic Annealing to Stochastic Variational Inference to help
it converge to better local optima.
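The temperature transformation described above can be sketched in a few lines. This is an illustrative sketch, not the thesis's implementation: the schedule, its parameters (`T0`, `decay`), and the split of the objective into an expected-log-joint and an entropy term are assumptions made for exposition. The common deterministic-annealing construction scales the entropy term of the variational objective by a temperature T > 1, which smooths the objective, then cools T toward 1 to recover the standard ELBO.

```python
import math

def temperature_schedule(t, T0=10.0, decay=0.1):
    """Hypothetical exponential cooling schedule: starts at T0 at
    iteration t = 0 and decays toward the untempered value T = 1."""
    return 1.0 + (T0 - 1.0) * math.exp(-decay * t)

def annealed_elbo(expected_log_joint, entropy, T):
    """Annealed variational objective: the entropy of the variational
    distribution is weighted by temperature T. At T = 1 this reduces
    to the standard ELBO; larger T favors higher-entropy (smoother)
    solutions, which helps avoid poor local optima early on."""
    return expected_log_joint + T * entropy
```

In an annealed optimizer, each stochastic gradient step would be taken on `annealed_elbo` with `T = temperature_schedule(t)`, so early iterations optimize a smoothed objective and later iterations optimize the true one.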
I motivate the use of annealing through a statistical physics analogy and derive a general annealed
framework for stochastic variational inference. I then explore this algorithm in relation to the
Latent Dirichlet Allocation model. The results show that across various large datasets, annealing
reaches better optima more quickly. The annealing procedure has free parameters whose impact on
the convergence of the algorithm was also studied.