Abstract
Variational inference is an efficient, popular heuristic used in the context of latent variable models. We provide the first analysis of instances where variational inference algorithms converge to the global optimum, in the setting of topic models. Our initializations are natural, one of them being the initialization used in LDA-c, the most popular implementation of variational inference. Beyond providing intuition into why this heuristic might work in practice, our analysis must contend with the multiplicative, rather than additive, nature of the variational inference updates, which forces us to use non-standard proof arguments that we believe may be of general theoretical interest.
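For readers unfamiliar with the updates the abstract refers to, the sketch below shows the standard mean-field (variational) E-step for LDA, in which the word-topic responsibilities are updated multiplicatively (proportional to a product of a topic-word probability and an exponentiated digamma term) rather than additively. This is the textbook procedure from Blei, Ng, and Jordan (2003), not necessarily the exact variant analyzed in the paper; the function and variable names are illustrative.

```python
import numpy as np
from scipy.special import digamma

def lda_variational_e_step(doc_words, alpha, beta, n_iter=50, tol=1e-6):
    """Mean-field variational updates for a single document in LDA.

    doc_words : 1-D array of word indices appearing in the document
    alpha     : (K,) Dirichlet prior over topic proportions
    beta      : (K, V) topic-word distributions (rows sum to 1)
    Returns the variational Dirichlet parameters gamma (K,) and the
    per-word topic responsibilities phi (N, K).
    """
    K = alpha.shape[0]
    N = len(doc_words)
    gamma = alpha + N / K            # common initialization of gamma
    phi = np.full((N, K), 1.0 / K)   # uniform responsibilities

    for _ in range(n_iter):
        gamma_old = gamma.copy()
        # Multiplicative update: phi_{nk} proportional to
        #   beta[k, w_n] * exp(digamma(gamma_k))
        log_phi = np.log(beta[:, doc_words].T + 1e-100) + digamma(gamma)
        log_phi -= log_phi.max(axis=1, keepdims=True)   # numerical stability
        phi = np.exp(log_phi)
        phi /= phi.sum(axis=1, keepdims=True)
        # Additive update for gamma: gamma_k = alpha_k + sum_n phi_{nk}
        gamma = alpha + phi.sum(axis=0)
        if np.abs(gamma - gamma_old).sum() < tol:
            break
    return gamma, phi
```

The multiplicative form of the phi update is what distinguishes this iteration from, say, gradient-style additive updates, and is the feature the abstract highlights as requiring non-standard proof arguments.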
| | |
|---|---|
| Original language | English (US) |
| Pages (from-to) | 2098-2106 |
| Number of pages | 9 |
| Journal | Advances in Neural Information Processing Systems |
| Volume | 2015-January |
| State | Published - 2015 |
| Event | 29th Annual Conference on Neural Information Processing Systems, NIPS 2015 - Montreal, Canada (Dec 7 2015 → Dec 12 2015) |
All Science Journal Classification (ASJC) codes
- Computer Networks and Communications
- Information Systems
- Signal Processing