Using Variational Inference and MapReduce to Scale Topic Modeling

Title: Using Variational Inference and MapReduce to Scale Topic Modeling
Publication Type: Journal Article
Year of Publication: 2011
Authors: Zhai K, Boyd-Graber J, Asadi N
Journal: arXiv:1107.3765
Date Published: 2011/07/19
Keywords: Computer Science - Artificial Intelligence; Computer Science - Distributed, Parallel, and Cluster Computing
Abstract

Latent Dirichlet Allocation (LDA) is a popular topic modeling technique for exploring document collections. Because of the increasing prevalence of large datasets, there is a need to improve the scalability of LDA inference. In this paper, we propose a technique called MapReduce LDA (Mr. LDA) to accommodate very large document collections in the MapReduce framework. In contrast to other techniques for scaling LDA inference, which use Gibbs sampling, we use variational inference. Our solution efficiently distributes computation and is relatively simple to implement. More importantly, this variational implementation, unlike highly tuned and specialized implementations, is easily extensible. We demonstrate two extensions of the model made possible by this scalable framework: informed priors to guide topic discovery and modeling topics from a multilingual corpus.
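The abstract's core idea, distributing variational inference for LDA over MapReduce, can be illustrated with a minimal sketch: each mapper runs the standard per-document variational E-step (updating the document's Dirichlet parameters gamma and topic responsibilities phi), and reducers sum the per-(topic, word) sufficient statistics needed to update the topic-word distributions. This is not the paper's implementation (which runs on Hadoop); function names, the plain-Python digamma approximation, and all parameters here are illustrative assumptions.

```python
import math
from collections import defaultdict

def digamma(x):
    # Illustrative digamma: recurrence to shift x above 6,
    # then a truncated asymptotic expansion.
    r = 0.0
    while x < 6.0:
        r -= 1.0 / x
        x += 1.0
    return r + math.log(x) - 1.0 / (2 * x) - 1.0 / (12 * x**2) + 1.0 / (120 * x**4)

def e_step_map(doc, log_beta, alpha, n_topics, n_iter=50):
    """Mapper (sketch): variational E-step for one document.
    doc: list of (word_id, count) pairs.
    log_beta[k][w]: current E[log beta_kw] (topic-word log probabilities).
    Returns the document's gamma and per-(topic, word) expected counts."""
    total = sum(c for _, c in doc)
    gamma = [alpha + total / n_topics] * n_topics
    for _ in range(n_iter):
        dig = [digamma(g) for g in gamma]
        new_gamma = [alpha] * n_topics
        for w, c in doc:
            # phi_dwk ∝ exp(E[log theta_dk] + E[log beta_kw])
            phi = [math.exp(dig[k] + log_beta[k][w]) for k in range(n_topics)]
            s = sum(phi)
            for k in range(n_topics):
                new_gamma[k] += c * phi[k] / s
        gamma = new_gamma
    # Emit sufficient statistics keyed by (topic, word) for the reducers.
    dig = [digamma(g) for g in gamma]
    stats = {}
    for w, c in doc:
        phi = [math.exp(dig[k] + log_beta[k][w]) for k in range(n_topics)]
        s = sum(phi)
        for k in range(n_topics):
            stats[(k, w)] = c * phi[k] / s
    return gamma, stats

def reduce_stats(mapper_outputs):
    """Reducer (sketch): sum per-(topic, word) statistics across documents;
    the driver would then use these totals to update the topic-word parameters."""
    totals = defaultdict(float)
    for _, stats in mapper_outputs:
        for key, v in stats.items():
            totals[key] += v
    return totals
```

Because each document's E-step depends only on the current global topic-word parameters, the mappers are embarrassingly parallel, which is what makes the variational approach a natural fit for MapReduce.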

URL: http://arxiv.org/abs/1107.3765