TY - JOUR
T1 - Using Variational Inference and MapReduce to Scale Topic Modeling
JF - arXiv:1107.3765
Y1 - 2011
A1 - Zhai,Ke
A1 - Boyd-Graber,Jordan
A1 - Asadi,Nima
KW - Computer Science - Artificial Intelligence
KW - Computer Science - Distributed, Parallel, and Cluster Computing
AB - Latent Dirichlet Allocation (LDA) is a popular topic modeling technique for exploring document collections. Because of the increasing prevalence of large datasets, there is a need to improve the scalability of inference for LDA. In this paper, we propose a technique called MapReduce LDA (Mr. LDA) to accommodate very large corpus collections in the MapReduce framework. In contrast to other techniques for scaling LDA inference, which use Gibbs sampling, we use variational inference. Our solution efficiently distributes computation and is relatively simple to implement. More importantly, this variational implementation, unlike highly tuned and specialized implementations, is easily extensible. We demonstrate two extensions of the model possible with this scalable framework: informed priors to guide topic discovery and modeling topics from a multilingual corpus.
UR - http://arxiv.org/abs/1107.3765
ER -