We develop Markov topic models (MTMs), a novel family of generative probabilistic models that can learn topics simultaneously from multiple corpora, such as papers from different conferences. We apply Gaussian (Markov) random fields to model the correlations of different corpora. MTMs capture both the internal topic structure within each corpus and the relationships between topics across the corpora. We derive an efficient estimation procedure with variational expectation-maximization. We study the performance of our models on a corpus of abstracts from six different computer science conferences. Our analysis reveals qualitative discoveries that are not possible with traditional topic models, and improved quantitative performance over the state of the art.
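To make the cross-corpus coupling concrete, the following is a minimal sketch, with assumed notation rather than the paper's exact parameterization: let $\beta_k^{(c)}$ denote the parameters of topic $k$ in corpus $c$, let $E$ be an assumed set of corpus pairs treated as edges of the random field, and let $\lambda_{cc'}$ be assumed precision weights. A Gaussian Markov random field prior over the per-corpus versions of a topic could then penalize disagreement between linked corpora:

\[
p\big(\beta_k^{(1)},\ldots,\beta_k^{(C)}\big) \;\propto\; \exp\!\Big(-\tfrac{1}{2}\sum_{(c,c')\in E} \lambda_{cc'}\,\big\|\beta_k^{(c)}-\beta_k^{(c')}\big\|^2\Big).
\]

Under such a prior, each corpus retains its own version of topic $k$ (its internal topic structure) while being smoothed toward the corpora it is linked to, and the coupled topic parameters can be fit alongside the per-document variables with variational expectation-maximization.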