Journal article details
IEEE Access
GLTM: A Global and Local Word Embedding-Based Topic Model for Short Texts
Yuangang Li [1]; Wenxin Liang [2]; Xianchao Zhang [3]; Xinyue Liu [3]; Ran Feng [3]
[1] School of Information Management and Engineering, Shanghai University of Finance and Economics, Shanghai, China
[2] School of Software Engineering, Chongqing University of Posts and Telecommunications, Chongqing, China
[3] School of Software, Dalian University of Technology, Dalian, China
Keywords: text mining; context modeling; natural language processing; topic model; short text
DOI: 10.1109/ACCESS.2018.2863260
Source: DOAJ
【 Abstract 】

Short texts have become a prevalent source of information, and discovering topical information from short text collections is valuable for many applications. Due to the length limitation, conventional topic models based on document-level word co-occurrence information often fail to distill semantically coherent topics from short text collections. On the other hand, word embeddings have been successfully applied as a powerful tool in natural language processing. Word embeddings trained on a large corpus encode general semantic and syntactic information about words, and hence they can be leveraged to guide topic modeling for short text collections as supplementary information for sparse co-occurrence patterns. However, because word embeddings are trained on a large external corpus, the encoded information is not necessarily suitable for the topic model's training data set, a mismatch that most existing models ignore. In this article, we propose a novel global and local word embedding-based topic model (GLTM) for short texts. In the GLTM, we train global word embeddings on a large external corpus and employ the continuous skip-gram model with negative sampling (SGNS) to obtain local word embeddings. Utilizing both the global and local word embeddings, the GLTM can distill semantic relatedness information between words, which the Gibbs sampler can further leverage during inference to strengthen the semantic coherence of topics. Compared with five state-of-the-art short-text topic models on four real-world short-text collections, the proposed GLTM is superior in most cases.
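The abstract describes the core mechanism: local embeddings are trained with SGNS on the short-text collection itself, global embeddings come from a large external corpus, and the two are combined into a word-relatedness signal consulted during Gibbs sampling. The sketch below is a minimal illustration of that idea in Python, not the authors' implementation: the toy corpus, the cosine threshold, the random stand-in for a pretrained global model, and the `related` helper are all illustrative assumptions.

```python
# Illustrative sketch (not the GLTM code): combining global and local word
# embeddings into a relatedness score that a Gibbs sampler could consult.
# Assumes gensim >= 4.x; corpus, threshold, and helper names are hypothetical.
import numpy as np
from gensim.models import Word2Vec

# Toy short-text corpus standing in for the training collection.
short_texts = [
    ["apple", "iphone", "release"],
    ["apple", "fruit", "juice"],
    ["samsung", "iphone", "market"],
]

# Local embeddings: continuous skip-gram with negative sampling (SGNS)
# trained on the short-text collection itself.
local = Word2Vec(short_texts, vector_size=50, sg=1, negative=5,
                 window=5, min_count=1, epochs=50).wv

# Global embeddings would normally be loaded from a model pretrained on a
# large external corpus (e.g., via gensim.downloader); a random stand-in is
# used here so the sketch runs offline.
rng = np.random.default_rng(0)
global_emb = {w: rng.standard_normal(50) for w in local.index_to_key}

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def related(w1, w2, threshold=0.4):
    """Hypothetical relatedness test: two words count as semantically related
    only if both the global and the local embeddings agree (cosine above a
    threshold). The actual GLTM criterion may differ."""
    g = cosine(global_emb[w1], global_emb[w2])
    l = cosine(local[w1], local[w2])
    return g > threshold and l > threshold

print(related("apple", "iphone"))
```

In a full model, such a relatedness signal would bias topic assignments for related words toward the same topic during Gibbs sampling; the sketch only shows how the global and local views might be queried jointly.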

【 License 】

Unknown   
