Reducing explicit semantic representation vectors using Latent Dirichlet Allocation
Authors: Abdulgabbar Saif, Mohd Juzaiddin Ab Aziz, Nazlia Omar
Affiliation: Center for Artificial Intelligence Technology, Faculty of Information Science and Technology, Universiti Kebangsaan Malaysia, 43600 Bangi, Selangor, Malaysia
Journal: Knowledge-Based Systems, 2016, Vol. 100, pp. 145-159
Source database: Elsevier Journal
DOI: 10.1016/j.knosys.2016.03.002
Keywords: Semantic representation; Explicit Semantic Analysis; Topic modeling; Knowledge-based method
Abstract (original language): Explicit Semantic Analysis (ESA) is a knowledge-based method that builds the semantic representation of words from the textual descriptions of concepts in a given knowledge source. Owing to its simplicity and success, ESA has received wide attention from researchers in computational linguistics and information retrieval. However, the representation vectors formed by ESA are generally excessive, high dimensional, and may contain many redundant concepts. In this paper, we introduce a reduced semantic representation method that constructs the semantic interpretation of words as vectors over latent topics derived from the original ESA representation vectors. To model the latent topics, Latent Dirichlet Allocation (LDA) is adapted to...
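The abstract's core idea, reducing high-dimensional ESA concept vectors to compact topic distributions via LDA, can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: the toy ESA matrix is randomly generated, and scikit-learn's `LatentDirichletAllocation` stands in for whatever LDA adaptation the paper describes.

```python
# Hypothetical sketch (not the paper's code): compress ESA-style
# word-by-concept vectors into low-dimensional topic distributions.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(0)

# Toy ESA representation: rows = words, columns = knowledge-source
# concepts; entries are sparse, non-negative relevance weights.
n_words, n_concepts = 20, 500
esa_vectors = rng.poisson(0.05, size=(n_words, n_concepts)).astype(float)

# Fit LDA over the concept dimension, treating each word's concept
# vector as a "document" of concept counts, and project every word
# onto 10 latent topics.
lda = LatentDirichletAllocation(n_components=10, random_state=0)
reduced = lda.fit_transform(esa_vectors)

print(reduced.shape)  # (20, 10): one topic distribution per word
```

Each row of `reduced` is a probability distribution over the latent topics, so the 500-dimensional, largely redundant concept space collapses to 10 interpretable dimensions per word.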
Full-text access: Elsevier (partner)
Impact factor: 4.104 (2012)