
Using Word Sense as a Latent Variable in LDA Can Improve Topic Modeling

Authors: Yunqing Xia 1 ; Guoyu Tang 1 ; Huan Zhao 1 ; Erik Cambria 2 and Thomas Fang Zheng 1

Affiliations: 1 Tsinghua University, China ; 2 National University of Singapore, Singapore

Keyword(s): Topic Modeling, LDA, Latent Variable, Word Sense.

Related Ontology Subjects/Areas/Topics: Applications ; Artificial Intelligence ; Knowledge Engineering and Ontology Development ; Knowledge-Based Systems ; Natural Language Processing ; Pattern Recognition ; Symbolic Systems

Abstract: Since it was proposed, LDA has been successfully used to model text documents. So far, words have been the common features used to induce latent topics, which are later used in document representation. Observation of documents indicates that polysemous words can make the latent topics less discriminative, resulting in less accurate document representation. We thus argue that semantically deterministic word senses can improve the quality of the latent topics. In this work, we propose a series of word sense aware LDA models which use word sense as an extra latent variable in topic induction. Preliminary experiments on document clustering on benchmark datasets show that word sense can indeed improve topic modeling.
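The paper's models sample word sense as an extra latent variable jointly with the topic; the details are in the full text. As a rough illustration of why sense information helps, the sketch below runs a minimal collapsed-Gibbs LDA over pre-disambiguated, sense-tagged tokens (e.g. `bank#finance` vs `bank#river`), so the two senses of "bank" cannot be conflated into one topic feature. The corpus, tags, and sampler are hypothetical toy choices, not the authors' actual model or data.

```python
import random

def lda_gibbs(docs, n_topics, n_words, n_iter=300, alpha=0.1, beta=0.01, seed=0):
    """Minimal collapsed Gibbs sampler for LDA over integer-coded tokens."""
    rng = random.Random(seed)
    ndk = [[0] * n_topics for _ in docs]            # doc-topic counts
    nkw = [[0] * n_words for _ in range(n_topics)]  # topic-word counts
    nk = [0] * n_topics                             # tokens per topic
    z = []                                          # topic assignment per token
    for d, doc in enumerate(docs):                  # random initialization
        zd = []
        for w in doc:
            t = rng.randrange(n_topics)
            zd.append(t)
            ndk[d][t] += 1; nkw[t][w] += 1; nk[t] += 1
        z.append(zd)
    for _ in range(n_iter):                         # resample each token's topic
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                t = z[d][i]
                ndk[d][t] -= 1; nkw[t][w] -= 1; nk[t] -= 1
                weights = [(ndk[d][k] + alpha) * (nkw[k][w] + beta) / (nk[k] + n_words * beta)
                           for k in range(n_topics)]
                t = rng.choices(range(n_topics), weights=weights)[0]
                z[d][i] = t
                ndk[d][t] += 1; nkw[t][w] += 1; nk[t] += 1
    return ndk, nkw

# Toy sense-tagged corpus: the surface word "bank" is split by sense.
corpus = [
    ["bank#finance", "money", "loan", "bank#finance", "interest"],
    ["bank#river", "water", "fish", "bank#river", "shore"],
    ["money", "loan", "interest", "bank#finance"],
    ["water", "shore", "fish", "bank#river"],
]
vocab = sorted({w for doc in corpus for w in doc})
w2i = {w: i for i, w in enumerate(vocab)}
docs = [[w2i[w] for w in doc] for doc in corpus]
ndk, nkw = lda_gibbs(docs, n_topics=2, n_words=len(vocab))
```

With untagged tokens, every occurrence of "bank" would share one row of the topic-word counts; tagging by sense gives each sense its own row, which is the discriminativeness the abstract appeals to.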

CC BY-NC-ND 4.0


Paper citation in several formats:
Xia, Y.; Tang, G.; Zhao, H.; Cambria, E. and Fang Zheng, T. (2014). Using Word Sense as a Latent Variable in LDA Can Improve Topic Modeling. In Proceedings of the 6th International Conference on Agents and Artificial Intelligence - Volume 1: ICAART; ISBN 978-989-758-015-4; ISSN 2184-433X, SciTePress, pages 532-537. DOI: 10.5220/0004889705320537

@conference{icaart14,
author={Yunqing Xia and Guoyu Tang and Huan Zhao and Erik Cambria and Thomas {Fang Zheng}},
title={Using Word Sense as a Latent Variable in LDA Can Improve Topic Modeling},
booktitle={Proceedings of the 6th International Conference on Agents and Artificial Intelligence - Volume 1: ICAART},
year={2014},
pages={532-537},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004889705320537},
isbn={978-989-758-015-4},
issn={2184-433X},
}

TY - CONF

JO - Proceedings of the 6th International Conference on Agents and Artificial Intelligence - Volume 1: ICAART
TI - Using Word Sense as a Latent Variable in LDA Can Improve Topic Modeling
SN - 978-989-758-015-4
IS - 2184-433X
AU - Xia, Y.
AU - Tang, G.
AU - Zhao, H.
AU - Cambria, E.
AU - Fang Zheng, T.
PY - 2014
SP - 532
EP - 537
DO - 10.5220/0004889705320537
PB - SciTePress