2017 article

A Lifelong Learning Topic Model Structured Using Latent Embeddings

2017 11TH IEEE INTERNATIONAL CONFERENCE ON SEMANTIC COMPUTING (ICSC), pp. 260–261.

By: M. Xu, R. Yang, S. Harenberg & N. Samatova

author keywords: Lifelong learning; Topic modeling; Latent embeddings
TL;DR: A latent-embedding-structured lifelong learning topic model (the LLT model) that discovers coherent topics from a corpus by exploiting latent word embeddings to structure the model and mining word-correlation knowledge to assist in topic modeling. (via Semantic Scholar)
Source: Web Of Science
Added: August 6, 2018

We propose a latent-embedding-structured lifelong learning topic model, called the LLT model, to discover coherent topics from a corpus. Specifically, we exploit latent word embeddings to structure our model and mine word correlation knowledge to assist in topic modeling. During each learning iteration, our model learns new word embeddings based on the topics generated in the previous learning iteration. Experimental results demonstrate that our LLT model is able to generate more coherent topics than state-of-the-art methods.
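The iterative scheme described in the abstract — each iteration learns new word embeddings from the previous iteration's topics, which in turn inform the next round of topic modeling — can be sketched roughly as below. This is a minimal illustrative sketch, not the authors' method: the function names and the toy "topic" and "embedding" computations are placeholders for the paper's actual latent-embedding-structured model.

```python
# Hypothetical sketch of the lifelong learning loop: alternate between
# mining topics (guided by current embeddings) and learning embeddings
# from those topics. Both inner functions are toy stand-ins, not the
# LLT model's real inference procedures.
from collections import Counter

def mine_topics(corpus, embeddings, num_topics=2):
    # Toy stand-in for topic inference: rank words by corpus frequency
    # and split them round-robin into topics. A real model would use
    # the embeddings (word-correlation knowledge) to guide assignment.
    counts = Counter(word for doc in corpus for word in doc)
    ranked = [word for word, _ in counts.most_common()]
    return [ranked[i::num_topics] for i in range(num_topics)]

def learn_embeddings(topics):
    # Toy stand-in: "embed" each word as the index of the topic it was
    # assigned to in the previous iteration.
    return {word: t for t, words in enumerate(topics) for word in words}

def lifelong_topic_model(corpus, iterations=3):
    # Each iteration's topics feed the embeddings used next iteration.
    embeddings = {}
    topics = []
    for _ in range(iterations):
        topics = mine_topics(corpus, embeddings)
        embeddings = learn_embeddings(topics)
    return topics, embeddings
```

The point of the loop structure is that knowledge accumulated in one pass (here, the embeddings) persists into the next, which is what distinguishes a lifelong learning model from a single-shot topic model.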