A supervised topic embedding model and its application

PLoS One. 2022 Nov 4;17(11):e0277104. doi: 10.1371/journal.pone.0277104. eCollection 2022.

Abstract

We propose rTopicVec, a supervised topic embedding model that predicts response variables associated with documents by analyzing the text data. Topic modeling leverages document-level word co-occurrence patterns to learn the latent topics of each document. Word embedding, in contrast, is a text analysis technique in which words are mapped into a low-dimensional continuous semantic space by exploiting local word co-occurrence patterns within a small context window. The recently developed topic embedding approach benefits from combining these two techniques by modeling latent topics in a word embedding space. Our proposed rTopicVec and its regularized variant incorporate regression into the topic embedding model so that each document and the numerical label paired with it are modeled jointly. As a result, our models yield topics that are predictive of the response variables and can predict response variables for unlabeled documents. We evaluated the effectiveness of our models through experiments on two regression tasks: predicting stock return rates using news articles provided by Thomson Reuters and predicting movie ratings using movie reviews. Results showed that our models predicted more accurately than three baselines, with statistically significant differences.
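The abstract does not give implementation details, so the following PyTorch sketch is only a rough illustration of the general idea it describes: topics are represented as vectors in a word embedding space, each document mixes topics, and a regression head on the topic proportions ties label prediction to topic learning. The class and parameter names (SupervisedTopicEmbedding, doc_topic_logits, l2) are invented for this example, and the squared-error reconstruction term is a stand-in for the model's actual likelihood; this is not the authors' rTopicVec implementation.

    # Illustrative sketch (not the authors' code): supervised topic embedding
    # with a regression head, trained jointly on text and a numerical label.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SupervisedTopicEmbedding(nn.Module):
        def __init__(self, num_docs, num_topics, embed_dim, l2=0.1):
            super().__init__()
            # Topic vectors live in the same space as the word embeddings.
            self.topic_vecs = nn.Parameter(torch.randn(num_topics, embed_dim) * 0.01)
            # Per-document topic logits; softmax gives topic proportions.
            self.doc_topic_logits = nn.Parameter(torch.zeros(num_docs, num_topics))
            # Regression over topic proportions (the supervised part).
            self.reg = nn.Linear(num_topics, 1)
            self.l2 = l2  # penalty weight, standing in for the regularized variant

        def forward(self, doc_ids, word_vecs, labels):
            # word_vecs: (batch, num_words, embed_dim) pretrained word embeddings
            # of the words in each document; labels: (batch,) numerical responses.
            theta = F.softmax(self.doc_topic_logits[doc_ids], dim=-1)   # (B, K)
            doc_vecs = theta @ self.topic_vecs                          # (B, D)
            # Text term: the document's topic mixture should lie near its words.
            recon = ((word_vecs - doc_vecs.unsqueeze(1)) ** 2).mean()
            # Regression term: predict the response from the topic proportions.
            pred = self.reg(theta).squeeze(-1)
            reg_loss = F.mse_loss(pred, labels)
            penalty = self.l2 * self.reg.weight.pow(2).sum()
            return recon + reg_loss + penalty, pred

    # Toy usage: 100 documents, 10 topics, 50-dim embeddings, 20 words each.
    model = SupervisedTopicEmbedding(num_docs=100, num_topics=10, embed_dim=50)
    opt = torch.optim.Adam(model.parameters(), lr=0.05)
    doc_ids = torch.arange(8)
    word_vecs = torch.randn(8, 20, 50)   # stand-in for pretrained word vectors
    labels = torch.randn(8)              # e.g. stock return rates or movie ratings
    loss, pred = model(doc_ids, word_vecs, labels)
    loss.backward(); opt.step()

Because the reconstruction and regression terms are optimized together, the learned topics are pushed toward directions that also explain the response variable, which mirrors the abstract's claim that the topics are predictive of the labels; the l2 penalty here merely gestures at the regularized variant mentioned in the abstract.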

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Semantics*

Grants and funding

K.E. received a ROIS NII Open Collaborative Research grant (Grant Number: 22FS01), provided by the National Institute of Informatics, Research Organization of Information and Systems: <https://www.nii.ac.jp/en/>. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.