Explainable hybrid word representations for sentiment analysis of financial news

Neural Netw. 2023 Jul:164:115-123. doi: 10.1016/j.neunet.2023.04.011. Epub 2023 Apr 21.

Abstract

Given the growing public interest in the stock and financial markets, sentiment analysis of news and texts related to this sector is of utmost importance. It helps potential investors decide which companies to invest in and what their long-term benefits might be. However, analyzing the sentiment of texts in the financial domain is challenging, given the enormous amount of information available. Existing approaches are unable to capture complex attributes of language, such as word usage, including semantics and syntax across a context, and polysemy within a context. Further, these approaches fail to explain their predictions, which remain opaque to humans. Model interpretability that justifies predictions has remained largely unexplored, yet it has become important for engendering users' trust by providing insight into how a prediction is made. Accordingly, in this paper, we present an explainable hybrid word representation that first augments the data to address the class imbalance issue and then integrates three embeddings to capture polysemy, semantics, and syntax in context. We then feed the proposed word representation to a convolutional neural network (CNN) with attention to capture the sentiment. The experimental results show that our model outperforms several baselines, including classic classifiers and combinations of various word embedding models, in the sentiment analysis of financial news. The results also show that the proposed model outperforms several baselines of word embeddings and contextual embeddings when each is separately fed to a neural network. Further, we demonstrate the explainability of the proposed method through visualization results that explain why a given prediction was made in the sentiment analysis of financial news.
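To make the described architecture concrete, the following is a minimal PyTorch sketch of the pipeline the abstract outlines: three per-token embeddings (e.g., a contextual embedding plus semantic and syntactic word vectors) are concatenated into a hybrid representation and passed through a CNN with an attention pooling layer before the sentiment classifier. The embedding sources, their dimensions (768/300/100), fusion by concatenation, the additive attention, and the three-class output are all illustrative assumptions; the paper's exact embeddings, fusion, and attention design may differ.

import torch
import torch.nn as nn

class HybridCNNAttention(nn.Module):
    """Illustrative sketch (not the paper's exact model): three word-level
    embeddings are concatenated per token, passed through a 1D convolution,
    and pooled with a simple additive attention before a sentiment head."""

    def __init__(self, dim_ctx=768, dim_sem=300, dim_syn=100,
                 num_filters=128, kernel_size=3, num_classes=3):
        super().__init__()
        hybrid_dim = dim_ctx + dim_sem + dim_syn        # concatenated token vector
        self.conv = nn.Conv1d(hybrid_dim, num_filters, kernel_size, padding=1)
        self.attn = nn.Linear(num_filters, 1)           # scores each time step
        self.fc = nn.Linear(num_filters, num_classes)   # sentiment logits

    def forward(self, ctx_emb, sem_emb, syn_emb):
        # each input: (batch, seq_len, dim_*), precomputed by the embedding models
        x = torch.cat([ctx_emb, sem_emb, syn_emb], dim=-1)    # (B, T, hybrid_dim)
        h = torch.relu(self.conv(x.transpose(1, 2)))          # (B, filters, T)
        h = h.transpose(1, 2)                                 # (B, T, filters)
        weights = torch.softmax(self.attn(h), dim=1)          # (B, T, 1) per-token attention
        pooled = (weights * h).sum(dim=1)                     # attention-weighted sum over tokens
        return self.fc(pooled)

# toy usage with random tensors standing in for the real embeddings
if __name__ == "__main__":
    B, T = 2, 20
    model = HybridCNNAttention()
    logits = model(torch.randn(B, T, 768), torch.randn(B, T, 300), torch.randn(B, T, 100))
    print(logits.shape)  # torch.Size([2, 3])

In a sketch of this kind, the per-token attention weights can also be visualized to indicate which words contributed most to a prediction, in the spirit of the explainability visualizations described above.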

Keywords: Contextual embeddings; Explainability; Explainable sentiment analysis; Hybrid word embeddings; Natural Language Processing; XAI.

MeSH terms

  • Humans
  • Language
  • Natural Language Processing
  • Neural Networks, Computer
  • Semantics*
  • Sentiment Analysis*