Self-Organizing Nebulous Growths for Robust and Incremental Data Visualization

IEEE Trans Neural Netw Learn Syst. 2021 Oct;32(10):4588-4602. doi: 10.1109/TNNLS.2020.3023941. Epub 2021 Oct 5.

Abstract

Nonparametric dimensionality reduction techniques, such as t-distributed stochastic neighbor embedding (t-SNE) and uniform manifold approximation and projection (UMAP), are proficient at providing visualizations for data sets of fixed size. However, they cannot incrementally map and insert new data points into an existing data visualization. We present self-organizing nebulous growths (SONG), a parametric nonlinear dimensionality reduction technique that supports incremental data visualization, i.e., incremental addition of new data while preserving the structure of the existing visualization. In addition, SONG can handle new data increments whether they are similar to or heterogeneous with respect to the already observed data distribution. We test SONG on a variety of real and simulated data sets. The results show that SONG is superior to Parametric t-SNE, t-SNE, and UMAP in incremental data visualization. In particular, for heterogeneous increments, SONG improves over Parametric t-SNE by 14.98% on the Fashion MNIST data set and 49.73% on the MNIST data set in cluster quality, as measured by adjusted mutual information scores. For similar (homogeneous) increments, the improvements are 8.36% and 42.26%, respectively. Furthermore, even when the abovementioned data sets are presented all at once, SONG performs better than or comparably to UMAP, and better than t-SNE. We also demonstrate that the algorithmic foundations of SONG render it more tolerant to noise than UMAP and t-SNE, thus providing greater utility for data with high variance, high mixing of clusters, or noise.
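
The following is a minimal sketch of how the cluster-quality comparison described above could be scored: cluster the 2-D embedding of each batch and compute the adjusted mutual information against ground-truth labels. The `embed_batch` placeholder stands for a hypothetical parametric mapper with an incremental fit/transform workflow; it is not the authors' SONG implementation. Only scikit-learn's `KMeans` and `adjusted_mutual_info_score` are real APIs here.

```python
# Illustrative sketch (not the authors' evaluation code): scoring the
# cluster quality of a 2-D embedding with adjusted mutual information (AMI).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_mutual_info_score


def cluster_quality(embedding: np.ndarray, true_labels: np.ndarray,
                    n_clusters: int) -> float:
    """Cluster the low-dimensional embedding and compare the cluster
    assignments against ground-truth class labels using AMI."""
    pred = KMeans(n_clusters=n_clusters, n_init=10,
                  random_state=0).fit_predict(embedding)
    return adjusted_mutual_info_score(true_labels, pred)


# Hypothetical incremental workflow: embed an initial batch, then map a
# later (possibly heterogeneous) increment into the same visualization
# and re-evaluate cluster quality on the combined embedding.
# embedding_0 = embed_batch(X_initial)                 # placeholder mapper
# ami_0 = cluster_quality(embedding_0, y_initial, 10)
# embedding_1 = embed_batch(X_increment, incremental=True)  # placeholder
# ami_1 = cluster_quality(np.vstack([embedding_0, embedding_1]),
#                         np.concatenate([y_initial, y_increment]), 10)
```

In this hedged setup, the percentage improvements quoted in the abstract would correspond to relative differences between such AMI scores computed for SONG and for the baseline methods.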