Online Active Learning for Drifting Data Streams

IEEE Trans Neural Netw Learn Syst. 2023 Jan;34(1):186-200. doi: 10.1109/TNNLS.2021.3091681. Epub 2023 Jan 5.

Abstract

Classification methods for streaming data are not new, but few current frameworks address all three of the most common problems in these tasks: concept drift, noise, and the high cost of labeling the unlabeled instances that arrive in a stream. Motivated by this gap, we developed an active learning framework, called CogDQS, based on a dual-query strategy and Ebbinghaus's law of human memory cognition. The query strategy samples only the most representative instances for manual annotation, selected by local density and uncertainty, which significantly reduces the cost of labeling. The policy for discerning drift from noise and replacing outdated instances with new concepts is based on the three criteria of the Ebbinghaus forgetting curve: recall, the fading period, and memory strength. Simulations comparing CogDQS with baselines on six data streams containing gradual or abrupt drift, with and without noise, show that our approach produces accurate, stable models with good generalization ability at minimal labeling, storage, and computation costs.
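Although the abstract gives no formulas, the minimal Python sketch below illustrates the kind of logic it describes: a dual-query rule that labels an incoming instance only when it is both uncertain and representative (local density), and a forgetting-curve rule that replaces outdated stored instances only after retention has faded and the change looks like drift rather than noise. All function names, thresholds (density_radius, recall_threshold, fading_period, etc.), and the exponential retention form exp(-t/S) are illustrative assumptions, not the paper's actual definitions.

```python
import numpy as np

def forgetting_retention(age, strength):
    """Ebbinghaus-style retention R = exp(-t / S): how well a stored
    instance is 'remembered' after `age` time steps with memory strength S.
    The exponential form is an assumption for illustration."""
    return np.exp(-age / strength)

def dual_query(x, stored_X, predict_proba,
               density_radius=1.0, density_threshold=3,
               uncertainty_threshold=0.2):
    """Decide whether to request a label for incoming instance x.

    Queries only instances that are both uncertain (small margin between
    the top two predicted class probabilities) and representative (enough
    stored neighbours within density_radius). Thresholds are illustrative.
    """
    probs = np.sort(predict_proba(x))[::-1]
    margin = probs[0] - probs[1]                      # uncertainty signal
    dists = np.linalg.norm(stored_X - x, axis=1)
    local_density = np.sum(dists <= density_radius)   # representativeness signal
    return margin < uncertainty_threshold and local_density >= density_threshold

def should_replace(age, strength, drift_evidence,
                   recall_threshold=0.4, fading_period=50):
    """Replace an outdated stored instance with a new concept only when
    (i) its retention has decayed below recall_threshold, (ii) it has
    outlived the fading period, and (iii) accumulated evidence suggests
    genuine drift rather than transient noise."""
    return (forgetting_retention(age, strength) < recall_threshold
            and age > fading_period
            and drift_evidence)

# Hypothetical usage with synthetic data and a dummy probabilistic classifier.
rng = np.random.default_rng(0)
stored = rng.normal(size=(20, 2))
fake_proba = lambda x: np.array([0.55, 0.45])
print(dual_query(np.zeros(2), stored, fake_proba))       # query this instance?
print(should_replace(age=80, strength=30, drift_evidence=True))
```

In this sketch, the two query criteria are combined conjunctively so that labeling budget is spent only on points that are both ambiguous for the current model and typical of the incoming distribution; the three-part replacement test mirrors the abstract's recall, fading-period, and memory-strength criteria as a guard against mistaking noise for drift.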