Optimizing coagulant dosage using deep learning models with large-scale data

Chemosphere. 2024 Feb;350:140989. doi: 10.1016/j.chemosphere.2023.140989. Epub 2023 Dec 20.

Abstract

Water treatment plants face challenges that necessitate a transition to automated processes using advanced technologies. This study introduces a novel approach to optimizing coagulant dosage in water treatment processes by employing a deep learning model. The study utilized minute-by-minute data monitored in real time over a span of five years, marking the first attempt in drinking water process modeling to leverage such a comprehensive dataset. The deep learning model integrates a one-dimensional convolutional neural network (Conv1D) and a gated recurrent unit (GRU) to effectively extract features and model complex time-series data. The model first predicted coagulant dosage and sedimentation basin turbidity, and was validated against a physicochemical model. It then optimized coagulant dosage in two ways: 1) maintaining sedimentation basin turbidity below the 1.0 NTU guideline, and 2) analyzing changes in sedimentation basin turbidity resulting from reduced coagulant dosage (5-20%). The findings highlight the effectiveness of the deep learning model in optimizing coagulant dosage, yielding a reduction of approximately 22% in coagulant use and annual cost savings of about 21 million KRW. The results demonstrate the potential of deep learning models to enhance the efficiency and cost-effectiveness of water treatment processes, ultimately facilitating process automation.
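To illustrate the Conv1D-plus-GRU pipeline described above, the sketch below runs a 1-D convolution over a multivariate sensor time series and feeds the extracted features through a GRU layer whose final hidden state drives a dosage prediction. This is a minimal NumPy illustration with random weights; the layer sizes, kernel width, input variables, and linear output head are assumptions for demonstration, not the architecture reported in the paper.

```python
import numpy as np

def conv1d(x, w, b):
    """1-D convolution: x (T, C_in), w (K, C_in, C_out), b (C_out,) -> (T-K+1, C_out)."""
    T, _ = x.shape
    K, _, C_out = w.shape
    out = np.empty((T - K + 1, C_out))
    for t in range(T - K + 1):
        window = x[t:t + K]                      # (K, C_in) sliding window
        out[t] = np.tensordot(window, w, axes=([0, 1], [0, 1])) + b
    return np.maximum(out, 0.0)                  # ReLU activation

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru(x, Wz, Uz, Wr, Ur, Wh, Uh):
    """Run a GRU over x (T, C); input weights (C, H), recurrent weights (H, H).
    Returns the final hidden state (H,)."""
    H = Wz.shape[1]
    h = np.zeros(H)
    for x_t in x:
        z = sigmoid(x_t @ Wz + h @ Uz)           # update gate
        r = sigmoid(x_t @ Wr + h @ Ur)           # reset gate
        h_tilde = np.tanh(x_t @ Wh + (r * h) @ Uh)
        h = (1 - z) * h + z * h_tilde            # interpolate old/new state
    return h

rng = np.random.default_rng(0)
T, C_in, C_out, H = 60, 4, 8, 16                 # 60 one-minute steps, 4 sensors (assumed)
x = rng.standard_normal((T, C_in))               # e.g. raw-water turbidity, pH, flow, temp
feat = conv1d(x, rng.standard_normal((5, C_in, C_out)) * 0.1, np.zeros(C_out))
h_last = gru(feat, *(rng.standard_normal(s) * 0.1
                     for s in [(C_out, H), (H, H)] * 3))
dose = h_last @ rng.standard_normal(H)           # hypothetical linear head -> dosage
print(feat.shape, h_last.shape)                  # (56, 8) (16,)
```

In practice the two stages play the roles the abstract describes: the convolution extracts local patterns from the minute-by-minute monitoring data, and the GRU models the longer-range temporal dependencies before a regression head outputs the coagulant dosage or settled-water turbidity.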

Keywords: Coagulant dosage; Convolutional neural network; Deep learning model; Gated recurrent unit; Optimization.

MeSH terms

  • Deep Learning*
  • Neural Networks, Computer
  • Water Purification* / methods