An Ultralow-Power Real-Time Machine Learning Based fNIRS Motion Artifacts Detection

IEEE Trans Very Large Scale Integr VLSI Syst. 2024 Jan 30;32(4):763-773. doi: 10.1109/TVLSI.2024.3356161. eCollection 2024 Apr.

Abstract

Due to iterative matrix multiplications or gradient computations, machine learning modules often require substantial processing power and memory. As a result, they are often infeasible for wearable devices, which have limited processing power and memory. In this study, we propose an ultralow-power, real-time, machine learning-based motion artifact detection module for functional near-infrared spectroscopy (fNIRS) systems. We achieved a high classification accuracy of 97.42%, low field-programmable gate array (FPGA) resource utilization of 38,354 lookup tables and 6024 flip-flops, and a low dynamic power consumption of 0.021 W. These results outperform conventional CPU-based support vector machine (SVM) methods and other state-of-the-art SVM implementations. This study demonstrates that an FPGA-based fNIRS motion artifact classifier can meet the low-power and resource constraints that are crucial in embedded hardware systems while maintaining high classification accuracy.
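To make the approach concrete, the sketch below shows one way an SVM-based motion artifact detector for fNIRS signals could be set up. This is a minimal illustration, not the authors' implementation: the windowed features (within-window standard deviation and maximum sample-to-sample jump), the linear kernel, and the synthetic data are all assumptions, since the abstract does not specify the features, kernel, or training set used in the paper.

```python
# Minimal sketch (not the paper's implementation) of an SVM-based
# motion artifact classifier for windowed fNIRS signals.
# Features and data generation here are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def window_features(windows):
    """Compute simple motion-sensitive features per window:
    within-window std and maximum absolute sample-to-sample jump."""
    std = windows.std(axis=1)
    max_jump = np.abs(np.diff(windows, axis=1)).max(axis=1)
    return np.column_stack([std, max_jump])

# Synthetic stand-in data: low-amplitude "clean" windows vs. windows
# corrupted by one abrupt motion spike each (0 = clean, 1 = artifact).
rng = np.random.default_rng(0)
n, win = 200, 25
clean = rng.normal(0.0, 0.01, size=(n, win))
artifact = clean.copy()
artifact[np.arange(n), rng.integers(0, win, size=n)] += rng.normal(0.0, 0.5, size=n)

X = np.vstack([window_features(clean), window_features(artifact)])
y = np.concatenate([np.zeros(n), np.ones(n)])

# A linear kernel keeps the decision function a single dot product,
# the form that maps naturally onto FPGA multiply-accumulate logic
# (an assumption consistent with, but not stated in, the abstract).
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
clf.fit(X, y)
print(f"training accuracy: {clf.score(X, y):.3f}")
```

In a hardware mapping of this kind of classifier, a trained linear SVM reduces at inference time to one dot product plus a bias comparison, which is why it can fit in modest LUT/flip-flop budgets; the specific feature set and quantization used on the FPGA are design choices not detailed in this abstract.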

Keywords: Field-programmable gate array (FPGA); functional near-infrared spectroscopy (fNIRS); low power; machine learning; motion artifact detection; real time; support vector machines (SVMs).

Grants and funding

This work was supported in part by the Department of Orthopaedics and Musculoskeletal Science, and in part by the Wellcome Trust and the Engineering and Physical Sciences Research Council (EPSRC) through the WEISS Center, UCL, under Grant 203145Z/16/Z. The work of Shufan Yang was supported in part by the Royal Academy of Engineering SHED Project under Grant IF2223-172 and in part by the Innovate U.K. KTP under Grant 013191. The work of Hubin Zhao was supported in part by the Royal Society Research Grant RGS\R2\222333 and in part by the Engineering and Physical Sciences Research Council under Grant 13171178 R00287 and Grant EP/W000679/1.