A multimodal framework based on deep belief network for human locomotion intent prediction

Biomed Eng Lett. 2024 Feb 9;14(3):559-569. doi: 10.1007/s13534-024-00351-w. eCollection 2024 May.

Abstract

Accurate prediction of human locomotion intent enables seamless switching of lower-limb exoskeleton controllers across different terrains, helping users walk safely. In this paper, a deep belief network (DBN) was developed to construct a multimodal framework for recognizing locomotion modes and predicting transition tasks. Three fusion strategies (data level, feature level, and decision level) were explored to obtain the optimal network performance, and the method was evaluated on a public dataset. For steady-state locomotion, the best prediction accuracy was 97.64% in user-dependent testing and 96.80% in user-independent testing. For transitions between modes, the system accurately predicted all transition tasks (user-dependent: 96.37%, user-independent: 95.01%). The multimodal framework based on the DBN can therefore accurately predict human locomotion intent, and the experimental results demonstrate the potential of the proposed model for volitional control of lower-limb exoskeletons.
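To make the feature-level fusion idea concrete, the sketch below shows one way a DBN-style classifier could be assembled for locomotion-mode recognition: per-window EMG and IMU features are concatenated and passed through stacked restricted Boltzmann machines with a logistic-regression read-out. This is a minimal illustration, not the authors' implementation; the sensor modalities are taken from the paper's multimodal setting, while all array shapes, feature counts, layer sizes, and hyperparameters are hypothetical placeholders.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler

rng = np.random.default_rng(0)

# Hypothetical per-window features: 500 gait windows,
# 40 EMG features and 24 IMU features per window, 5 locomotion modes.
emg_features = rng.random((500, 40))
imu_features = rng.random((500, 24))
labels = rng.integers(0, 5, size=500)

# Feature-level fusion: concatenate the feature vectors from both modalities.
X = np.hstack([emg_features, imu_features])

# DBN-like stack: RBMs trained greedily layer by layer inside the pipeline,
# followed by a logistic-regression classifier for the locomotion-mode label.
dbn = Pipeline([
    ("scale", MinMaxScaler()),  # RBMs expect inputs in [0, 1]
    ("rbm1", BernoulliRBM(n_components=64, learning_rate=0.05,
                          n_iter=20, random_state=0)),
    ("rbm2", BernoulliRBM(n_components=32, learning_rate=0.05,
                          n_iter=20, random_state=0)),
    ("clf", LogisticRegression(max_iter=1000)),
])

dbn.fit(X, labels)
print("training accuracy:", dbn.score(X, labels))
```

Data-level fusion would instead merge the raw sensor streams before feature extraction, and decision-level fusion would train one such classifier per modality and combine their outputs (e.g., by voting or averaging class probabilities).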

Keywords: Deep belief network; Deep learning; Exoskeletons; Locomotion intent prediction; Multimodal fusion.