Knowledge Transfer-Based Sparse Deep Belief Network

IEEE Trans Cybern. 2023 Dec;53(12):7572-7583. doi: 10.1109/TCYB.2022.3173632. Epub 2023 Nov 29.

Abstract

Deep learning has made remarkable achievements in various applications in recent years. Owing to growing demands on computing power and the "black box" problem of neural networks, however, the development of deep neural networks (DNNs) has entered a bottleneck period. This article proposes a novel deep belief network (DBN) based on knowledge transfer and optimization of the network structure. First, a neural-symbolic model is proposed to extract rules that describe the dynamic operation mechanism of the deep network. Second, knowledge fusion is performed by merging and deleting the rules extracted from the DBN model. Finally, a new DNN, the knowledge transfer-based sparse DBN (KT-SDBN), is constructed to generate a sparse network without excessive information loss. Compared with the DBN, the KT-SDBN has a sparser network structure and better learning performance on existing knowledge and data. Experimental results on benchmark data indicate that the KT-SDBN not only achieves effective feature learning with 30% of the original network parameters but also attains a compression rate far larger than that of other structure-optimization algorithms.
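The abstract's third step, producing a sparse network that keeps roughly 30% of the original parameters, can be illustrated with a minimal sketch. Note the assumptions: KT-SDBN's actual sparsification is guided by the extracted and fused symbolic rules, which are not described here, so this sketch substitutes a simple magnitude-based criterion; the layer shape, `keep_ratio`, and the function `prune_to_ratio` are all hypothetical stand-ins, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical trained DBN layer weights (visible x hidden); stand-in values.
W = rng.normal(size=(784, 256))

def prune_to_ratio(W, keep_ratio=0.3):
    """Zero all but the largest-magnitude `keep_ratio` fraction of weights.

    This is a simplified stand-in for KT-SDBN's rule-guided sparsification:
    the paper selects connections via extracted symbolic rules, whereas this
    sketch uses weight magnitude as the retention criterion.
    """
    k = int(W.size * keep_ratio)                       # number of weights to keep
    threshold = np.partition(np.abs(W).ravel(), -k)[-k]  # k-th largest magnitude
    mask = np.abs(W) >= threshold                      # boolean retention mask
    return W * mask, mask

W_sparse, mask = prune_to_ratio(W, keep_ratio=0.3)
print(f"fraction retained: {mask.mean():.3f}")
```

After pruning, the masked layer would be fine-tuned so the sparse network recovers the representation quality of the dense one, which is the "without excessive information loss" requirement stated above.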