Compact and Stable Memristive Visual Geometry Group Neural Network

IEEE Trans Neural Netw Learn Syst. 2023 Feb;34(2):987-998. doi: 10.1109/TNNLS.2021.3104860. Epub 2023 Feb 3.

Abstract

Because edge-computing platforms require low-power, small-area circuits with artificial intelligence (AI), we design a compact and stable memristive visual geometry group (MVGG) neural network for image classification. Based on the characteristics of matrix-vector multiplication (MVM) in memristor crossbars, we design three pruning methods, named row pruning, column pruning, and parameter distribution pruning. With a loss of only 0.41% in classification accuracy, a pruning rate of 36.87% is achieved. In the MVGG circuit, the batch normalization (BN) layers and dropout layers are folded into the memristive convolutional computing layers to reduce the computational load of the memristive neural network. To further reduce the influence of the memristors' multistate conductance on the classification accuracy of the MVGG circuit, a layer optimization circuit and a channel optimization circuit are designed in this article. Theoretical analysis shows that the proposed optimization methods greatly reduce the impact of the memristors' multistate conductance on the classification accuracy of MVGG circuits. Circuit simulation experiments show that the layer-optimized MVGG circuit essentially matches the accuracy of the full-precision MVGG when the memristors provide 2^5 = 32 conductance states, and that the channel-optimized MVGG circuit does so with only 2^2 = 4 conductance states.
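To make the two circuit-level simplifications in the abstract concrete, the following minimal NumPy sketch illustrates (a) folding a BN layer into the preceding convolution so the combined layer can be programmed as a single memristor crossbar, and (b) quantizing the folded weights onto 2^n discrete conductance states. This is an illustrative sketch only: the function names, tensor shapes, and the uniform quantizer are assumptions for exposition, not the paper's exact pruning or optimization scheme.

```python
import numpy as np

def fold_bn_into_conv(W, b, gamma, beta, mean, var, eps=1e-5):
    """Fold a BatchNorm layer into the preceding convolution.

    W: (out_ch, in_ch, kH, kW) conv weights; b: (out_ch,) conv bias.
    gamma, beta, mean, var: per-output-channel BN parameters.
    Returns folded (W', b') so conv + BN collapse into one linear layer,
    which can then be mapped onto a single crossbar.
    """
    scale = gamma / np.sqrt(var + eps)            # per-channel scale factor
    W_folded = W * scale[:, None, None, None]     # scale each output channel
    b_folded = (b - mean) * scale + beta
    return W_folded, b_folded

def quantize_to_conductance_levels(W, n_bits):
    """Map weights onto 2**n_bits discrete states (uniform quantizer).

    In a crossbar, a signed weight is typically realized as the difference
    of two memristor conductances; here we simply quantize the signed
    weight range uniformly as an illustration of limited multistate
    conductance.
    """
    levels = 2 ** n_bits
    w_max = np.abs(W).max()
    step = 2 * w_max / (levels - 1)               # uniform step over [-w_max, w_max]
    return np.round(W / step) * step

# Example: fold BN, then quantize to 2**5 = 32 states (layer-optimized case).
rng = np.random.default_rng(0)
W = rng.standard_normal((64, 3, 3, 3)) * 0.1
b = np.zeros(64)
gamma, beta = np.ones(64), np.zeros(64)
mean, var = np.zeros(64), np.ones(64)

W_f, b_f = fold_bn_into_conv(W, b, gamma, beta, mean, var)
W_q = quantize_to_conductance_levels(W_f, n_bits=5)
print("max quantization error:", np.abs(W_q - W_f).max())
```

With more conductance states (larger n_bits) the quantization error shrinks, which is the trade-off the layer-optimized (2^5 states) and channel-optimized (2^2 states) circuits address.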