HMC: Hybrid model compression method based on layer sensitivity grouping

PLoS One. 2023 Oct 9;18(10):e0292517. doi: 10.1371/journal.pone.0292517. eCollection 2023.

Abstract

Previous studies have shown that deep models are often over-parameterized, and this parameter redundancy makes deep compression possible. The redundancy of model weights typically manifests as low rank and sparsity. Ignoring either of these two characteristics, or their differing distributions across the model, leads to low accuracy and a low compression rate. To make full use of the difference between low rank and sparsity, a unified framework combining low-rank tensor decomposition and structured pruning is proposed: a hybrid model compression method based on layer sensitivity grouping (HMC). This framework unifies the existing additive hybrid compression method (AHC) and the non-additive hybrid compression method (NaHC) proposed by us into one model. The latter groups the network layers according to how sensitive each convolutional layer is to the different compression methods, and can therefore integrate the low-rank and sparse structure of the model better than the former. Experiments show that, when compressing the ResNet family of models, our approach achieves a better trade-off between test accuracy and compression ratio than other recent compression methods that use a single strategy or additive hybrid compression.
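
The abstract does not give the exact grouping criterion or implementation. The PyTorch sketch below illustrates one plausible reading of sensitivity-based grouping: probe each convolutional layer with a low-rank (SVD) surrogate and a structured-pruning surrogate, measure the accuracy drop on a validation set, and assign the layer to the method it tolerates better. The `evaluate` callback, the rank/keep ratios, and the shape-preserving pruning probe are assumptions for illustration, not the paper's actual method.

```python
# Illustrative sketch only: the HMC grouping rule and hyperparameters are not
# specified in this abstract. `evaluate(model) -> accuracy` is a hypothetical
# user-supplied validation function.
import copy
import torch
import torch.nn as nn


def lowrank_surrogate(conv: nn.Conv2d, rank_ratio: float = 0.5) -> nn.Sequential:
    """Replace a conv with an SVD-based rank-r factorization: KxK conv -> 1x1 conv."""
    out_c, in_c, kh, kw = conv.weight.shape
    rank = max(1, int(min(out_c, in_c * kh * kw) * rank_ratio))
    w = conv.weight.detach().reshape(out_c, -1)              # [out, in*kh*kw]
    u, s, vh = torch.linalg.svd(w, full_matrices=False)
    first = nn.Conv2d(in_c, rank, (kh, kw), stride=conv.stride,
                      padding=conv.padding, bias=False)
    second = nn.Conv2d(rank, out_c, 1, bias=conv.bias is not None)
    first.weight.data = vh[:rank].reshape(rank, in_c, kh, kw).clone()
    second.weight.data = (u[:, :rank] * s[:rank]).reshape(out_c, rank, 1, 1).clone()
    if conv.bias is not None:
        second.bias.data = conv.bias.data.clone()
    return nn.Sequential(first, second)


def pruning_surrogate(conv: nn.Conv2d, keep_ratio: float = 0.5) -> nn.Conv2d:
    """Zero out the filters with the smallest L1 norm (shape-preserving pruning probe)."""
    pruned = copy.deepcopy(conv)
    keep = max(1, int(conv.out_channels * keep_ratio))
    norms = pruned.weight.detach().abs().sum(dim=(1, 2, 3))
    drop = norms.argsort()[: conv.out_channels - keep]       # weakest filters
    pruned.weight.data[drop] = 0.0
    if pruned.bias is not None:
        pruned.bias.data[drop] = 0.0
    return pruned


def group_by_sensitivity(model: nn.Module, evaluate) -> dict:
    """Assign each conv layer to the compression method it is least sensitive to."""
    baseline = evaluate(model)
    groups = {}
    for name, module in model.named_modules():
        if not isinstance(module, nn.Conv2d):
            continue
        drops = {}
        for tag, fn in (("lowrank", lowrank_surrogate), ("prune", pruning_surrogate)):
            trial = copy.deepcopy(model)
            parent = trial.get_submodule(name.rsplit(".", 1)[0]) if "." in name else trial
            setattr(parent, name.rsplit(".", 1)[-1], fn(module))
            drops[tag] = baseline - evaluate(trial)
        groups[name] = min(drops, key=drops.get)              # smaller drop = less sensitive
    return groups
```

Under this reading, the resulting `groups` dictionary would drive which compression strategy is applied to each layer before fine-tuning; the actual HMC criterion and thresholds are described in the full paper.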

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Data Compression*
  • Physical Phenomena

Grants and funding

1. GL Y; GJJ190450; Jiangxi Provincial Department of Education
2. GL Y; GJJ180484; Jiangxi Provincial Department of Education

The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.