No-Reference Quality Assessment for Screen Content Images Using Visual Edge Model and AdaBoosting Neural Network

IEEE Trans Image Process. 2021;30:6801-6814. doi: 10.1109/TIP.2021.3098245. Epub 2021 Jul 30.

Abstract

In this paper, a competitive no-reference metric is proposed to assess the perceptual quality of screen content images (SCIs), built on a human visual edge model and an AdaBoosting neural network. Motivated by the existing theory that the edge information reflecting the visual quality of an SCI is effectively captured by the difference-of-Gaussian (DOG) model of the human visual system, we first compute two types of multi-scale edge maps via the DOG operator; the two types of edge maps carry contour and edge information, respectively. Then, after locally normalizing the edge maps, L-moments distribution estimation is used to fit their DOG coefficients, and the fitted L-moments parameters serve as edge features. Finally, an AdaBoosting back-propagation neural network (ABPNN) maps these quality-aware features to the perceptual quality score of the SCI. The ABPNN is chosen because we abandon a shallow regression network in favor of a deeper regression architecture, which yields better generalization ability. The proposed method delivers highly competitive performance and shows high consistency with the human visual system (HVS) on public SCI-oriented databases.
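To make the feature-extraction pipeline described above concrete, the sketch below illustrates one plausible realization in Python: multi-scale DOG responses, divisive local normalization, and sample L-moments of the normalized coefficients as edge features. The specific scales, normalization window, DOG ratio, and moment count are assumptions for illustration and are not taken from the paper; the fitted features would then be fed to an AdaBoosting back-propagation regressor to predict the quality score.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter


def dog_edge_maps(img, sigmas=(1.0, 2.0, 4.0), k=1.6):
    """Multi-scale difference-of-Gaussian (DOG) responses of a grayscale image.
    The scales and the ratio k are illustrative choices, not the paper's settings."""
    return [gaussian_filter(img, s) - gaussian_filter(img, k * s) for s in sigmas]


def local_normalize(x, win=7, eps=1e-6):
    """Divisive normalization by the local mean and local standard deviation."""
    mu = uniform_filter(x, win)
    var = np.maximum(uniform_filter(x * x, win) - mu * mu, 0.0)
    return (x - mu) / (np.sqrt(var) + eps)


def sample_l_moments(x, n_moments=4):
    """First four sample L-moments, computed from probability-weighted moments."""
    x = np.sort(np.asarray(x, dtype=np.float64).ravel())
    n = x.size
    b = np.zeros(n_moments)
    b[0] = x.mean()
    for r in range(1, n_moments):
        i = np.arange(r + 1, n + 1)  # 1-based ranks i > r
        w = np.prod([(i - j) / (n - j) for j in range(1, r + 1)], axis=0)
        b[r] = np.sum(w * x[r:]) / n
    return np.array([
        b[0],
        2 * b[1] - b[0],
        6 * b[2] - 6 * b[1] + b[0],
        20 * b[3] - 30 * b[2] + 12 * b[1] - b[0],
    ])


def edge_features(img):
    """Concatenate L-moment parameters of locally normalized DOG maps."""
    img = np.asarray(img, dtype=np.float64)
    return np.concatenate(
        [sample_l_moments(local_normalize(dog)) for dog in dog_edge_maps(img)]
    )


if __name__ == "__main__":
    demo = np.random.rand(128, 128)  # stand-in for a grayscale SCI
    print(edge_features(demo).shape)  # 3 scales x 4 L-moments = 12 features
```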