Multiple color representation and fusion for diabetes mellitus diagnosis based on back tongue images

Comput Biol Med. 2023 Mar;155:106652. doi: 10.1016/j.compbiomed.2023.106652. Epub 2023 Feb 14.

Abstract

Tongue images have been shown to be effective for diabetes mellitus (DM) diagnosis. Because it does not require collecting a blood sample, tongue-image-based diagnosis is non-invasive and convenient for patients. Meanwhile, tongue color plays an important role in aiding accurate diagnosis. However, tongue colors fall within a small color gamut, which makes it difficult for existing color descriptors to identify and distinguish the tiny differences among tongues. To tackle this problem, we introduce a novel color descriptor that represents colors by the clustering centers, namely color centroid points, of the color points sampled from tongue images. To boost the capacity of the descriptor, we extend it to three color spaces, i.e., RGB, HSV, and LAB, to mine a rich set of color information and exploit the complementary information among the three spaces. Since correlation and complementarity exist among the features extracted from the three color spaces, we propose a novel multiple-color-feature fusion method for DM diagnosis. In particular, two projections are learned to map the multiple features into their corresponding shared and specific subspaces, in which their similarity and diversity are first measured by the Euclidean distance and the Hilbert-Schmidt Independence Criterion (HSIC), respectively. To fully exploit the similar and complementary information, the two components are jointly transformed to their label vector, efficiently embedding the discriminant prior into the model and leading to a significant improvement in the diagnosis outcomes. Experimental results on a clinical tongue dataset substantiate the effectiveness of the proposed clustering-based color descriptor and the proposed multiple-color fusion approach. Overall, the proposed pipeline for DM diagnosis using back tongue images achieved an average accuracy of up to 93.38%, indicating its potential toward realization of a clinical diagnostic tool for DM. Without loss of generality, we also assessed the performance of the novel multiple-feature fusion method on two public datasets; the experiments confirm the superiority of our multiple-feature learning model in general real-life applications.
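The abstract does not give implementation details for the color centroid descriptor. As a minimal sketch of the idea only, the following assumes k-means clustering, OpenCV color-space conversions, and hypothetical parameters (n_centroids, n_samples) that are not taken from the paper; it clusters sampled pixel colors in RGB, HSV, and LAB and concatenates the cluster centers into one feature vector per image.

    import numpy as np
    import cv2
    from sklearn.cluster import KMeans

    def color_centroid_descriptor(image_bgr, n_centroids=8, n_samples=2000, seed=0):
        """Cluster sampled pixel colors in RGB, HSV and LAB; concatenate the
        cluster centers (color centroid points) into a single descriptor."""
        rng = np.random.default_rng(seed)
        spaces = {
            "rgb": cv2.cvtColor(image_bgr, cv2.COLOR_BGR2RGB),
            "hsv": cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV),
            "lab": cv2.cvtColor(image_bgr, cv2.COLOR_BGR2LAB),
        }
        h, w = image_bgr.shape[:2]
        idx = rng.choice(h * w, size=min(n_samples, h * w), replace=False)
        parts = []
        for name, img in spaces.items():
            points = img.reshape(-1, 3).astype(np.float64)[idx]
            km = KMeans(n_clusters=n_centroids, n_init=10, random_state=seed).fit(points)
            # Sort centroids by cluster size so the descriptor layout is stable.
            order = np.argsort(-np.bincount(km.labels_, minlength=n_centroids))
            parts.append(km.cluster_centers_[order].ravel())
        return np.concatenate(parts)  # length: 3 spaces * n_centroids * 3 channels

    # Usage (hypothetical file name): one descriptor per back tongue image.
    # img = cv2.imread("back_tongue.png")
    # feat = color_centroid_descriptor(img)

Likewise, the fusion model can be read as learning, for each color space v with feature matrix X_v, a shared projection P_v and a specific projection Q_v, with the Euclidean distance aligning the shared parts, HSIC keeping the specific parts diverse, and both parts jointly regressed onto the label matrix Y. The objective below is only an assumed general form consistent with that description; the exact formulation, weights, and regularizers of the paper are not reproduced here.

    \min_{\{P_v, Q_v\}_{v=1}^{V},\, W}\;
        \sum_{u<v} \| X_u P_u - X_v P_v \|_F^2                       % similarity of shared parts (Euclidean distance)
      + \lambda_1 \sum_{u \neq v} \mathrm{HSIC}(X_u Q_u,\, X_v Q_v)  % diversity of specific parts
      + \lambda_2 \bigl\| [\, \bar{S},\, X_1 Q_1, \dots, X_V Q_V \,] W - Y \bigr\|_F^2,
    \qquad \bar{S} = \tfrac{1}{V}\sum_{v=1}^{V} X_v P_v,

    \mathrm{HSIC}(A, B) = (n-1)^{-2}\, \mathrm{tr}(K_A H K_B H),
    \qquad H = I_n - \tfrac{1}{n}\mathbf{1}\mathbf{1}^{\top}.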

Keywords: Back tongue images; Color descriptor; Diabetes mellitus diagnosis; Hilbert Schmidt Independence Criterion; Multiple features learning; The diversity and similarity.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Cluster Analysis
  • Color
  • Diabetes Mellitus*
  • Humans
  • Tongue*