Automatic Video-Oculography System for Detection of Minimal Hepatic Encephalopathy Using Machine Learning Tools

Sensors (Basel). 2023 Sep 25;23(19):8073. doi: 10.3390/s23198073.

Abstract

This article presents an automatic gaze-tracker system to assist in the detection of minimal hepatic encephalopathy by analyzing eye movements with machine learning tools. To record eye movements, we used video-oculography technology and developed automatic feature-extraction software, as well as a machine learning algorithm, to assist clinicians in diagnosis. To validate the procedure, we selected a sample (n=47) of cirrhotic patients, approximately half of whom were diagnosed with minimal hepatic encephalopathy (MHE), a common neurological impairment in patients with liver disease. Using the current gold standard, the Psychometric Hepatic Encephalopathy Score (PHES) battery, patients were classified into two groups: cirrhotic patients with MHE and those without MHE. Eye movement tests were carried out on all participants. Using classical statistical tests, we analyzed the significance of 150 eye movement features, and the most relevant (p-values ≤ 0.05) were selected for training the machine learning algorithms. In summary, while the PHES battery is a time-consuming examination (25–40 min per patient) that requires expert training and is not amenable to longitudinal analysis, automatic video-oculography is a simple test that takes 7–10 min per patient and achieves a sensitivity and specificity of 93%.
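The pipeline the abstract describes (many candidate features, univariate significance screening at p ≤ 0.05, then a classifier evaluated by sensitivity and specificity) can be sketched as follows. This is a minimal illustration on synthetic data, not the paper's actual method: the group sizes, the "informative" feature indices, the |t| > 2 cut-off (a rough stand-in for a two-sided p ≤ 0.05 test at these sample sizes), and the nearest-centroid classifier are all assumptions, since the abstract does not specify the algorithm used.

```python
import random
import statistics

random.seed(0)

# Synthetic stand-in for the study data: 47 "patients", 150 eye-movement
# features each. All numbers and feature indices here are illustrative.
N_MHE, N_NO_MHE, N_FEATURES = 24, 23, 150
INFORMATIVE = {3, 17, 42, 99}  # hypothetical discriminative features

def make_patient(has_mhe):
    # Informative features are shifted by one standard deviation in MHE patients.
    return [random.gauss(1.0 if (has_mhe and f in INFORMATIVE) else 0.0, 1.0)
            for f in range(N_FEATURES)]

mhe = [make_patient(True) for _ in range(N_MHE)]
no_mhe = [make_patient(False) for _ in range(N_NO_MHE)]

def welch_t(a, b):
    """Welch's t-statistic for two independent samples."""
    va, vb = statistics.variance(a), statistics.variance(b)
    return (statistics.fmean(a) - statistics.fmean(b)) / (
        (va / len(a) + vb / len(b)) ** 0.5)

# Univariate screening: keep features whose |t| exceeds ~2.0, roughly the
# two-sided p <= 0.05 threshold for groups of this size.
selected = [f for f in range(N_FEATURES)
            if abs(welch_t([p[f] for p in mhe],
                           [p[f] for p in no_mhe])) > 2.0]

# Nearest-centroid classifier on the selected features (a simple stand-in
# for whatever machine learning algorithm the study trained).
def centroid(group):
    return [statistics.fmean(p[f] for p in group) for f in selected]

c_mhe, c_no = centroid(mhe), centroid(no_mhe)

def predict_mhe(patient):
    def dist2(c):
        return sum((patient[f] - c[i]) ** 2 for i, f in enumerate(selected))
    return dist2(c_mhe) < dist2(c_no)

# Resubstitution estimates only; a real study would use held-out data.
sensitivity = sum(predict_mhe(p) for p in mhe) / N_MHE
specificity = sum(not predict_mhe(p) for p in no_mhe) / N_NO_MHE
print(f"selected {len(selected)} features, "
      f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```

Note that this sketch evaluates on the training data, which inflates performance; reported figures like the paper's 93% sensitivity and specificity would come from proper validation.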

Keywords: automatic video-oculography system; brain functionality; diagnosis; machine learning; medical applications.

MeSH terms

  • Hepatic Encephalopathy* / diagnosis
  • Humans
  • Liver Cirrhosis
  • Psychometrics / methods

Grants and funding

This research was partially funded by RoboCity2030-DIH-CM Madrid Robotics Digital Innovation Hub ("Robotics applied to improving citizens' quality of life, Phase IV"; S2018/NMT-4331), funded by Comunidad de Madrid and co-funded by Structural Funds of the EU; by grants from Ministerio de Ciencia e Innovación (PID2022-136625OB-I00); Universidad de Valencia, Ayudas para Acciones Especiales (UV-INV AE-2633839) to C.M.; Agencia Valenciana de Innovación, Generalitat Valenciana (Consolidació Cadena Valor) to C.M.; Consellería de Educación, Generalitat Valenciana (CIPROM2021/082), co-funded by European Regional Development Funds (ERDF); and by an F. Sarabia donation (PRV00225) to C.M.