User activity recognition system to improve the performance of environmental control interfaces: a pilot study with patients

J Neuroeng Rehabil. 2019 Jan 16;16(1):10. doi: 10.1186/s12984-018-0477-5.

Abstract

Background: Assistive technologies aim to increase quality of life and reduce dependence on caregivers and on the long-term care system. Several studies have demonstrated the effectiveness of assistive technology for environmental control and communication systems. Progress in brain-computer interface (BCI) research, together with exoskeletons, enables a person with motor impairment to interact with new elements in the environment. This paper aims to evaluate the environment control interface (ECI) developed under the AIDE project: a multimodal interface able to analyze and extract relevant information from the environment as well as from the identification of the residual abilities, behaviors, and intentions of the user.

Methods: This study evaluated the ECI in a simulated scenario using a two-screen layout: one screen with the ECI and the other with a simulated home environment developed for this purpose. The sensorimotor rhythms and the horizontal oculoversion, acquired through BCI2000, a multipurpose standard BCI platform, were used to control the ECI online after user training and system calibration. Eight subjects with different neurological diseases and spinal cord injury participated in this study. The subjects performed simulated activities of daily living (ADLs), i.e., actions in the simulated environment such as drinking, switching on a lamp, or raising the bed head, for ten minutes in two different modes: AIDE mode, which uses a prediction model to recognize the user's intention and facilitate scanning, and Manual mode, without a prediction model.

Results: The results show that the mean task time in the AIDE mode was lower than in the Manual mode, i.e., the users were able to perform more tasks in the AIDE mode during the same time. The difference was statistically significant (p < 0.001). Regarding the steps, i.e., the number of abstraction levels crossed in the ECI to perform an ADL, the users performed a single step in 90% of the tasks using the AIDE mode, whereas at least three steps were necessary in the Manual mode. The user's intention was predicted through conditional random fields (CRF), with a global accuracy of about 87%.
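The abstract does not specify the CRF model's features or implementation. As an illustrative sketch only, the prediction step of a linear-chain CRF reduces to Viterbi decoding: finding the highest-scoring sequence of labels (here, user intentions) given observed cues. All labels, cues, and scores below are hypothetical placeholders, not values from the study.

```python
# Illustrative only: Viterbi decoding, the prediction step of a linear-chain
# CRF. Intention labels, gaze cues, and scores are hypothetical.

INTENTIONS = ["drink", "lamp", "bed"]  # hypothetical user-intention labels

# Hypothetical log-scores a trained CRF might assign to label transitions.
TRANSITION = {
    ("drink", "drink"): 1.0, ("drink", "lamp"): 0.2, ("drink", "bed"): 0.1,
    ("lamp", "drink"): 0.2,  ("lamp", "lamp"): 1.0,  ("lamp", "bed"): 0.3,
    ("bed", "drink"): 0.1,   ("bed", "lamp"): 0.3,   ("bed", "bed"): 1.0,
}

def emission(observation, state):
    """Hypothetical feature score: how well an observed cue matches a state."""
    CUES = {"gaze_cup": "drink", "gaze_switch": "lamp", "gaze_bed": "bed"}
    return 2.0 if CUES.get(observation) == state else 0.0

def viterbi(observations):
    """Return the highest-scoring intention sequence for the observations."""
    # Score and best-path tables for the first observation.
    score = {s: emission(observations[0], s) for s in INTENTIONS}
    paths = {s: [s] for s in INTENTIONS}
    for obs in observations[1:]:
        new_score, new_paths = {}, {}
        for s in INTENTIONS:
            # Best predecessor for state s at this step.
            prev = max(INTENTIONS, key=lambda p: score[p] + TRANSITION[(p, s)])
            new_score[s] = score[prev] + TRANSITION[(prev, s)] + emission(obs, s)
            new_paths[s] = paths[prev] + [s]
        score, paths = new_score, new_paths
    best = max(INTENTIONS, key=lambda s: score[s])
    return paths[best]

print(viterbi(["gaze_cup", "gaze_cup", "gaze_switch"]))
# → ['drink', 'drink', 'lamp']
```

The transition scores favor staying on the same intention, so a single stray cue does not flip the prediction; this smoothing over time is the main reason to prefer a sequence model such as a CRF over classifying each observation independently.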

Conclusions: The environment analysis and the identification of the user's behaviors can be used to predict the user's intention, opening a new paradigm in the design of ECIs. Although the developed ECI was tested only in a simulated home environment, it can be easily adapted to a real environment, increasing the user's independence at home.

Keywords: Brain injury; Brain-computer interface; Environment control interface; Multimodal system; Spinal-cord injury; User intention prediction.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Activities of Daily Living
  • Adult
  • Brain-Computer Interfaces*
  • Electroencephalography / methods
  • Electrooculography
  • Exoskeleton Device
  • Female
  • Humans
  • Male
  • Pilot Projects
  • Quality of Life
  • Software*
  • Spinal Cord Injuries*
  • User-Computer Interface