Reliability, Validity, and User-Experience of Remote Unsupervised Computerized Neuropsychological Assessments in Community-Living 55- to 75-Year-Olds

J Alzheimers Dis. 2022;90(4):1629-1645. doi: 10.3233/JAD-220665.

Abstract

Background: Self-administered computerized neuropsychological assessments (CNAs) offer lower-cost, more accessible alternatives to traditional in-person assessments, but critical information is lacking on their psychometric properties and on older adults' subjective experience in remote testing environments.

Objective: We used an online brief battery of computerized tasks selected from the Cogstate Brief Battery (CBB) and Cambridge Brain Sciences (CBS) to 1) determine test-retest reliability in an unsupervised setting; 2) examine convergent validity with a comprehensive 'gold standard' paper-and-pencil neuropsychological test battery administered in-person; and 3) explore user-experience of remote computerized testing and individual tests.

Methods: Fifty-two participants (mean age 65.8±5.7 years) completed CBB and CBS tests on their own computers, unsupervised and at home, on three occasions, and visited a research center for an in-person paper-and-pencil assessment. They also completed a user-experience questionnaire.

Results: Test-retest reliabilities varied across individual measures (ICCs = 0.20 to 0.83). Global cognition composites showed excellent reliability (ICCs > 0.8 over 1-month follow-up). A strong relationship was found between a combination of CNA measures and the paper-and-pencil battery (canonical correlation R = 0.87, p = 0.04). Most tests were rated as enjoyable, with easy-to-understand instructions. Ratings of the general experience of online testing were mostly favorable; few participants reported difficulty concentrating (17%) or difficulty using the computer for the tasks (10%), although over one-third (38%) experienced performance anxiety.
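For readers less familiar with these statistics, the sketch below (not the authors' analysis code; the simulated data, column names, and library choices are illustrative assumptions) shows how a test-retest intraclass correlation and a canonical correlation between two test batteries might be estimated in Python.

    # Minimal sketch, assuming pandas/pingouin/scikit-learn; data are simulated.
    import numpy as np
    import pandas as pd
    import pingouin as pg
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(0)
    n, sessions = 52, 3  # 52 participants, 3 unsupervised sessions

    # Long-format scores: one true score per participant plus session noise.
    true_score = rng.normal(0, 1, n)
    long = pd.DataFrame({
        "subject": np.repeat(np.arange(n), sessions),
        "session": np.tile(np.arange(sessions), n),
        "score": np.repeat(true_score, sessions) + rng.normal(0, 0.5, n * sessions),
    })

    # Intraclass correlations for test-retest reliability (e.g., ICC2).
    icc = pg.intraclass_corr(data=long, targets="subject",
                             raters="session", ratings="score")
    print(icc.loc[icc["Type"] == "ICC2", ["Type", "ICC", "CI95%"]])

    # First canonical correlation between a set of CNA measures (X)
    # and a set of paper-and-pencil measures (Y).
    X = rng.normal(size=(n, 4))                               # hypothetical CNA scores
    Y = 0.6 * X[:, :3] + rng.normal(scale=0.8, size=(n, 3))   # hypothetical paper-and-pencil scores
    u, v = CCA(n_components=1).fit(X, Y).transform(X, Y)
    print("First canonical correlation:", np.corrcoef(u[:, 0], v[:, 0])[0, 1])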

Conclusion: A combined brief online battery selected from two CNAs demonstrated robust reliability (for the global composite) and convergent validity against a gold-standard battery, with mostly good usability and acceptability in the remote testing environment.

Keywords: Healthcare acceptability; mental status and dementia tests; neuropsychological tests; psychometrics.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Aged
  • Cognition*
  • Computers*
  • Humans
  • Neuropsychological Tests
  • Psychometrics
  • Reproducibility of Results