Analysing Touchscreen Gestures: A Study Based on Individuals with Down Syndrome Centred on Design for All

Sensors (Basel). 2021 Feb 13;21(4):1328. doi: 10.3390/s21041328.

Abstract

There has been a conscious shift towards developing increasingly inclusive applications. Despite this, most research has focused on supporting people with visual or hearing impairments, and less attention has been paid to cognitive impairments. The purpose of this study is to analyse the touch gestures used on touchscreens and to identify which gestures are suitable for individuals living with Down syndrome (DS) or other forms of physical or cognitive impairment. With this information, app developers can satisfy Design for All (DfA) requirements by selecting adequate gestures from existing gesture sets. Twenty touch gestures were defined for this study, and a sample group of eighteen individuals with Down syndrome took part. A tool was developed to measure the performance of touch gestures, and participants were asked to perform simple tasks that involved the repeated use of these twenty gestures. Three variables were analysed to establish whether they influence the success rates or completion times of gestures, as they could have a collateral effect on the skill with which gestures are performed: Gender, Type of Down syndrome, and Socioeconomic Status. The analysis reveals significant differences in pairwise comparisons between gestures, meaning that individuals with DS cannot perform all gestures with the same ease. Gender and Socioeconomic Status do not influence success rates or completion times, but Type of DS does.
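The abstract does not specify the statistical procedure behind the pairwise comparisons, so the following is only a minimal sketch of how such an analysis could be run. It assumes a within-subjects design (every participant attempts every gesture), hypothetical per-participant success rates, and an illustrative choice of tests (Friedman omnibus test followed by pairwise Wilcoxon signed-rank tests with Holm correction); none of these choices are taken from the paper.

    # Illustrative only: pairwise comparison of per-gesture success rates.
    # Data and test choices are hypothetical, not the authors' pipeline.
    from itertools import combinations

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Hypothetical success rates: 18 participants x 3 example gestures (0..1).
    gestures = ["tap", "drag", "pinch"]
    success = {g: rng.uniform(0.4, 1.0, size=18) for g in gestures}

    # Omnibus test across related samples (same participants, every gesture).
    stat, p_omnibus = stats.friedmanchisquare(*success.values())
    print(f"Friedman chi2={stat:.2f}, p={p_omnibus:.4f}")

    # Pairwise Wilcoxon signed-rank tests with Holm correction.
    pairs = list(combinations(gestures, 2))
    raw_p = [stats.wilcoxon(success[a], success[b]).pvalue for a, b in pairs]
    order = np.argsort(raw_p)
    m = len(raw_p)
    running_max = 0.0
    for rank, idx in enumerate(order):
        p_adj = min(1.0, (m - rank) * raw_p[idx])
        running_max = max(running_max, p_adj)  # keep adjusted p-values monotone
        a, b = pairs[idx]
        print(f"{a} vs {b}: Holm-adjusted p={running_max:.4f}")

A significant omnibus result followed by significant adjusted pairwise p-values would indicate, as the abstract reports, that not all gestures are performed with the same ease. The same structure could be reused with completion times in place of success rates.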

Keywords: Down syndrome; UX guidelines; hand gestures; human computer interaction (HCI); user experience; user-centered design.

MeSH terms

  • Attention
  • Down Syndrome*
  • Gestures*
  • Humans
  • Universal Design*