Implicit detection of user handedness in touchscreen devices through interaction analysis

PeerJ Comput Sci. 2021 Apr 29;7:e487. doi: 10.7717/peerj-cs.487. eCollection 2021.

Abstract

Mobile devices now rival desktop computers as the most popular platforms for web browsing and e-commerce. As the screens of mobile devices continue to grow, operating a smartphone with a single hand becomes increasingly difficult. Automatic detection of the operating hand would enable e-commerce applications to adapt their interfaces to each user's handedness. This paper addresses the problem of identifying the operating hand while avoiding mobile sensors, which can drain the battery and yield inconsistent readings across device calibrations, and improves the accuracy of user categorization by evaluating several classification strategies. A supervised machine-learning classifier was constructed to label the operating hand as left or right, using features extracted from touch traces such as scrolls and button clicks on a dataset of 174 users. The proposed approach is not platform-specific and does not rely on access to gyroscopes or accelerometers, widening its applicability to any device with a touchscreen.
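For illustration, the sketch below shows the general shape of the supervised pipeline the abstract describes: per-user features derived from touch traces are fed to a binary left/right classifier. The specific features (scroll curvature, tap x-position, tap duration), the random-forest model, and the synthetic data are all assumptions for demonstration; the abstract does not specify the paper's actual feature set or model.

```python
# Minimal sketch of a supervised handedness classifier, assuming scikit-learn.
# Feature names and the random-forest choice are illustrative, not the paper's.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for per-user features extracted from touch traces:
# [mean scroll curvature, mean tap x-position (normalized), mean tap duration]
n_users = 174  # matches the dataset size reported in the abstract
X = rng.normal(size=(n_users, 3))

# Labels: 0 = left hand, 1 = right hand (synthetic here; real labels would
# come from ground-truth annotations gathered during data collection)
y = (X[:, 1] + 0.5 * rng.normal(size=n_users) > 0).astype(int)

# Train and evaluate with 5-fold cross-validation
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"5-fold CV accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```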

Keywords: Accessibility; Customization; Handedness; Machine learning; Stealth data gathering; Usability.

Grants and funding

This work was funded by the Department of Science, Innovation, and Universities (Spain) under the National Program for Research, Development, and Innovation (project RTI2018-099235-B-I00) and the National Science Foundation under grants No. 1458928 and No. 1645025, an REU Site on Ubiquitous Sensing. There was no additional external funding received for this study. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.