Identifying key challenges and needs in digital mental health moderation practices supporting users exhibiting risk behaviours to develop responsible AI tools: the case study of Kooth

SN Soc Sci. 2022;2(10):217. doi: 10.1007/s43545-022-00532-3. Epub 2022 Sep 29.

Abstract

Digital platforms for mental health and wellbeing purposes have become increasingly common in helping users exhibiting risk behaviours (e.g. self-harm, eating-related disorders) across all ages, opening new frontiers in supporting vulnerable users. This study stems from a larger project, which explores how responsible AI solutions can up-scale existing manual moderation approaches and better target interventions for young people who ask for help or engage in risk behaviours online. The research aims to better understand the challenges and needs of moderators and digital counsellors, i.e. the professionals working 'behind the scenes'. Through this case study, the authors intend to contribute to the development of responsible AI tools that are fit for purpose. The key focus is Kooth.com, the UK's leading free online confidential service, which offers counselling and emotional wellbeing support to young people through its web-based, pseudo-anonymous digital platform.

Keywords: Digital counselling; Digital moderation; Mental health and wellbeing; Responsible AI; Risk behaviours.