Until now, there has been no way to observe and track the affective impact of the massive amount of complex visual stimuli that people encounter "in the wild" during their many hours of digital life. In this paper, we propose and illustrate how recent advances in AI (trained ensembles of deep neural networks) can be deployed on new data streams: long sequences of screenshots of study participants' smartphones obtained unobtrusively during everyday life. We obtained affective valence and arousal ratings of hundreds of images, drawn both from existing picture repositories often used in psychological studies and from a new screenshot repository chronicling individuals' everyday digital life, from N = 832 adults and from an affect computation model (Parry & Vuong, 2021). Results and analysis suggest that (a) our sample rates images similarly to other samples used in psychological studies, (b) the affect computation model assigns valence and arousal ratings similarly to humans, and (c) the resulting computational pipeline can be deployed at scale to obtain detailed maps of the affective space individuals travel through on their smartphones. By leveraging these methods for tracking the emotional content individuals encounter on their smartphones, we open the possibility for large-scale studies of how the affective dynamics of everyday digital life shape individuals' moment-to-moment experiences and well-being.
Supplementary information: The online version contains supplementary material available at 10.1007/s42761-023-00202-4.
Keywords: Intraindividual variability; Longitudinal; Machine learning; Media effects; Screenomics.
© The Society for Affective Science 2023. Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.