Digital Integration and Automated Assessment of Eye-Tracking and Emotional Response Data Using the BioSensory App to Maximize Packaging Label Analysis

Sensors (Basel). 2021 Nov 17;21(22):7641. doi: 10.3390/s21227641.

Abstract

New and emerging non-invasive digital tools, such as eye-tracking, facial expression analysis, and physiological biometrics, have been implemented to extract more objective sensory responses from panelists to packaging and, specifically, labels. However, these technologies come from different providers and rely on separate software for data acquisition and analysis, which makes their practical application difficult for research and industry. This study proposed a prototype integration between eye-tracking and emotional biometrics using the BioSensory computer application for three sample labels: Stevia, potato chips, and spaghetti. Multivariate data analyses are presented to demonstrate the integrative analysis approach of the proposed prototype system. Further studies can be conducted with this system by integrating other available biometrics, such as physiological responses (heart rate, blood pressure, and temperature changes) analyzed while panelists focus on different label components or packaging features. By maximizing data extraction from various components of packaging and labels, smart predictive systems, such as machine-learning models, can also be implemented to assess liking and other parameters of interest from the whole package and from specific components.
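To illustrate the kind of workflow the abstract describes (merging eye-tracking and emotional biometric streams, then applying a predictive model for liking), the sketch below is a minimal, hedged example only, not the BioSensory implementation. It assumes hypothetical per-panelist CSV exports; all file names, column names, and the choice of a simple linear model are assumptions for illustration.

```python
# Minimal sketch (not the BioSensory implementation): merge eye-tracking AOI
# fixation metrics with facial-expression emotion scores per panelist and
# label, then fit a simple model to predict liking. All file and column
# names below are hypothetical.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Hypothetical exports from the eye tracker, emotion-analysis tool, and ballot.
fixations = pd.read_csv("eye_tracking_aoi.csv")   # panelist_id, label, aoi, fixation_count, dwell_time_ms
emotions = pd.read_csv("facial_emotions.csv")     # panelist_id, label, joy, surprise, disgust, neutral
liking = pd.read_csv("liking_scores.csv")         # panelist_id, label, liking (hedonic score)

# Aggregate fixation metrics per panelist and label, then join the data streams.
fix_summary = (fixations
               .groupby(["panelist_id", "label"], as_index=False)
               .agg(total_fixations=("fixation_count", "sum"),
                    total_dwell_ms=("dwell_time_ms", "sum")))
merged = (fix_summary
          .merge(emotions, on=["panelist_id", "label"])
          .merge(liking, on=["panelist_id", "label"]))

# Simple predictive model: liking from combined eye-tracking and emotion features.
X = merged[["total_fixations", "total_dwell_ms", "joy", "surprise", "disgust", "neutral"]]
y = merged["liking"]
scores = cross_val_score(LinearRegression(), X, y, cv=5, scoring="r2")
print(f"Cross-validated R^2: {scores.mean():.2f} +/- {scores.std():.2f}")
```

A linear model is used here only to keep the sketch short; the abstract's reference to machine learning leaves the specific algorithm open.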

Keywords: areas of interest; computer application; computer vision; eye fixations; sensory analysis.

MeSH terms

  • Emotions
  • Eye-Tracking Technology*
  • Facial Expression
  • Machine Learning
  • Mobile Applications*