Augmented Reality Based Interactive Cooking Guide

Sensors (Basel). 2022 Oct 28;22(21):8290. doi: 10.3390/s22218290.

Abstract

Cooking at home is a critical survival skill. We propose a new cooking assistance system in which the user only needs to wear an all-in-one augmented reality (AR) headset, without having to install any external sensors or devices in the kitchen. Utilizing the built-in camera and cutting-edge computer vision (CV) technology, the user can direct the AR headset to recognize available food ingredients simply by looking at them. Based on the types of the recognized food ingredients, suitable recipes are suggested. A step-by-step video tutorial detailing the selected recipe is then displayed on the AR glasses. The user can conveniently interact with the proposed system using eight kinds of natural hand gestures, without needing to touch any devices throughout the entire cooking process. Experimental results show that, compared with the deep learning models ResNet and ResNeXt, YOLOv5 achieves lower accuracy for ingredient recognition, but it can locate and classify multiple ingredients in one shot, which makes the scanning process easier for users. Twenty participants tested the prototype system and provided feedback via two questionnaires. Based on the analysis results, 19 of the 20 participants would recommend the proposed system to others, and all participants were satisfied with the prototype system overall.
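
The detect-then-suggest pipeline summarized above (single-shot ingredient detection followed by recipe suggestion keyed to the recognized ingredient types) can be illustrated with a minimal Python sketch. This is not the authors' implementation: the checkpoint file, ingredient classes, and recipe table below are hypothetical placeholders, and only the publicly documented YOLOv5 torch.hub interface is assumed.

    # Minimal sketch, assuming a YOLOv5 checkpoint fine-tuned on food-ingredient
    # classes (hypothetical file "ingredients.pt") and an illustrative recipe table.
    import torch

    # Load a custom YOLOv5 model through the public torch.hub interface.
    model = torch.hub.load('ultralytics/yolov5', 'custom', path='ingredients.pt')

    # Hypothetical recipe table: each recipe lists the ingredients it requires.
    RECIPES = {
        'omelette': {'egg', 'onion', 'tomato'},
        'fried rice': {'egg', 'rice', 'carrot'},
    }

    def suggest_recipes(frame):
        """Detect ingredients in one camera frame and rank matching recipes."""
        results = model(frame)                              # one forward pass, multiple objects
        detected = set(results.pandas().xyxy[0]['name'])    # class names of all detections
        # Rank recipes by how many of their required ingredients were detected.
        ranked = sorted(RECIPES, key=lambda r: len(RECIPES[r] & detected), reverse=True)
        return detected, ranked

In an AR headset setting, frames from the built-in camera would be passed to suggest_recipes, and the top-ranked recipes would be rendered in the headset display for the user to select with hand gestures.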

Keywords: AR cooking; Magic Leap One; augmented reality; smart kitchen.

MeSH terms

  • Augmented Reality*
  • Cooking
  • Feedback
  • Food Ingredients*
  • Gestures
  • Humans

Substances

  • Food Ingredients

Grants and funding

This research is partially supported under grant numbers 110-2221-E-259-016 and 111-2221-E-259-012 by the National Science and Technology Council (NSTC), Taiwan.