LivePhantom: Retrieving Virtual World Light Data to Real Environments

PLoS One. 2016 Dec 8;11(12):e0166424. doi: 10.1371/journal.pone.0166424. eCollection 2016.

Abstract

To achieve realistic Augmented Reality (AR), shadows play an important role in creating a 3D impression of a scene. Casting virtual shadows on real and virtual objects is one of the topics of research being conducted in this area. In this paper, we propose a new method for creating complex AR indoor scenes that uses real-time depth detection to cast virtual shadows on both virtual and real environments. A Kinect camera was used to produce a depth map of the physical scene, which is merged into a single transparent tacit surface in real time. Once this surface is created, the camera's position can be tracked from the reconstructed 3D scene. Real objects are represented by virtual object phantoms in the AR scene, enabling users holding a webcam and a standard Kinect camera to capture and reconstruct environments simultaneously. The tracking capability of the algorithm is demonstrated, and the findings are assessed using qualitative and quantitative methods in comparison with previous AR phantom generation applications. The results demonstrate the robustness of the technique for realistic indoor rendering in AR systems.
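The core idea of a phantom, as described above, is that real objects are represented by invisible virtual stand-ins so that virtual shadows can fall on real geometry. A minimal sketch of one way this can work is a per-pixel depth comparison between the Kinect depth map of the real scene and the depth of virtual occluders along the light direction; the function and toy data below are illustrative assumptions, not the paper's actual implementation, and assume a directional light aligned with the camera axis:

```python
# Hedged sketch: pixel-wise shadow test between a real-scene depth map
# (e.g. from a Kinect) and a virtual object's depth map. Assumes a
# directional light along the view axis; all names are illustrative.

def shadow_mask(real_depth, virtual_depth, no_hit=float("inf")):
    """Mark real-scene pixels shadowed by virtual geometry.

    real_depth / virtual_depth: 2D lists of depth values (metres);
    virtual_depth holds `no_hit` wherever no virtual surface exists.
    A real pixel is in shadow when a virtual surface lies between it
    and the light (here: strictly closer along the view axis).
    """
    return [
        [v != no_hit and v < r for v, r in zip(vrow, rrow)]
        for vrow, rrow in zip(virtual_depth, real_depth)
    ]

# Toy 2x3 scene: a virtual occluder hovers over the middle column
# of a flat real surface 2 m from the camera.
inf = float("inf")
real = [[2.0, 2.0, 2.0],
        [2.0, 2.0, 2.0]]
virt = [[inf, 1.0, inf],
        [inf, 1.5, inf]]
print(shadow_mask(real, virt))
# -> [[False, True, False], [False, True, False]]
```

In a full renderer this mask would darken the corresponding camera-image pixels, while the phantom itself is drawn depth-only so it occludes and receives shadows without being visible.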

MeSH terms

  • Algorithms
  • Environment
  • Image Processing, Computer-Assisted
  • Imaging, Three-Dimensional / methods
  • Light
  • User-Computer Interface*
  • Video Recording

Grants and funding

This work was supported by PDRU grant vote Q.J130000.21A2.03E19 at MaGICX UTM-IRDA, Universiti Teknologi Malaysia.