Immersive Telepresence and Remote Collaboration using Mobile and Wearable Devices

IEEE Trans Vis Comput Graph. 2019 May;25(5):1908-1918. doi: 10.1109/TVCG.2019.2898737. Epub 2019 Feb 14.

Abstract

The mobility and ubiquity of mobile head-mounted displays make them a promising platform for telepresence research, as they allow for spontaneous and remote use cases not possible with stationary hardware. In this work, we present a system that provides immersive telepresence and remote collaboration on mobile and wearable devices by building a live spherical panoramic representation of a user's environment that a remote user can view in real time while independently choosing the viewing direction. The remote user can then interact with this environment, as if they were actually there, through intuitive gesture-based interaction. Each user can obtain an independent view within this environment by rotating their device, and their current field of view is shared to allow for simple coordination of viewpoints. We present several approaches to creating this shared live environment and discuss their implementation details, individual challenges, and performance on modern mobile hardware; in doing so, we provide key insights into the design and implementation of next-generation mobile telepresence systems, guiding future research in this domain. The results of a preliminary user study confirm the ability of our system to induce the desired sense of presence in its users.
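The core rendering idea described above, where each user rotates their device to obtain an independent view into a shared spherical panorama, can be illustrated with standard equirectangular projection math. The following Python sketch is not from the paper; it is a minimal, hypothetical illustration (using NumPy and nearest-neighbour sampling) of how a device's yaw and pitch could select a perspective view from an equirectangular panorama:

```python
import numpy as np

def view_from_panorama(pano, yaw, pitch, fov_deg=90.0, out_size=(240, 320)):
    """Sample a perspective view from an equirectangular panorama.

    pano: H x W x C image; longitude spans [-pi, pi), latitude [-pi/2, pi/2].
    yaw, pitch: viewing direction in radians (e.g. from device orientation).
    """
    h, w = out_size
    # Pinhole focal length for the requested horizontal field of view.
    f = 0.5 * w / np.tan(np.radians(fov_deg) / 2)
    xs = np.arange(w) - (w - 1) / 2
    ys = np.arange(h) - (h - 1) / 2
    xv, yv = np.meshgrid(xs, ys)
    # Ray directions in the camera frame (z forward, x right, y down).
    dirs = np.stack([xv, yv, np.full_like(xv, f)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    # Rotate rays by pitch (about x) then yaw (about y).
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    d = dirs @ (Ry @ Rx).T
    # Convert world-frame rays to spherical coordinates.
    lon = np.arctan2(d[..., 0], d[..., 2])
    lat = np.arcsin(np.clip(-d[..., 1], -1.0, 1.0))
    # Map spherical coordinates to panorama pixels (nearest neighbour).
    u = ((lon + np.pi) / (2 * np.pi) * pano.shape[1]).astype(int) % pano.shape[1]
    v = ((np.pi / 2 - lat) / np.pi * (pano.shape[0] - 1)).astype(int)
    return pano[v, u]
```

On a real system, each client would apply this lookup (typically on the GPU) to the live panorama using its own orientation, which is how two users obtain independent views of the same shared environment.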

MeSH terms

  • Computer Graphics
  • Computer Systems
  • Gestures
  • Humans
  • Orientation, Spatial
  • Social Behavior
  • User-Computer Interface
  • Videoconferencing
  • Virtual Reality*
  • Wearable Electronic Devices*