2018 I/ITSEC - 9250

Creating a 360-Degree RGB-D Sensor System for Augmented Reality Research (Room S320B)

27 Nov 18
4:30 PM - 5:00 PM
Augmented Reality systems require both localization of the user and mapping of the surrounding area in order to display virtual objects in a manner that is believable to the user. A single sensor can accomplish this with a SLAM algorithm, but it struggles when the user performs a large, rapid head rotation, since the features being tracked leave the field of view. An omnidirectional camera (360-degree horizontal, near-180-degree vertical) can resolve this, but COTS solutions in this domain provide only RGB information. In this paper we demonstrate a prototype system that fuses the imagery of four RGB-D sensors to create a 360-degree horizontal sensor feed of both color and depth information. We detail the design of the sensor array and the challenges faced when recording or visualizing the data in real time, with each camera's frames synchronized to the other information required for future experiments. We also discuss the fusion system and how it detects features as a user rotates the sensor array in motions similar to human head movements. Our sensor array demonstrates the potential of quick, COTS-based prototype units that combine two or more RGB-D sensors to provide accurate localization and mapping of the environment for augmented reality research.
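
As an illustration of the kind of fusion the abstract describes, the sketch below merges depth frames from four outward-facing RGB-D cameras into a single point cloud in a shared rig frame. The intrinsics, the 90-degree ring layout, and the extrinsic model are illustrative assumptions for this sketch; the paper does not specify the authors' calibration or fusion implementation.

```python
# Minimal sketch of fusing four RGB-D depth frames into one 360-degree
# point cloud. Intrinsics, extrinsics, and the 90-degree ring layout are
# illustrative assumptions, not the authors' actual calibration.
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) to camera-frame 3D points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop invalid (zero-depth) pixels

def ring_extrinsic(yaw_deg):
    """Rigid transform for a camera rotated yaw_deg about the rig's up axis."""
    a = np.radians(yaw_deg)
    T = np.eye(4)
    T[:3, :3] = np.array([[ np.cos(a), 0, np.sin(a)],
                          [ 0,         1, 0        ],
                          [-np.sin(a), 0, np.cos(a)]])
    return T

def fuse_frames(depth_frames, intrinsics):
    """Concatenate the four cameras' points in the shared rig frame."""
    clouds = []
    for i, depth in enumerate(depth_frames):
        pts = depth_to_points(depth, *intrinsics)
        T = ring_extrinsic(90 * i)              # cameras at 0/90/180/270 deg
        pts_h = np.c_[pts, np.ones(len(pts))]   # homogeneous coordinates
        clouds.append((pts_h @ T.T)[:, :3])
    return np.vstack(clouds)

# Example: four synthetic 480x640 depth frames at ~2 m.
frames = [np.full((480, 640), 2.0) for _ in range(4)]
cloud = fuse_frames(frames, intrinsics=(525.0, 525.0, 319.5, 239.5))
print(cloud.shape)  # one merged 360-degree point cloud
```

A real system along the lines described would also carry the color channels through the same transforms, handle overlap between adjacent fields of view, and time-align frames across the four cameras before fusing them.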