3D Pano Inpainting: Building a VR Environment from a Single Input Panorama

Shivam Ajisa¹, Edward Du¹, Nam Nguyen¹, Stefanie Zollmann², Jonathan Ventura¹

¹California Polytechnic State University, San Luis Obispo, CA, USA
²University of Otago, Dunedin, New Zealand


Our novel system takes as input a single panoramic image, generates a depth map, and applies an inpainting method adapted for 360° content to produce a 3D textured mesh for real-time 6DOF view synthesis in a VR headset.

Abstract

Creating 360-degree 3D content is challenging because it requires either a multi-camera rig or a collection of many images taken from different perspectives. Our approach generates a 360° VR scene from a single panoramic image using a learning-based inpainting method adapted for panoramic content. We introduce a pipeline that transforms an equirectangular panoramic RGB image into a complete 360° 3D virtual reality scene represented as a textured mesh, which is easily rendered on a VR headset using standard graphics rendering pipelines. We qualitatively evaluate our results on a synthetic dataset consisting of 360° panoramas of indoor scenes.
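
For intuition about the output representation, the sketch below shows one way an equirectangular panorama and an aligned depth map could be lifted to a textured mesh with three.js. This is an illustrative sketch rather than the actual pipeline code: `depthAt` is a hypothetical depth sampler standing in for the predicted depth map, and a single displaced sphere stands in for the full mesh construction.

```typescript
import * as THREE from 'three';

// Illustrative sketch: lift an equirectangular panorama plus an aligned depth
// map into a textured mesh. `depthAt(u, v)` is a hypothetical sampler that
// returns metric depth for normalized equirectangular coordinates in [0, 1];
// it stands in for the depth map predicted by the pipeline.
function panoramaToMesh(
  colorTexture: THREE.Texture,
  depthAt: (u: number, v: number) => number,
  widthSegments = 256,
  heightSegments = 128,
): THREE.Mesh {
  // A sphere in three.js already carries equirectangular UVs.
  const geometry = new THREE.SphereGeometry(1, widthSegments, heightSegments);
  geometry.scale(-1, 1, 1); // show the panorama on the inside of the sphere

  const positions = geometry.attributes.position as THREE.BufferAttribute;
  const uvs = geometry.attributes.uv as THREE.BufferAttribute;
  const dir = new THREE.Vector3();

  for (let i = 0; i < positions.count; i++) {
    // Scale each view ray by its sampled depth, turning the sphere into a
    // coarse 3D reconstruction of the captured scene.
    dir.fromBufferAttribute(positions, i).normalize();
    const depth = depthAt(uvs.getX(i), uvs.getY(i));
    positions.setXYZ(i, dir.x * depth, dir.y * depth, dir.z * depth);
  }
  positions.needsUpdate = true;

  return new THREE.Mesh(geometry, new THREE.MeshBasicMaterial({ map: colorTexture }));
}
```

A single displaced sphere only reproduces the surfaces visible from the capture point; the inpainting step is what fills the regions that become disoccluded once the viewer moves away from it.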


Spiral 360-Degree Videos


Real-time VR Interactive Demo

You can explore the output mesh generated by our method using our VR viewer. For the best experience, we recommend Google Chrome on desktop or a compatible VR headset. Our viewer has been thoroughly tested on an Oculus Quest 2 in tethered mode, which allows seamless exploration of walkable spaces. You can also explore the scene by dragging with your mouse or swiping on a touchscreen.
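
For reference, a minimal WebXR viewer for a mesh exported from such a pipeline can be assembled with three.js. The sketch below is an assumption about how that might look, not the actual viewer code; the `scene.glb` path and the `three/examples/jsm` import paths are placeholders.

```typescript
import * as THREE from 'three';
import { VRButton } from 'three/examples/jsm/webxr/VRButton.js';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  75, window.innerWidth / window.innerHeight, 0.01, 100);

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.xr.enabled = true; // let a WebXR headset present the scene
document.body.appendChild(renderer.domElement);
document.body.appendChild(VRButton.createButton(renderer)); // "Enter VR" button

// Load the textured mesh produced by the pipeline ('scene.glb' is a placeholder path).
new GLTFLoader().load('scene.glb', (gltf) => scene.add(gltf.scene));

// The XR-aware render loop: when a headset is presenting, its pose drives the
// camera each frame, giving 6DOF viewing of the mesh.
renderer.setAnimationLoop(() => renderer.render(scene, camera));
```

Outside of VR, mouse and touch navigation can be layered on top with three.js's OrbitControls.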