PanoSynthVR: Toward Light-weight 360-Degree View Synthesis from a Single Panoramic Input

John Waidhofer1
Richa Gadgil2
Anthony Dickson3
Stefanie Zollmann3
Jonathan Ventura1

1California Polytechnic State University, San Luis Obispo, CA, USA
2Carnegie Mellon University, Pittsburgh, PA, USA
3University of Otago, Dunedin, New Zealand


We investigate how real-time, 360-degree view synthesis can be achieved on current virtual reality hardware from a single panoramic image. We introduce a light-weight method that automatically converts a single panoramic input into a multi-cylinder image representation supporting real-time, free-viewpoint rendering in virtual reality. We apply an existing convolutional neural network trained on pinhole images to a cylindrical panorama with wrap padding to ensure agreement between the left and right edges. The network outputs a stack of semi-transparent panoramas at varying depths, which can be rendered efficiently and composited with over blending. Quantitative experiments and a user study show that the method produces convincing parallax and fewer artifacts than a textured mesh representation.
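Two ingredients of the pipeline described above are simple enough to sketch directly: circular (wrap) padding along the panorama's width, so that the network sees consistent content across the left/right seam, and back-to-front over blending of the semi-transparent layer stack. The sketch below is illustrative only; the function names and NumPy formulation are our assumptions, not the paper's implementation.

```python
import numpy as np

def wrap_pad(pano, pad):
    """Circularly pad a cylindrical panorama along its width.

    pano: (H, W, C) array; pad: number of columns copied from the
    opposite edge, so the left and right borders agree. (Illustrative
    helper, not the authors' code.)
    """
    return np.concatenate([pano[:, -pad:], pano, pano[:, :pad]], axis=1)

def composite_over(layers):
    """Composite a stack of RGBA panoramas with over blending.

    layers: list of (H, W, 4) float arrays with premultiplication NOT
    applied, ordered back to front. Returns the blended (H, W, 3) RGB.
    """
    out = np.zeros_like(layers[0][..., :3])
    for layer in layers:
        rgb, alpha = layer[..., :3], layer[..., 3:4]
        out = rgb * alpha + out * (1.0 - alpha)
    return out
```

In practice the over compositing would run on the GPU (one textured cylinder per layer, blended back to front), but the arithmetic per pixel is exactly this recurrence.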

WebXR examples

The WebXR viewer is best experienced in Google Chrome on desktop or on a WebXR-compatible VR headset.