
PIU-Net: Generation of Virtual Panoramic Views from Colored Point Clouds

Authors: Michael Adam and Eckehard Steinbach

Affiliation: Chair of Media Technology, Technical University of Munich, Germany

ISBN: 978-989-758-488-6

ISSN: 2184-4321

Keyword(s): Virtual View Generation, Rendering, Deep Learning, Digital Twin, Virtual Reality.

Abstract: As VR systems become increasingly widespread, the demand for high-quality content grows drastically. One way of generating such data is by digitizing real-world environments using SLAM-based mapping devices, which capture both the geometry of the environment (typically as point clouds) and its appearance (typically as a small set of RGB panoramas). However, when creating such digital representations of real-world spaces, artifacts and missing data cannot be fully avoided. Furthermore, free movement is often restricted. In this paper, we introduce a technique that allows for the generation of high-quality panoramic views at any position within a captured real-world scenario. Our method consists of two steps. First, we render virtual panoramas from the projected point cloud data. These views exhibit imperfections in comparison to the real panoramas, which are corrected in the second step by an inpainting neural network. The network itself is trained using a small set of panoramic images captured during the mapping process. In order to take full advantage of the panoramic information, we use a U-Net-like structure with circular convolutions. Further, a custom perceptual panoramic loss is applied. The resulting virtual panoramas show high quality and spatial consistency. Furthermore, the network learns to correct erroneous point cloud data. We evaluate the proposed approach by generating virtual panoramas along novel trajectories where the panorama positions deviate from the originally selected capturing points and observe that the proposed approach is able to generate smooth and temporally consistent walkthrough sequences.
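The circular convolutions mentioned in the abstract exploit the fact that an equirectangular panorama wraps around horizontally: the left and right image borders are adjacent in the scene, so padding should wrap across the width rather than insert zeros. The following numpy sketch (an illustration of the general idea, not the authors' implementation) shows wrap-padding followed by a "valid" convolution, so the output keeps the input's spatial size:

```python
import numpy as np

def circular_pad_width(img, pad):
    # Zero-pad vertically (poles do not wrap), wrap-pad horizontally
    # (longitude is periodic in an equirectangular panorama).
    img = np.pad(img, ((pad, pad), (0, 0)), mode="constant")
    img = np.pad(img, ((0, 0), (pad, pad)), mode="wrap")
    return img

def conv2d_valid(img, kernel):
    # Naive "valid" 2D convolution (no padding inside this function).
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

pano = np.arange(24, dtype=float).reshape(4, 6)   # toy 4x6 "panorama"
kernel = np.ones((3, 3)) / 9.0                    # simple box filter
out = conv2d_valid(circular_pad_width(pano, 1), kernel)
print(out.shape)  # (4, 6): same spatial size as the input
```

In a deep-learning framework this is typically achieved by applying wrap padding along the width axis before each convolution layer; the effect is that filters near the image seam see coherent context from the opposite border instead of artificial zeros.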

CC BY-NC-ND 4.0


Paper citation in several formats:
Adam, M. and Steinbach, E. (2021). PIU-Net: Generation of Virtual Panoramic Views from Colored Point Clouds. In Proceedings of the 16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP, ISBN 978-989-758-488-6, ISSN 2184-4321, pages 269-276. DOI: 10.5220/0010198302690276

@conference{visapp21,
author={Michael Adam and Eckehard Steinbach},
title={PIU-Net: Generation of Virtual Panoramic Views from Colored Point Clouds},
booktitle={Proceedings of the 16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP},
year={2021},
pages={269-276},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0010198302690276},
isbn={978-989-758-488-6},
issn={2184-4321},
}

TY - CONF

JO - Proceedings of the 16th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications - Volume 4: VISAPP
TI - PIU-Net: Generation of Virtual Panoramic Views from Colored Point Clouds
SN - 978-989-758-488-6
SN - 2184-4321
AU - Adam, M.
AU - Steinbach, E.
PY - 2021
SP - 269
EP - 276
DO - 10.5220/0010198302690276
