Authors: Yizhou Li¹; Masatoshi Okutomi¹; Yuuichi Tanaka²; Seiichi Kataoka³ and Teruaki Kosiba⁴
Affiliations: ¹ Institute of Science Tokyo, Tokyo, Japan; ² Micware Mobility Co., Ltd., Hyogo, Japan; ³ Micware Automotive Co., Ltd., Hyogo, Japan; ⁴ Micware Navigations Co., Ltd., Hyogo, Japan
Keyword(s):
Neural Radiance Fields (NeRF), Novel View Synthesis, Street Views, Urban Scenes.
Abstract:
Recent advances in Neural Radiance Fields (NeRF) have shown great potential for 3D reconstruction and novel view synthesis, particularly in indoor and small-scale scenes. However, extending NeRF to large-scale outdoor environments poses challenges such as transient objects, sparse camera viewpoints, weakly textured regions, and varying lighting conditions. In this paper, we propose a segmentation-guided enhancement to NeRF for outdoor street scenes, focusing on complex urban environments. Our approach extends ZipNeRF and uses Grounded SAM to generate segmentation masks, enabling effective handling of transient objects, dedicated modeling of the sky, and regularization of the ground. We also introduce appearance embeddings to adapt to inconsistent lighting across view sequences. Experimental results demonstrate that our method outperforms the ZipNeRF baseline, improving novel view synthesis quality with fewer artifacts and sharper details.
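The abstract describes the method only at a high level; the paper's implementation is not reproduced here. As a rough, hypothetical sketch of two ingredients it names, mask-guided supervision and per-image appearance embeddings, the following PyTorch-style snippet masks the photometric loss with transient-object masks (such as those produced by Grounded SAM) and attaches a learnable appearance latent to each training image. The function and class names, shapes, and hyperparameters below are illustrative assumptions, not the authors' code.

    import torch

    def masked_photometric_loss(pred_rgb, gt_rgb, transient_mask):
        # pred_rgb, gt_rgb: [N_rays, 3]; transient_mask: [N_rays] bool,
        # True where a segmentation mask (e.g. from Grounded SAM) marks
        # a transient object. Those rays are excluded from supervision
        # so moving cars and pedestrians do not corrupt the static scene.
        err = ((pred_rgb - gt_rgb) ** 2).mean(dim=-1)   # [N_rays]
        valid = (~transient_mask).float()               # [N_rays]
        return (err * valid).sum() / valid.sum().clamp(min=1.0)

    class AppearanceEmbedding(torch.nn.Module):
        # One learnable latent vector per training image, fed to the
        # color branch so the model can absorb per-view lighting changes.
        def __init__(self, num_images, dim=32):
            super().__init__()
            self.embed = torch.nn.Embedding(num_images, dim)

        def forward(self, image_ids):                   # [N_rays] long
            return self.embed(image_ids)                # [N_rays, dim]

In NeRF-W-style training, such a latent would be concatenated with the view direction before the color MLP, and at test time a fixed or optimized latent selects a consistent appearance across rendered views; whether this paper follows that exact recipe is not stated in the abstract.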