
Rendering Anywhere You See: Renderability Field-guided Gaussian Splatting

Source: arXiv
Abstract

Scene view synthesis, which generates novel views from limited perspectives, is increasingly vital for applications like virtual reality, augmented reality, and robotics. Unlike object-based tasks, such as generating 360° views of a car, scene view synthesis handles entire environments, where non-uniform observations pose unique challenges for stable rendering quality. To address this issue, we propose a novel approach: renderability field-guided Gaussian splatting (RF-GS). This method quantifies input inhomogeneity through a renderability field, guiding pseudo-view sampling to enhance visual consistency. To ensure the quality of wide-baseline pseudo-views, we train an image restoration model to map point projections to visible-light styles. Additionally, our validated hybrid data optimization strategy effectively fuses information from pseudo-view angles and source-view textures. Comparative experiments on simulated and real-world data show that our method outperforms existing approaches in rendering stability.
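The abstract does not specify how the renderability field is computed or how it drives sampling, but the core idea, scoring candidate viewpoints by how well the source observations cover them and placing pseudo-views where coverage is poor, can be illustrated with a minimal sketch. Everything below is hypothetical: the density-based `renderability_score` proxy, the inverse-score sampling weights, and all function names are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def renderability_score(candidate, cam_positions, radius=2.0):
    # Hypothetical proxy for renderability: the fraction of source
    # cameras within `radius` of the candidate viewpoint. A real
    # renderability field would also account for viewing angles
    # and occlusion; this only captures observation density.
    dists = np.linalg.norm(cam_positions - candidate, axis=1)
    return np.sum(dists < radius) / len(cam_positions)

def sample_pseudo_views(cam_positions, candidates, n_views, seed=0):
    # Sample pseudo-views with probability inversely related to their
    # renderability score, so under-observed regions receive more
    # pseudo-views -- the guidance idea described in the abstract.
    rng = np.random.default_rng(seed)
    scores = np.array(
        [renderability_score(c, cam_positions) for c in candidates]
    )
    weights = (1.0 - scores) + 1e-6  # epsilon keeps all candidates eligible
    probs = weights / weights.sum()
    idx = rng.choice(len(candidates), size=n_views, replace=False, p=probs)
    return candidates[idx]

# Toy setup: source cameras clustered near the origin, candidate
# pseudo-view positions spread over a wider area.
cams = np.array([[0.0, 0.0, 0.0], [0.5, 0.0, 0.0], [0.0, 0.5, 0.0]])
grid = np.array([[0.0, 0.0, 0.0], [5.0, 0.0, 0.0],
                 [0.0, 5.0, 0.0], [5.0, 5.0, 0.0]])
views = sample_pseudo_views(cams, grid, n_views=2)
print(views.shape)  # (2, 3)
```

Under this proxy, candidates far from the camera cluster score near zero and are therefore sampled with higher probability, which is the intended coverage-balancing behavior.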

Subject area: Computing Technology, Computer Technology

Rendering Anywhere You See: Renderability Field-guided Gaussian Splatting [EB/OL]. (2025-04-27) [2025-05-09]. https://arxiv.org/abs/2504.19261.
