Researcher(s)
- Logan Blackburn, Computer Engineering, University of Delaware
Faculty Mentor(s)
- Kenneth Barner, Electrical and Computer Engineering, University of Delaware
Abstract
We investigate the adaptation of 3D Gaussian Splatting (3DGS) to enable photorealistic scene reconstruction from drone-captured imagery. In recent work, 3DGS has shown strong performance in novel view synthesis, producing 3D reconstructions of scenes from multi-view 2D images. These results, however, are based largely on small-scale synthetic scenes with dense, comprehensive viewpoint coverage. Applying 3DGS to large-scale, real-world environments presents significant challenges due to the sparse, non-uniform image coverage typical of aerial drone mapping: such image sets are constrained by the logistical difficulty of capturing ground-level and low-altitude views, often because of physical obstacles and safety considerations. To address these challenges, we analyze the performance of 3DGS on fixed-altitude aerial imagery datasets characterized by non-ideal conditions, namely limited viewpoint diversity and wide scene coverage. Our experiments reveal that insufficient coverage significantly degrades scene quality, manifesting as geometric distortions, texture artifacts, and reconstruction noise. We evaluate the effects of tuning key hyperparameters, specifically those governing point densification and pruning strategies, to mitigate these artifacts and improve reconstruction fidelity. Our findings highlight the limitations of current 3DGS models under sparse aerial input conditions and demonstrate potential pathways for extending 3DGS to large-scale drone-based mapping. This work establishes a foundation for future research in photorealistic 3D reconstruction of expansive outdoor environments, with applications such as virtual walkthroughs, digital twins, and urban planning.
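As background for the hyperparameters the abstract refers to, the sketch below illustrates the densify-and-prune decision at the heart of 3DGS optimization. The threshold values mirror the defaults in the reference implementation (a view-space gradient threshold of 2e-4 for densification and a minimum opacity of 0.005 for pruning); the function itself is a simplified illustration, not the authors' code, and omits details such as the clone-versus-split distinction and size-based pruning.

```python
import numpy as np

# Defaults from the reference 3DGS implementation; these are the kinds of
# knobs varied when tuning densification and pruning behavior.
DENSIFY_GRAD_THRESHOLD = 2e-4  # view-space positional gradient above which a Gaussian is densified
MIN_OPACITY = 0.005            # Gaussians more transparent than this are pruned


def densify_and_prune_mask(grad_norms, opacities,
                           grad_threshold=DENSIFY_GRAD_THRESHOLD,
                           min_opacity=MIN_OPACITY):
    """Return boolean masks over the Gaussians: (densify, prune).

    grad_norms: per-Gaussian accumulated view-space gradient norms.
    opacities:  per-Gaussian opacity values in [0, 1].
    """
    densify = grad_norms >= grad_threshold  # under-reconstructed regions get more Gaussians
    prune = opacities < min_opacity         # nearly transparent Gaussians are removed
    return densify, prune


# Three hypothetical Gaussians: well-fit, nearly transparent, under-reconstructed.
grads = np.array([1e-5, 3e-4, 5e-4])
opac = np.array([0.9, 0.002, 0.5])
densify, prune = densify_and_prune_mask(grads, opac)
# densify → [False, True, True]; prune → [False, True, False]
```

Raising the gradient threshold or the opacity floor makes the model more conservative about adding and keeping Gaussians, which is one lever for suppressing the noise artifacts that arise under sparse aerial coverage.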