Low-Altitude UAV Photogrammetry for Large-Scale Scene Reconstruction With Gaussian-Splatting Representation
Yuan Mei, Rui Zeng, Wanting Xu, Xinyue Zhou
Published 2025 in IEEE Access

ABSTRACT
Aerial drone imagery has emerged as a predominant solution for urban 3D modeling, with differentiable rendering techniques being particularly valuable for smart cities and digital twins. Despite their promise, state-of-the-art methods such as 3D Gaussian Splatting face GPU memory constraints that preclude city-scale application. To overcome this, we propose a novel divide-and-conquer framework for urban-scale reconstruction from drone images that supports real-time rendering of geometry and texture. Our approach first partitions the UAV imagery into distinct regions via feature matching. Each region undergoes parallelized sparse point cloud reconstruction and pose estimation on the CPU, followed by global integration. We then employ a monocular depth estimation model to derive a dense relative depth map per view, which is aligned with the sparse point cloud to recover absolute scale. The point cloud is then converted into a Gaussian Splatting representation. To address spatial aliasing between adjacent partitions, a dedicated boundary post-processing step enhances rendering accuracy at partition edges. Furthermore, quantization and compression are applied to minimize the model's memory footprint for practical storage. Extensive experiments, conducted on both real-world drone-captured datasets and synthetic environments, demonstrate the effectiveness and superiority of our approach, highlighting its potential for large-scale urban scene reconstruction applications.
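The abstract describes aligning a per-view relative depth map (from a monocular depth estimator) with the sparse SfM point cloud to recover absolute scale. The paper does not give the alignment procedure; a common choice is a least-squares fit of a global scale and shift at pixels where sparse points project into the view. The sketch below assumes that formulation; the function name and inputs are illustrative, not from the paper.

```python
import numpy as np

def align_depth(relative_depth, sparse_depth, mask):
    """Fit scale s and shift t so that s * relative_depth + t ~= sparse_depth
    at the pixels where a sparse SfM point projects (mask == True),
    then apply the fit to the whole depth map."""
    d = relative_depth[mask]
    z = sparse_depth[mask]
    # Least-squares solve for [s, t] in d * s + 1 * t = z
    A = np.stack([d, np.ones_like(d)], axis=1)
    (s, t), *_ = np.linalg.lstsq(A, z, rcond=None)
    return s * relative_depth + t
```

A robust variant (e.g. RANSAC or a median-based fit) would be preferable in practice, since sparse SfM depths contain outliers near occlusion boundaries.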
PUBLICATION RECORD
- Publication year: 2025
- Venue: IEEE Access
- Fields of study: Computer Science, Engineering, Environmental Science
- Source metadata: Semantic Scholar