Researcher(s)
- Wilkin Galindo, Electrical Engineering, University of Delaware
Faculty Mentor(s)
- Kenneth Barner, Electrical and Computer Engineering, University of Delaware
Abstract
Photogrammetry is indispensable for creating detailed 3D models of real-world environments from aerial imagery. However, unwanted objects such as vehicles, construction equipment, or even people often appear in the acquired data, degrading the visual quality of the 3D reconstruction. The aerial photogrammetric data were captured using a DJI drone and processed through Agisoft Metashape to create a 3D campus model of the University of Delaware, which students can eventually use as a modern virtual campus map.
This research explores the application of the Segment Anything Model (SAM), developed by Meta AI and trained on eleven million images, to the automated segmentation and masking of extraneous objects in aerial photogrammetric data. SAM's zero-shot generalization allows it to segment objects it has never seen before without retraining. The masked regions containing unwanted objects can then be artificially filled using 2D image inpainting algorithms so that they match the surrounding area, making it appear as if the masked object was never there in the first place. With the modified images, the 3D reconstruction can be regenerated without the unwanted objects, refining the final photogrammetric model.
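The mask-then-fill step described above can be illustrated with a minimal, self-contained sketch. This toy example is an assumption for illustration only: it uses a hand-built mask in place of SAM's output and a simple diffusion-style fill (averaging known neighbors) in place of a production inpainting algorithm such as Telea's method (`cv2.inpaint` in OpenCV).

```python
import numpy as np

def fill_masked_region(image, mask, iterations=50):
    """Toy diffusion-style inpainting: repeatedly replace masked pixels
    with the mean of their known 4-neighbors. A hypothetical stand-in for
    production inpainting algorithms; not the method used in the project."""
    img = image.astype(float).copy()
    unknown = mask.astype(bool).copy()
    for _ in range(iterations):
        if not unknown.any():
            break
        # Pad so that shifted slices give each pixel its 4-neighborhood.
        padded = np.pad(img, 1, mode="edge")
        known = np.pad(~unknown, 1, mode="constant")  # borders count as unknown
        neigh_sum = np.zeros_like(img)
        neigh_cnt = np.zeros_like(img)
        for dy, dx in ((0, 1), (0, -1), (1, 0), (-1, 0)):
            vals = padded[1 + dy : padded.shape[0] - 1 + dy,
                          1 + dx : padded.shape[1] - 1 + dx]
            ok = known[1 + dy : known.shape[0] - 1 + dy,
                       1 + dx : known.shape[1] - 1 + dx]
            neigh_sum += vals * ok   # only known pixels contribute
            neigh_cnt += ok
        # Fill any masked pixel that has at least one known neighbor.
        fillable = unknown & (neigh_cnt > 0)
        img[fillable] = neigh_sum[fillable] / neigh_cnt[fillable]
        unknown &= ~fillable
    return img

# A flat 8x8 "ground" image with a bright blob standing in for a vehicle.
image = np.full((8, 8), 100.0)
image[3:5, 3:5] = 255.0               # unwanted object
mask = np.zeros((8, 8), dtype=bool)
mask[3:5, 3:5] = True                 # SAM-style object mask

filled = fill_masked_region(image, mask)
print(filled[3:5, 3:5])               # filled from surrounding ground values
```

Because the known neighbors are excluded from the fill only when masked, the bright object values never leak into the result; the masked patch is reconstructed entirely from the surrounding "ground" pixels, which is the effect the pipeline relies on before re-running the 3D reconstruction.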
Through this approach, AI assistance can refine photogrammetric outputs for mapping and urban planning. Integrating SAM into the photogrammetric pipeline enables the creation of more visually consistent surfaces, free from extraneous objects, resulting in a more accurate digital representation of real-world environments such as the campus of the University of Delaware. The final 3D model may be used for virtual campus tours, easing the transition for students who cannot attend a physical tour, such as international students.