Rapid consistent reef surveys with DeepReefMap

Jonathan Sauder, G. Banc-Prandi, Gabriela Perna, Ibrahim Souleiman Abdallah, Osama S. Saad, M. Mohammed, Ali Al-Sawalmih, Anders Meibom, D. Tuia

Published 2025 in Scientific Reports

ABSTRACT

In light of the critical threat that human activity poses to coral reefs worldwide, innovative monitoring strategies are needed that are efficient, standardized, scalable, and economical. This paper presents the results of the first large-scale transnational coral reef surveying endeavor in the Red Sea using DeepReefMap, which automatically analyzes video transects by employing neural networks for 3D semantic mapping. DeepReefMap is trained on imagery from low-cost underwater cameras, allowing surveys to be conducted and analyzed in just a few minutes. This initiative was carried out in Djibouti, Jordan, and Israel, with over 184 hours of video footage collected for training the neural network for 3D reconstruction. We created a semantic segmentation dataset of video frames with over 200,000 annotated polygons spanning 39 benthic classes, down to the resolution of prominent visually identifiable genera found in the Red Sea. We analyzed 365 video transects from 45 sites using the deep-learning-based mapping system, demonstrating the method's robustness across environmental conditions and input video quality. We show that the surveys are consistent in characterizing benthic composition, showcasing the potential of DeepReefMap for monitoring. This research pioneers deep learning for practical 3D underwater mapping and semantic segmentation, paving the way for affordable, widespread deployment in reef conservation and ecology with tangible impact.
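The core output described in the abstract is a characterization of benthic composition from semantically segmented video frames. A minimal sketch of that final step, assuming a per-frame segmentation mask of integer class labels (the class names below are hypothetical stand-ins; the paper's dataset defines 39 benthic classes):

```python
from collections import Counter

# Hypothetical label-to-name mapping; DeepReefMap's dataset defines
# 39 benthic classes, down to visually identifiable genera.
CLASSES = {0: "sand", 1: "hard_coral", 2: "soft_coral", 3: "algae"}

def benthic_cover(mask):
    """Return the fraction of pixels assigned to each benthic class
    in one segmented frame (a 2D list of integer class labels)."""
    pixels = [label for row in mask for label in row]
    counts = Counter(pixels)
    total = len(pixels)
    return {CLASSES.get(i, f"class_{i}"): n / total
            for i, n in counts.items()}

# Toy 2x4 mask standing in for one segmented video frame.
mask = [[0, 0, 1, 1],
        [1, 2, 3, 1]]
cover = benthic_cover(mask)
print(cover)  # hard_coral covers half of this toy frame
```

Averaging such per-frame fractions over a transect (or over the surfaces of a 3D reconstruction) yields the site-level composition estimates that the paper compares across surveys.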
PUBLICATION RECORD
- Publication year: 2025
- Venue: Scientific Reports
- Publication date: 2025-11-07
- Fields of study: Medicine, Environmental Science
- Indexed in: Semantic Scholar, PubMed