Flood Mapping with Convolutional Neural Networks Using Spatio-Contextual Pixel Information
Chandrama Sarker, Luis Mejías Alvarez, F. Maire, A. Woodley
Published 2019 in Remote Sensing

ABSTRACT
In recent years, remote sensing technology has come to be regarded as the most important source of information for delineating flood extent for disaster management authorities. Numerous studies have proposed mathematical or statistical classification models for flood mapping. However, conventional pixel-wise classification methods rely on an exact match of the spectral signature to label the target pixel. In this study, we propose a fully convolutional neural network (F-CNN) classification model to map flood extent from Landsat satellite images. We utilised the spatial information from the neighbouring area of the target pixel in the classification. A total of 64 different models were generated and trained, varying the neighbourhood size of the training samples and the number of learnable filters. The training results revealed that the model trained with 3 × 3 neighbourhood-sized training samples and 32 convolutional filters achieved the best performance across the experiments. A separate set of Landsat images covering flooded areas across Australia was used to evaluate the classification performance of the model. A comparison of our proposed classification model with a conventional support vector machine (SVM) classification model shows that the F-CNN model was able to detect flooded areas more effectively: for example, the F-CNN model achieved a maximum precision (true positive rate) of 76.7%, compared to 45.27% for the SVM classification.
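The core idea in the abstract — classifying each pixel from its 3 × 3 spectral neighbourhood with a bank of 32 convolutional filters — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the filter and head weights here are randomly initialised placeholders (training is omitted), the band count and tile size are arbitrary, and `extract_patches` is a hypothetical helper named for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_patches(image, size=3):
    """Collect a size x size spectral neighbourhood around each pixel.

    image: (H, W, bands) array; edge pixels use reflect padding.
    Returns an array of shape (H*W, size, size, bands).
    """
    pad = size // 2
    padded = np.pad(image, ((pad, pad), (pad, pad), (0, 0)), mode="reflect")
    h, w, _ = image.shape
    patches = np.empty((h * w,) + (size, size, image.shape[2]))
    for i in range(h):
        for j in range(w):
            patches[i * w + j] = padded[i:i + size, j:j + size, :]
    return patches

# Toy 6-band "Landsat-like" tile, 8 x 8 pixels (placeholder data).
tile = rng.random((8, 8, 6))
patches = extract_patches(tile)  # (64, 3, 3, 6)

# One convolutional layer: 32 learnable 3x3 filters over all bands,
# randomly initialised here purely for illustration.
filters = rng.standard_normal((32, 3, 3, 6)) * 0.1
features = np.maximum(0.0, np.einsum("nijb,kijb->nk", patches, filters))  # ReLU

# A linear "water vs non-water" head producing a per-pixel flood mask.
weights = rng.standard_normal(32)
flood_map = (features @ weights > 0).reshape(8, 8)  # boolean mask
```

The sketch shows why the spatio-contextual approach differs from pixel-wise classification: each pixel's label depends on a 3 × 3 × bands patch rather than on its single spectral signature alone.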
PUBLICATION RECORD
- Publication year
2019
- Venue
Remote Sensing
- Publication date
2019-10-08
- Fields of study
Geology, Computer Science, Engineering, Environmental Science
- Source metadata
Semantic Scholar