The precise detection of surface water bodies from synthetic aperture radar (SAR) images is crucial for flood mitigation, disaster reduction, and water resource planning. Although SAR has proven capable of providing useful information for water-body detection, a single SAR feature is still insufficient for high-precision classification. To fully exploit both the backscatter intensity and the polarimetric features of SAR images, we propose the dual-branch fusion network (DBFNet), a semantic segmentation model that integrates the two feature types. Specifically, DBFNet employs a dual-branch architecture that combines the complementary information of both feature types through a layer feature fusion module and refines multiscale features at different levels through an intermediate feature refinement module. The performance of the proposed DBFNet is evaluated in comparative experiments against five deep learning models: FCN, U-Net, DeepLabv3+, FWENet, and FFEDN. The experimental results demonstrate that DBFNet achieves the highest water-body detection accuracy, with an intersection over union of 89.28% and an …
Keywords: Synthetic aperture radar, Backscatter, Polarimetry, Feature fusion, Image fusion, Data modeling, Semantics
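The abstract only names the dual-branch design and its fusion modules, so the following PyTorch sketch is an illustrative assumption of how two encoders (one for backscatter intensity, one for polarimetric features) could be fused at matching scales; the class names `DBFNetSketch` and `LayerFeatureFusion`, the channel widths, and the concat-plus-1x1-conv fusion are hypothetical and are not taken from the paper.

```python
# Hedged sketch only: the real DBFNet layer feature fusion and intermediate
# feature refinement modules are not specified in the abstract.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    """3x3 conv -> BN -> ReLU, downsampling by 2 (assumed encoder stage)."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, stride=2, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


class LayerFeatureFusion(nn.Module):
    """Fuse same-scale features from the two branches (assumed: concat + 1x1 conv)."""

    def __init__(self, ch):
        super().__init__()
        self.fuse = nn.Conv2d(2 * ch, ch, 1)

    def forward(self, intensity_feat, polar_feat):
        return self.fuse(torch.cat([intensity_feat, polar_feat], dim=1))


class DBFNetSketch(nn.Module):
    """Two parallel encoders fused per level, followed by a simple head that
    predicts a binary water / non-water mask."""

    def __init__(self, intensity_ch=1, polar_ch=3, widths=(32, 64, 128)):
        super().__init__()
        self.enc_i, self.enc_p, self.fusers = nn.ModuleList(), nn.ModuleList(), nn.ModuleList()
        ci, cp = intensity_ch, polar_ch
        for w in widths:
            self.enc_i.append(conv_block(ci, w))
            self.enc_p.append(conv_block(cp, w))
            self.fusers.append(LayerFeatureFusion(w))
            ci = cp = w
        self.head = nn.Conv2d(widths[-1], 1, 1)  # water-mask logits
        self.scale = 2 ** len(widths)            # total downsampling factor

    def forward(self, intensity, polarimetric):
        fused = None
        for enc_i, enc_p, fuse in zip(self.enc_i, self.enc_p, self.fusers):
            intensity = enc_i(intensity)
            polarimetric = enc_p(polarimetric)
            fused = fuse(intensity, polarimetric)
        logits = self.head(fused)
        # Upsample logits back to the input resolution.
        return nn.functional.interpolate(
            logits, scale_factor=self.scale, mode="bilinear", align_corners=False
        )


if __name__ == "__main__":
    # Example: one intensity channel and three polarimetric-feature channels.
    model = DBFNetSketch()
    out = model(torch.randn(1, 1, 256, 256), torch.randn(1, 3, 256, 256))
    print(out.shape)  # torch.Size([1, 1, 256, 256])
```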