BFFNet: a bidirectional feature fusion network for semantic segmentation of remote sensing objects

Yandong Hou (School of Artificial Intelligence, Henan University, Zhengzhou, China)
Zhengbo Wu (School of Artificial Intelligence, Henan University, Zhengzhou, China)
Xinghua Ren (School of Artificial Intelligence, Henan University, Zhengzhou, China)
Kaiwen Liu (School of Artificial Intelligence, Henan University, Zhengzhou, China)
Zhengquan Chen (School of Artificial Intelligence, Henan University, Zhengzhou, China)

International Journal of Intelligent Computing and Cybernetics

ISSN: 1756-378X

Article publication date: 3 August 2023

Issue publication date: 29 February 2024

Abstract

Purpose

High-resolution remote sensing images carry a wealth of semantic information. However, they often contain objects of widely varying sizes and spatial distributions, which makes semantic segmentation challenging. In this paper, a bidirectional feature fusion network (BFFNet) is designed to address this challenge, with the aim of recognizing surface objects more accurately so that ground features can be classified effectively.

Design/methodology/approach

BFFNet has two crucial elements. First, a mean-weighted module (MWM) is used to extract the key features in the main network. Second, the proposed polarization-enhanced branch network performs feature extraction in parallel with the main network to capture complementary feature information. The authors then fuse these two sets of features in both directions and apply a cross-entropy loss function to supervise network training. Finally, BFFNet is validated on two publicly available datasets, Potsdam and Vaihingen.
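To make the bidirectional fusion idea concrete, the following is a minimal PyTorch sketch of fusing a main-network feature map with a branch feature map in both directions. It is an illustration only: the gating scheme, module names and tensor shapes are assumptions, and it does not reproduce the paper's mean-weighted module or polarization-enhanced branch.

# Illustrative sketch only: the actual MWM and polarization-enhanced branch
# from the paper are not reproduced; the gating scheme below is an assumption.
import torch
import torch.nn as nn


class BidirectionalFusion(nn.Module):
    """Fuse a main-network feature map and a branch feature map in both
    directions: each stream is re-weighted by a gate computed from the
    other, then the two re-weighted streams are combined."""

    def __init__(self, channels: int):
        super().__init__()
        # 1x1 convolutions produce per-pixel gates from the opposite stream.
        self.gate_from_branch = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=1), nn.Sigmoid()
        )
        self.gate_from_main = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=1), nn.Sigmoid()
        )
        self.merge = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, main_feat, branch_feat):
        # Branch -> main direction: branch features gate the main features.
        main_refined = main_feat * self.gate_from_branch(branch_feat)
        # Main -> branch direction: main features gate the branch features.
        branch_refined = branch_feat * self.gate_from_main(main_feat)
        # Concatenate and project back to the original channel count.
        return self.merge(torch.cat([main_refined, branch_refined], dim=1))


if __name__ == "__main__":
    fusion = BidirectionalFusion(channels=64)
    main_feat = torch.randn(1, 64, 32, 32)    # hypothetical main-network features
    branch_feat = torch.randn(1, 64, 32, 32)  # hypothetical branch-network features
    fused = fusion(main_feat, branch_feat)
    print(fused.shape)  # torch.Size([1, 64, 32, 32])
    # Training would supervise per-pixel class logits with cross-entropy,
    # e.g. nn.CrossEntropyLoss(), as described in the abstract.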

Findings

Quantitative analysis of the experimental results on the two datasets shows that the proposed network outperforms other mainstream segmentation networks by 2–6%. Complete ablation experiments are also conducted to demonstrate the effectiveness of the network's components. In summary, BFFNet proves effective in accurately identifying small objects and in reducing the effect of shadows on the segmentation process.

Originality/value

The originality of the paper lies in the proposal of BFFNet, which combines multi-scale and multi-attention strategies to improve the segmentation of high-resolution, complex remote sensing images, especially small objects and shadow-obscured objects.

Acknowledgements

This work was funded by the National Natural Science Foundation of China (No: 61374134) and Postgraduate Cultivating Innovation and Quality Improvement Action Plan of Henan University (No: SYLYC2022081).

Citation

Hou, Y., Wu, Z., Ren, X., Liu, K. and Chen, Z. (2024), "BFFNet: a bidirectional feature fusion network for semantic segmentation of remote sensing objects", International Journal of Intelligent Computing and Cybernetics, Vol. 17 No. 1, pp. 20-37. https://doi.org/10.1108/IJICC-03-2023-0053

Publisher

Emerald Publishing Limited

Copyright © 2023, Emerald Publishing Limited
