PlumX Metrics

DAFCNN: A Dual-Channel Feature Extraction and Attention Feature Fusion Convolution Neural Network for SAR Image and MS Image Fusion

Remote Sensing, ISSN: 2072-4292, Vol: 15, Issue: 12
2023
  • Citations: 2
  • Usage: 0
  • Captures: 2
  • Mentions: 2
  • Social Media: 0

Metrics Details

  • Citations: 2
  • Captures: 2
  • Mentions: 2
    • Blog Mentions: 1
      • Blog: 1
    • News Mentions: 1
      • News: 1

Most Recent News

Hefei University of Technology Researchers Publish Findings in Remote Sensing (DAFCNN: A Dual-Channel Feature Extraction and Attention Feature Fusion Convolution Neural Network for SAR Image and MS Image Fusion)

2023 JUL 12 (NewsRx) -- By a News Reporter-Staff News Editor at Network Daily News -- Current study results on remote sensing have been published.

Article Description

In the field of image fusion, spatial detail blurring and color distortion arise when synthetic aperture radar (SAR) images and multispectral (MS) images are fused by traditional methods, owing to the differences between the sensors' imaging mechanisms. To address this problem, this paper proposes a convolutional-neural-network-based method for fusing SAR and MS images. To exploit the spatial information and multi-scale features of the high-resolution SAR image, a dual-channel feature extraction module is constructed to obtain a SAR feature map. In addition, unlike the common strategy of direct addition, an attention-based feature fusion module is designed to preserve the spectral fidelity of the fused images. To give the network better spectral and spatial retention, an unsupervised joint loss function is designed to train the network. Sentinel-1 SAR images and Landsat 8 MS images serve as the experimental datasets. The experimental results show that the proposed algorithm outperforms traditional fusion methods and deep learning algorithms in both quantitative metrics and visual quality.
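The two ideas in the abstract — replacing direct addition with attention-weighted fusion, and training with an unsupervised joint loss balancing spatial and spectral fidelity — can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the attention weights here come from a simple softmax over channel-mean activations, and `alpha` is a hypothetical balancing weight standing in for the learned modules and loss terms described in the article.

```python
import numpy as np

def softmax(x, axis=0):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_fuse(sar_feat, ms_feat):
    """Fuse two (C, H, W) feature maps with per-pixel attention weights
    instead of direct addition. The weights are a softmax over each
    branch's channel-mean activation -- an illustrative stand-in for
    the paper's learned attention-based feature fusion module."""
    # per-pixel salience of each branch: mean over channels -> (2, H, W)
    scores = np.stack([sar_feat.mean(axis=0), ms_feat.mean(axis=0)])
    w = softmax(scores, axis=0)        # weights sum to 1 at every pixel
    return w[0] * sar_feat + w[1] * ms_feat  # broadcast over channels

def joint_loss(fused, sar, ms, alpha=0.5):
    """Unsupervised joint loss sketch: a spatial term pulling the fused
    result toward the SAR detail plus a spectral term pulling it toward
    the MS image; `alpha` (hypothetical) balances the two objectives."""
    spatial = np.mean((fused - sar) ** 2)
    spectral = np.mean((fused - ms) ** 2)
    return alpha * spatial + (1.0 - alpha) * spectral
```

Because the attention weights sum to one at every pixel, the fused map is a per-pixel convex combination of the two branches, which is what lets such a module trade off SAR spatial detail against MS spectral fidelity rather than simply summing them.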

Bibliographic Details
