Computer and Modernization ›› 2025, Vol. 0 ›› Issue (04): 89-95. doi: 10.3969/j.issn.1006-2475.2025.04.014


Infrared and Visible Image Fusion Based on Twin Axial-attention and Dual-discriminator Generative Adversarial Network


  (1. School of Information Engineering, Shenyang University of Chemical Technology, Shenyang 110142, China;
  2. Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang 110016, China)
  Online: 2025-04-30    Published: 2025-04-30

Abstract: For the same scene, a fused infrared and visible image can preserve both the thermal radiation of foreground targets and the texture details of the background, giving a more comprehensive and accurate description than either source alone. However, many classical deep-learning-based fusion algorithms suffer from insufficient information retention and unbalanced feature fusion. To address these problems, an image fusion algorithm based on twin axial-attention and a dual-discriminator generative adversarial network is proposed. The generator uses a double-dense convolutional network as a multi-scale feature extractor and introduces a spatially enhanced branch and twin axial attention to capture both local information and long-range dependencies. An adversarial game is constructed between the two discriminators and the generator, and the degree to which differential features are retained is balanced by constraining the similarity between each source image and the fused image. A perceptual loss based on a pre-trained VGG19 overcomes the loss of high-level information such as semantic features. Experimental results on the TNO dataset show that the proposed method produces fused images with clear textures and outperforms other classical algorithms in both subjective and objective evaluations, demonstrating its effectiveness.
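The abstract names a double-dense convolutional backbone but gives no configuration, so the block below is a minimal sketch of the DenseNet-style feature reuse such an extractor would stack: each layer sees the concatenation of all earlier feature maps. The class name, growth rate, and depth are illustrative assumptions, not the paper's settings.

```python
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """DenseNet-style block: every conv layer receives the concatenation of
    all preceding feature maps, so shallow and deep features are both reused.
    A sketch only; growth rate and depth are assumed values."""
    def __init__(self, in_channels, growth=16, num_layers=3):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.Conv2d(channels, growth, kernel_size=3, padding=1),
                nn.ReLU(inplace=True)))
            channels += growth  # next layer sees everything produced so far
    def forward(self, x):
        feats = [x]
        for layer in self.layers:
            feats.append(layer(torch.cat(feats, dim=1)))
        return torch.cat(feats, dim=1)  # (B, in_channels + num_layers*growth, H, W)
```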
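The title's twin axial-attention is described only at a high level; the following sketch shows the generic idea it builds on, namely factoring 2-D self-attention into two 1-D passes, one along the image height and one along the width, which captures long-range dependencies far more cheaply than full 2-D attention. The class, head count, and the sequential "twin" pairing shown in the usage lines are assumptions for illustration, not the paper's exact module.

```python
import torch
import torch.nn as nn

class AxialAttention(nn.Module):
    """Self-attention applied along a single spatial axis (height or width).
    A minimal sketch of axial attention, not the paper's exact design."""
    def __init__(self, channels, heads=4, axis="h"):
        super().__init__()
        self.axis = axis
        self.attn = nn.MultiheadAttention(channels, heads, batch_first=True)

    def forward(self, x):                       # x: (B, C, H, W)
        b, c, h, w = x.shape
        if self.axis == "h":                    # each column becomes a sequence
            seq = x.permute(0, 3, 2, 1).reshape(b * w, h, c)
        else:                                   # each row becomes a sequence
            seq = x.permute(0, 2, 3, 1).reshape(b * h, w, c)
        out, _ = self.attn(seq, seq, seq)       # 1-D self-attention per row/column
        if self.axis == "h":
            out = out.reshape(b, w, h, c).permute(0, 3, 2, 1)
        else:
            out = out.reshape(b, h, w, c).permute(0, 3, 1, 2)
        return x + out                          # residual connection

# Assumed "twin" composition: a height-axis pass followed by a width-axis
# pass, together approximating full 2-D attention.
twin = nn.Sequential(AxialAttention(64, axis="h"),
                     AxialAttention(64, axis="w"))
y = twin(torch.randn(1, 64, 32, 32))            # -> (1, 64, 32, 32)
```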
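The dual-discriminator game and the VGG19 perceptual term can also be made concrete with a short sketch. Assumptions not stated in the abstract: a least-squares GAN objective, single-channel source images, and a relu4_4 feature cut-off for VGG19; d_ir and d_vis stand in for the two discriminators.

```python
import torch
import torch.nn.functional as F
from torchvision.models import vgg19, VGG19_Weights

# Frozen VGG19 feature extractor for the perceptual loss. Cutting at
# features[:27] (through relu4_4) is an assumption; the abstract only
# says "pre-trained VGG19".
vgg = vgg19(weights=VGG19_Weights.DEFAULT).features[:27].eval()
for p in vgg.parameters():
    p.requires_grad_(False)

def perceptual_loss(fused, ir, vis):
    """Pull the fused image's high-level VGG19 features toward both sources,
    so semantic-level content from either modality is not lost."""
    def feats(x):
        # VGG expects 3 channels; replicate the assumed single-channel input.
        return vgg(x.expand(-1, 3, -1, -1))
    f = feats(fused)
    return F.mse_loss(f, feats(ir)) + F.mse_loss(f, feats(vis))

def generator_adversarial_loss(d_ir, d_vis, fused):
    """Least-squares GAN term against both discriminators: the generator must
    make the fused image plausible to the infrared discriminator AND the
    visible one, balancing how much of each source's differential features
    is retained."""
    p_ir, p_vis = d_ir(fused), d_vis(fused)
    return (F.mse_loss(p_ir, torch.ones_like(p_ir)) +
            F.mse_loss(p_vis, torch.ones_like(p_vis)))
```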

Key words: image fusion, generative adversarial networks, axial-attention module, dual-discriminators, DenseNet
