Computer and Modernization ›› 2021, Vol. 0 ›› Issue (12): 79-84.


Deep Connected Ultra-lightweight Subspace Attention Mechanism

  

  1. (School of Information and Electronic Engineering, Zhejiang Gongshang University, Hangzhou 310018, China)
  • Online: 2021-12-24  Published: 2021-12-24

Abstract: To reduce the heavy computation and parameter overheads incurred when deploying existing attention mechanisms in compact convolutional neural networks, an improved ultra-lightweight subspace attention mechanism is proposed. First, the deep connected subspace attention mechanism (DCSAM) divides the feature map into several feature subspaces and infers a separate attention map for each subspace. Second, the spatial calibration method within each feature subspace is improved. Finally, connections are established between preceding and succeeding feature subspaces so that information can flow between them. The subspace attention mechanism enables multi-scale, multi-frequency feature representation, which is better suited to fine-grained image classification, and the method is orthogonal and complementary to the attention mechanisms used in existing vision models. Experimental results show that on the ImageNet-1K and Stanford Cars datasets, the top accuracy of MobileNetV2 is improved by about 0.48 and 2 percentage points respectively, while the number of parameters and floating-point operations are reduced by 12% and 24%.
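The abstract only outlines the mechanism, so the following is a minimal illustrative sketch, not the authors' implementation: it splits the channels of a feature map into subspaces, derives a spatial attention map per subspace, and mixes in the previous subspace's attention to imitate the "deep connection" between front and back subspaces. The pooling choice, the mixing weight `alpha`, and the residual combination are all assumptions made here for illustration.

```python
import numpy as np

def softmax2d(x):
    # Softmax over all spatial positions of a 2-D map.
    e = np.exp(x - x.max())
    return e / e.sum()

def dcsam_sketch(feature_map, num_subspaces, alpha=0.5):
    """Hypothetical sketch of deep connected subspace attention.

    feature_map: array of shape (C, H, W); C must divide evenly
    into num_subspaces groups.
    alpha: assumed mixing weight linking consecutive subspaces
    (not specified in the abstract).
    """
    C, H, W = feature_map.shape
    g = C // num_subspaces
    out = np.empty_like(feature_map)
    prev_attn = None
    for i in range(num_subspaces):
        sub = feature_map[i * g:(i + 1) * g]      # one feature subspace
        # Per-subspace spatial attention from a channel max-pool (assumption).
        attn = softmax2d(sub.max(axis=0))
        if prev_attn is not None:
            # "Deep connection": let attention information flow
            # from the preceding subspace into the current one.
            attn = alpha * attn + (1 - alpha) * prev_attn
            attn = attn / attn.sum()              # keep it a distribution
        # Attention-reweighted features with a residual connection.
        out[i * g:(i + 1) * g] = sub * attn + sub
        prev_attn = attn
    return out
```

Because each subspace is processed with only a pooling step and a spatial softmax, the per-subspace cost stays far below that of a full channel-attention block, which is consistent with the ultra-lightweight goal stated in the abstract.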

Key words: compact, attention mechanism, deep connection, feature subspace