Computer and Modernization ›› 2024, Vol. 0 ›› Issue (12): 1-9.doi: 10.3969/j.issn.1006-2475.2024.12.001


Intent-based Lightweight Self-Attention Network for Sequential Recommendation


  1. (School of Computer Science and Technology, Guangdong University of Technology, Guangzhou 510006, China)
  • Online: 2024-12-31  Published: 2024-12-31

Abstract: Existing sequential recommendation models suffer from the excessive parameter count of the self-attention mechanism and make insufficient use of the preference information contained in users' shopping intentions. This paper proposes an intent-based lightweight self-attention network for sequential recommendation. On top of the conventional item-sequence embedding, the model introduces an intention-sequence embedding to further mine the transition patterns between sequences. To reduce the computational complexity of pairwise item/intention self-attention within a sequence, a convolutional segmentation sampling module is designed that divides the user behavior sequence and the intention sequence into multiple segments, mapping user interests onto these segments; self-attention is then applied to capture the dependencies between segments, effectively reducing the number of parameters. Comparative experiments are conducted on three public datasets: MovieLens-1M, Yelp, and Amazon-Books. Compared with baseline models, the hit rate, normalized discounted cumulative gain, and mean reciprocal rank improve by 5.32%, 4.40%, and 5.51% on MovieLens-1M, by 30.93%, 22.73%, and 28.84% on Yelp, and by 7.78%, 11.55%, and 13.98% on Amazon-Books, verifying the effectiveness of the proposed model.
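The segment-level attention idea in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the function names (`conv_segment`, `self_attention`), the mean pooling standing in for the learned convolutional segmentation sampling, and the single-head unprojected attention are all assumptions chosen to show why attending over segments instead of individual positions shrinks the pairwise interaction count.

```python
import numpy as np

def conv_segment(seq_emb, seg_len):
    """Split a (T, d) sequence embedding into T // seg_len segments and
    pool each segment (mean pooling here; the paper uses a learned
    convolutional segmentation sampling module)."""
    T, d = seq_emb.shape
    n_seg = T // seg_len
    segments = seq_emb[: n_seg * seg_len].reshape(n_seg, seg_len, d)
    return segments.mean(axis=1)              # (n_seg, d)

def self_attention(x):
    """Plain scaled dot-product self-attention over segment embeddings
    (no learned Q/K/V projections, for brevity)."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)             # (n_seg, n_seg)
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x                        # (n_seg, d)

rng = np.random.default_rng(0)
T, d, seg_len = 12, 8, 4
item_seq = rng.normal(size=(T, d))            # item (product) sequence embedding
intent_seq = rng.normal(size=(T, d))          # intention sequence embedding

# Attention runs over n_seg = 3 segments instead of T = 12 positions,
# so pairwise interactions drop from T*T = 144 to (T/seg_len)**2 = 9.
item_out = self_attention(conv_segment(item_seq, seg_len))
intent_out = self_attention(conv_segment(intent_seq, seg_len))
print(item_out.shape, intent_out.shape)       # (3, 8) (3, 8)
```

In the model itself, the item and intention streams are embedded separately and their segment-level dependencies are captured jointly; the sketch only shows the per-stream shape reduction.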

Key words: sequential recommendation; intent recommendation; convolutional neural network; self-attention mechanism
