Computer and Modernization ›› 2023, Vol. 0 ›› Issue (10): 9-16.doi: 10.3969/j.issn.1006-2475.2023.10.002


Sequence Recommendation Model Based on Dynamic Convolution and Self-attention


  1. (School of Computer Science and Technology, Guangdong University of Technology, Guangzhou 510006, China)
  • Online: 2023-10-26   Published: 2023-10-26

Abstract: Sequence recommendation dynamically models user interests from the historical interaction records between users and items and recommends the next item. Modeling user interests from a sequence is usually divided into long-term and short-term interest dependencies. Existing methods either split the sequence by interaction order and model long-term and short-term interest dependencies separately, or apply different feature extraction techniques in parallel to the same interaction sequence to obtain global and local interest representations; both ignore the fact that a user's intention at each moment is embedded in the behavioral context of that moment. This paper proposes DConvSA, which models dynamic interests with dynamic convolution and self-attention. Dynamic convolution extracts local dynamic interests, generating convolution kernels from different context items to adaptively compute item importance; combined with the self-attention mechanism, it captures the salient item dependencies across the whole sequence. At the same time, the global and local interest dependencies at each moment are fused explicitly to better model the relationship between interests at different moments. Experiments are conducted on three public datasets, using recall, mean reciprocal rank (MRR), and normalized discounted cumulative gain (NDCG) for evaluation. The results show that recall, MRR, and NDCG improve by at least 1.53%, 3.77%, and 3.28% on the MovieLens-1M dataset, 1.86%, 1.94%, and 2.46% on the Amazon Beauty dataset, and 0.22%, 0.97%, and 1.08% on the Steam dataset.
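
To make the described architecture concrete, the following is a minimal, hedged PyTorch sketch of a DConvSA-style block: a dynamic convolution whose kernels are generated from each context item (local interest), a causal self-attention layer (global interest), and an explicit gated fusion of the two at every time step. The class names, kernel size, hidden size, and the sigmoid-gate fusion are illustrative assumptions, not the authors' released implementation.

# Minimal illustrative sketch (assumptions, not the authors' code): a DConvSA-style
# block combining dynamic convolution (local interest) with causal self-attention
# (global interest) and an explicit gated fusion at each time step.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicConv(nn.Module):
    """Depthwise dynamic convolution: the kernel at each position is generated
    from that position's own representation (its context item)."""
    def __init__(self, hidden_size: int, kernel_size: int = 3, num_heads: int = 2):
        super().__init__()
        self.kernel_size = kernel_size
        self.num_heads = num_heads
        self.head_dim = hidden_size // num_heads
        # Projects each item representation to per-head kernel weights.
        self.kernel_proj = nn.Linear(hidden_size, num_heads * kernel_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden)
        b, t, d = x.size()
        k = self.kernel_size
        # Position-specific kernels, softmax-normalized over the kernel width.
        kernels = F.softmax(self.kernel_proj(x).view(b, t, self.num_heads, k), dim=-1)
        # Left-pad so every position only convolves over preceding items (causal).
        x_pad = F.pad(x, (0, 0, k - 1, 0))
        # Sliding window of the k most recent items for every position.
        windows = x_pad.unfold(1, k, 1)                       # (b, t, hidden, k)
        windows = windows.reshape(b, t, self.num_heads, self.head_dim, k)
        # Weighted sum of each window with its dynamically generated kernel.
        out = torch.einsum('bthdk,bthk->bthd', windows, kernels)
        return out.reshape(b, t, d)

class DConvSABlock(nn.Module):
    """Fuses local (dynamic convolution) and global (self-attention) interest."""
    def __init__(self, hidden_size: int = 64, num_heads: int = 2, kernel_size: int = 3):
        super().__init__()
        self.local = DynamicConv(hidden_size, kernel_size, num_heads)
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.gate = nn.Linear(2 * hidden_size, hidden_size)
        self.norm = nn.LayerNorm(hidden_size)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        t = x.size(1)
        # Causal mask: True marks positions that may not be attended to.
        causal = torch.triu(torch.ones(t, t, dtype=torch.bool, device=x.device), 1)
        global_rep, _ = self.attn(x, x, x, attn_mask=causal)
        local_rep = self.local(x)
        # Explicit fusion: a sigmoid gate mixes local and global interest per step.
        g = torch.sigmoid(self.gate(torch.cat([local_rep, global_rep], dim=-1)))
        return self.norm(x + g * local_rep + (1 - g) * global_rep)

if __name__ == "__main__":
    block = DConvSABlock()
    seq = torch.randn(8, 20, 64)   # 8 users, 20 interactions, 64-dim item embeddings
    print(block(seq).shape)        # torch.Size([8, 20, 64])

The gated fusion is one plausible way to realize the "explicit" combination of global and local interest mentioned in the abstract; a concatenation followed by a feed-forward layer would be an equally reasonable variant.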

Key words: sequence recommendation, dynamic convolution, self-attention, local interest, global interest

CLC Number: