Computer and Modernization ›› 2024, Vol. 0 ›› Issue (12): 1-9. doi: 10.3969/j.issn.1006-2475.2024.12.001

• Artificial Intelligence •

Intent-based Lightweight Self-Attention Network for Sequential Recommendation

  1. (School of Computer Science and Technology, Guangdong University of Technology, Guangzhou 510006, China)
  • Online: 2024-12-31  Published: 2024-12-31
  • Supported by:
    Key-Area Research and Development Program of Guangdong Province (2021B0101200002); Science and Technology Program of Guangdong Province (2019A050510041)


Abstract: In existing sequential recommendation models, the self-attention mechanism involves an excessive number of parameters, and the preference information contained in users' shopping intentions is not fully exploited. This paper proposes an intent-based lightweight self-attention network for sequential recommendation. On top of conventional item-sequence encoding, the model introduces intent-sequence encoding to further mine transition patterns between sequences. Meanwhile, to reduce the computational complexity of pairwise item/intent self-attention within a sequence, a convolutional segment-sampling module is designed that divides the user behavior sequence and the intent sequence into multiple segments, thereby mapping user interests onto multiple sequence segments; self-attention is then applied to capture dependencies between segments, effectively reducing the number of parameters. Comparative experiments are conducted on three public datasets: MovieLens-1M, Yelp, and Amazon-Books. The results show that, compared with baseline models, the hit rate, normalized discounted cumulative gain, and mean reciprocal rank improve by 5.32%, 4.40%, and 5.51% on MovieLens-1M, by 30.93%, 22.73%, and 28.84% on Yelp, and by 7.78%, 11.55%, and 13.98% on Amazon-Books, which verifies the effectiveness of the proposed model.
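The segment-level attention idea described in the abstract can be sketched as follows. This is a minimal pure-Python illustration, not the authors' implementation: average pooling stands in for the convolutional segment-sampling module, the embeddings are toy values, and all function names are hypothetical. The point it demonstrates is that attending over m segments instead of n items shrinks the pairwise score computation from n² to m².

```python
import math

def segment_pool(seq, seg_len):
    """Average-pool a sequence of embedding vectors into segments
    (a simplified stand-in for convolutional segment sampling)."""
    dim = len(seq[0])
    segments = []
    for start in range(0, len(seq), seg_len):
        chunk = seq[start:start + seg_len]
        segments.append([sum(v[d] for v in chunk) / len(chunk) for d in range(dim)])
    return segments

def self_attention(xs):
    """Plain scaled dot-product self-attention over a list of vectors."""
    dim = len(xs[0])
    out = []
    for q in xs:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(dim) for k in xs]
        m = max(scores)
        weights = [math.exp(s - m) for s in scores]          # softmax, numerically stable
        z = sum(weights)
        weights = [w / z for w in weights]
        out.append([sum(weights[j] * xs[j][d] for j in range(len(xs)))
                    for d in range(dim)])
    return out

# Toy behavior sequence: 8 items with 4-dimensional embeddings.
seq = [[float(i == j % 4) for i in range(4)] for j in range(8)]
segments = segment_pool(seq, seg_len=4)   # 8 items -> 2 segments
attended = self_attention(segments)

# Pairwise attention now runs over 2 segments instead of 8 items:
# 2*2 = 4 score computations rather than 8*8 = 64.
print(len(segments), len(attended[0]))    # 2 4
```

The same pooling would be applied to the intent sequence in parallel; in the paper the segment boundaries come from a learned convolution rather than fixed-size averaging.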

Key words: sequential recommendation; intent recommendation; convolutional neural network; self-attention mechanism

CLC Number: