Computer and Modernization ›› 2020, Vol. 0 ›› Issue (07): 61-64. doi: 10.3969/j.issn.1006-2475.2020.07.012

• Networks and Communication •

Short Text Sentiment Analysis Based on Self-Attention and Capsule Network

  1. (College of Computer and Communication Engineering, China University of Petroleum (East China), Qingdao 266580, Shandong, China)
  • Online: 2020-07-06  Published: 2020-07-15
  • About the author: XU Long (1993-), male, born in Dingxi, Gansu, master's student; research interests: natural language processing and data mining; E-mail: upczyxl@163.com.

Abstract: Sentiment analysis of short texts is a challenging task. To address the shortcoming that traditional convolutional neural networks and recurrent neural networks cannot fully capture the semantic information contained in text, this paper proposes a model that uses a multi-head self-attention layer as the feature extractor and a capsule network as the classification layer. The model can extract rich textual information and has strong expressive ability. Experimental results on Chinese text show that the proposed model improves the accuracy of sentiment analysis over traditional deep learning methods, and that it achieves substantially higher accuracy than traditional methods on small-sample datasets and in cross-domain transfer.
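
As a concrete illustration of the architecture described above, the sketch below shows one way such a model could be assembled: a multi-head self-attention layer produces contextual token features, which are routed into per-class capsules whose vector lengths serve as class scores. This is a minimal sketch assuming PyTorch; the class name SelfAttentionCapsuleClassifier, the squash and dynamic-routing details, and all dimensions are illustrative assumptions, not the authors' implementation.

# Minimal sketch (not the authors' code): multi-head self-attention as the
# feature extractor, followed by a capsule classification layer with
# dynamic routing. All names and hyperparameters are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

def squash(x, dim=-1, eps=1e-8):
    # Capsule squashing non-linearity: keeps vector direction,
    # maps vector length into [0, 1).
    norm_sq = (x ** 2).sum(dim=dim, keepdim=True)
    return (norm_sq / (1.0 + norm_sq)) * x / torch.sqrt(norm_sq + eps)

class SelfAttentionCapsuleClassifier(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, heads=8, num_classes=2,
                 caps_dim=16, routing_iters=3, max_len=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.pos = nn.Parameter(torch.zeros(1, max_len, emb_dim))
        # Multi-head self-attention layer as the feature extractor.
        self.attn = nn.MultiheadAttention(emb_dim, heads, batch_first=True)
        # Transformation matrices mapping each token "primary capsule"
        # to each class capsule.
        self.W = nn.Parameter(0.01 * torch.randn(1, max_len, num_classes,
                                                 caps_dim, emb_dim))
        self.routing_iters = routing_iters

    def forward(self, token_ids):
        # token_ids: (batch, seq_len), padded/truncated to at most max_len.
        x = self.embed(token_ids) + self.pos[:, :token_ids.size(1)]
        h, _ = self.attn(x, x, x)                        # (B, L, D)
        u = h.unsqueeze(2).unsqueeze(-1)                 # (B, L, 1, D, 1)
        u_hat = (self.W[:, :h.size(1)] @ u).squeeze(-1)  # (B, L, C, caps_dim)
        # Dynamic routing between token capsules and class capsules.
        b = torch.zeros(u_hat.shape[:3], device=u_hat.device)   # (B, L, C)
        for _ in range(self.routing_iters):
            c = F.softmax(b, dim=2).unsqueeze(-1)        # routing coefficients
            v = squash((c * u_hat).sum(dim=1))           # (B, C, caps_dim)
            b = b + (u_hat * v.unsqueeze(1)).sum(-1)     # agreement update
        # The length of each class capsule acts as the class score.
        return v.norm(dim=-1)                            # (B, num_classes)

# Usage example (hypothetical):
#   model = SelfAttentionCapsuleClassifier(vocab_size=30000, num_classes=2)
#   scores = model(torch.randint(0, 30000, (4, 64)))     # (4, 2) class scores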

Key words: sentiment analysis, self-attention mechanism, capsule network, few-shot learning, transfer learning

CLC Number: