Computer and Modernization ›› 2023, Vol. 0 ›› Issue (04): 26-31.

• Artificial Intelligence •

Event Extraction Method Based on BERT-BiLSTM-Attention Hybrid Model

  WEI Xin, HE Xiaohai, TENG Qizhi, QING Linbo, CHEN Honggang

  1. (College of Electronics and Information Engineering, Sichuan University, Chengdu 610065, China)
  • Online: 2023-05-09  Published: 2023-05-09
  • About the authors: WEI Xin (b. 1997), female, born in Wushan, Chongqing, master's student; research interests: natural language processing, computer vision; E-mail: 158426422@qq.com. Corresponding author: HE Xiaohai (b. 1964), male, born in Mianyang, Sichuan, Ph.D., professor, doctoral supervisor; research interests: image processing, pattern recognition, image communication; E-mail: 571730621@qq.com. TENG Qizhi (b. 1961), female, born in Chengdu, Sichuan, Ph.D., professor; research interests: digital image processing, pattern recognition, 3D image reconstruction and analysis, computer applications; E-mail: qzteng@scu.edu.cn. QING Linbo (b. 1982), male, born in Jianyang, Sichuan, Ph.D., associate professor, master's supervisor; research interests: image processing, pattern recognition, video communication; E-mail: qing_lb@scu.edu.cn. CHEN Honggang (b. 1991), male, born in Chengdu, Sichuan, Ph.D., assistant researcher; research interests: image/video understanding, restoration, and compression coding; E-mail: honggang_chen@scu.edu.cn.
  • Funding:
    Chengdu Major Science and Technology Application Demonstration Project (2019-YF09-00120-SN)

Abstract: Event extraction is a basic task in the field of information extraction, aiming to extract structured information from unstructured text. Most existing event extraction methods based on machine reading comprehension (MRC) models perform trigger-word detection and classification on the input text directly, and to some extent ignore the prediction error introduced by having to judge whether the input text describes an event at all. This paper therefore proposes an event extraction method based on a BERT-BiLSTM-Attention hybrid model. The method takes a BERT-based machine reading comprehension model as its backbone and poses extraction as multi-round question answering. An event classification and detection module is added on top of the existing reading comprehension model to reduce this prediction error, and a BiLSTM combined with an attention mechanism forms a historical session information module that filters out important information more effectively and fuses it into the reading comprehension model. Event extraction experiments on the public ACE 2005 dataset show that precision, recall, and F1 score improve over the base model by 7.8, 4.6, and 5.4 percentage points, respectively.

Key words: event extraction, machine reading comprehension, event classification, BiLSTM, attention mechanism
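For readers who want a concrete starting point, the sketch below shows one way the pipeline described in the abstract could be wired up in PyTorch with Hugging Face Transformers: a BERT encoder followed by a BiLSTM, attention pooling feeding a sentence-level event classification/detection head, and MRC-style span heads for the per-round answers. This is an illustrative reconstruction based only on the abstract, not the authors' released code; the layer sizes, head names, pretrained checkpoint, and the choice of 33 ACE 2005 event subtypes are all assumptions.

```python
import torch
import torch.nn as nn
from transformers import BertModel


class BertBiLstmAttention(nn.Module):
    """Illustrative sketch: BERT -> BiLSTM -> attention pooling, with an
    event-detection head and extractive-MRC span heads (not the paper's code)."""

    def __init__(self, bert_name="bert-base-uncased", lstm_hidden=256,
                 num_event_types=33):  # 33 ACE 2005 event subtypes (assumed)
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        dim = self.bert.config.hidden_size
        # BiLSTM re-encodes the BERT token states (e.g. for the history module).
        self.bilstm = nn.LSTM(dim, lstm_hidden, batch_first=True,
                              bidirectional=True)
        # One attention score per token, used to pool a sentence vector.
        self.attn = nn.Linear(2 * lstm_hidden, 1)
        # Event classification/detection head; extra class 0 = "not an event".
        self.event_head = nn.Linear(2 * lstm_hidden, num_event_types + 1)
        # Extractive-MRC span heads for the answer in each question round.
        self.start_head = nn.Linear(2 * lstm_hidden, 1)
        self.end_head = nn.Linear(2 * lstm_hidden, 1)

    def forward(self, input_ids, attention_mask):
        tokens = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        seq, _ = self.bilstm(tokens)                       # (B, T, 2H)
        scores = self.attn(seq).squeeze(-1)                # (B, T)
        scores = scores.masked_fill(attention_mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1)            # attention weights
        pooled = torch.einsum("bt,bth->bh", weights, seq)  # sentence vector
        return {
            "event_logits": self.event_head(pooled),           # is/which event
            "start_logits": self.start_head(seq).squeeze(-1),  # span starts
            "end_logits": self.end_head(seq).squeeze(-1),      # span ends
        }
```

In a multi-round question-answering setup along the lines the abstract describes, the same forward pass would be run once per question, with earlier answers encoded into the next round's input so that the BiLSTM-attention module can weight the relevant history before the span heads answer.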
