Computer and Modernization ›› 2025, Vol. 0 ›› Issue (06): 1-8. DOI: 10.3969/j.issn.1006-2475.2025.06.001


Entity-integrated Summarization Model Based on Improved Graph2Seq 

  

  1. (College of Artificial Intelligence & Automation, Hohai University, Nanjing 211106, China)
  • Online: 2025-06-30  Published: 2025-07-01

Abstract: To address the high computational resource consumption of summarization models and their limited attention to key named entities, this paper proposes a novel summarization model, Entity-Sparse-Attention Graph-to-Sequence (ESG2S), based on the Graph2Seq model. First, a graph is constructed from the syntactic dependency graph of the source text, enhanced with entity nodes extracted from that text. Second, this graph is fed into an encoder to learn the textual structure. Finally, the encoded graph is passed to an LSTM decoder integrated with Symmetric Divergence-Enhanced Sparse Attention to generate summaries. Experiments on the CNN/DM dataset show that the model outperforms several recent mainstream methods and effectively preserves entity information, producing summaries with better readability and comprehensiveness.
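The abstract does not give the formulation of the Symmetric Divergence-Enhanced Sparse Attention, so the following is only a minimal PyTorch sketch of the general idea: the decoder state attends over the encoded graph nodes with a sparse scoring function, so that irrelevant nodes receive exactly zero weight and entity nodes can dominate the context vector. Sparsemax (Martins & Astudillo, 2016) is used here as a hypothetical stand-in for the paper's attention; all names and dimensions (`SparseAttention`, `sparsemax`, `dec_dim`, `enc_dim`) are illustrative assumptions, not the authors' implementation.

```python
import torch

def sparsemax(scores, dim=-1):
    # Sparsemax: a sparse alternative to softmax that projects scores onto
    # the probability simplex, assigning exactly zero to weak positions.
    z, _ = torch.sort(scores, dim=dim, descending=True)
    cumsum = z.cumsum(dim) - 1
    k = torch.arange(1, scores.size(dim) + 1,
                     device=scores.device, dtype=scores.dtype)
    support = k * z > cumsum                  # positions kept in the support
    k_z = support.sum(dim=dim, keepdim=True)  # support size per row
    tau = cumsum.gather(dim, k_z - 1) / k_z   # threshold for truncation
    return torch.clamp(scores - tau, min=0.0)

class SparseAttention(torch.nn.Module):
    # Hypothetical stand-in for the paper's sparse attention: score each
    # encoded graph node against the decoder state, then sparsify.
    def __init__(self, dec_dim, enc_dim):
        super().__init__()
        self.proj = torch.nn.Linear(dec_dim, enc_dim, bias=False)

    def forward(self, dec_state, node_states):
        # dec_state: (batch, dec_dim); node_states: (batch, nodes, enc_dim)
        query = self.proj(dec_state).unsqueeze(-1)          # (B, enc_dim, 1)
        scores = torch.bmm(node_states, query).squeeze(-1)  # (B, nodes)
        weights = sparsemax(scores)            # many weights are exactly 0
        context = torch.bmm(weights.unsqueeze(1), node_states).squeeze(1)
        return context, weights                # context feeds the LSTM step

if __name__ == "__main__":
    attn = SparseAttention(dec_dim=256, enc_dim=128)
    dec_state = torch.randn(2, 256)         # decoder hidden state
    node_states = torch.randn(2, 40, 128)   # encoded graph (incl. entity nodes)
    context, weights = attn(dec_state, node_states)
    print(weights[0])                       # sparse: most entries are 0
```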

Key words: summarization, Graph2Seq, named entities, sparse attention
