Computer and Modernization ›› 2025, Vol. 0 ›› Issue (07): 112-118.doi: 10.3969/j.issn.1006-2475.2025.07.016


Bert-BiGRU-CRF with Self-attention Fusion for Text Causal Relationship Extraction

  1. (School of Computer Science, Xi’an Polytechnic University, Xi’an 710048, China)
  • Online: 2025-07-22  Published: 2025-07-22

Abstract: To address overlapping relations and long-distance dependencies in causal relation extraction from natural-language text, this paper introduces the tag2triplet algorithm to handle multiple causal triplets within a single sentence as well as embedded causality. The approach combines a causal labeling scheme with a deep learning architecture, minimizing feature engineering while effectively modeling causal relations. A self-attention mechanism is further integrated into the Bert-BiGRU-CRF model to capture long-distance dependencies between cause and effect mentions, allowing information to flow freely through the network and thereby extracting causal relations more accurately. To validate the effectiveness of this approach, the model is compared with the widely used BiLSTM-softmax, BiLSTM-CRF, and Flair+CLSTM-BiLSTM-CRF models in experiments on the SemEval-2010 Task 8 dataset. The results show that the proposed model achieves a higher F1 score of 83.44%.
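The tag-to-triplet conversion described above can be illustrated with a minimal sketch. The tag set (B-C/I-C for cause spans, B-E/I-E for effect spans, O for outside) and the pairing rule (every cause span paired with every effect span, so one sentence can yield multiple triplets) are illustrative assumptions, not the paper's actual tag2triplet algorithm, whose handling of embedded causality is defined in the full text.

```python
from itertools import product

def tag2triplet(tokens, tags):
    """Collect (cause, "cause-effect", effect) triplets from BIO-style tags.

    Hypothetical scheme: B-C/I-C mark cause spans, B-E/I-E mark effect
    spans, O is outside. Each cause span is paired with each effect span,
    so a sentence with several spans produces several triplets.
    """
    spans = {"C": [], "E": []}       # collected cause and effect spans
    role, current = None, []         # span currently being built
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):     # a new span begins
            if role:
                spans[role].append(" ".join(current))
            role, current = tag[2:], [tok]
        elif tag.startswith("I-") and role == tag[2:]:
            current.append(tok)      # continue the open span
        else:                        # O tag or inconsistent I- tag
            if role:
                spans[role].append(" ".join(current))
            role, current = None, []
    if role:                         # flush a span that ends the sentence
        spans[role].append(" ".join(current))
    return [(c, "cause-effect", e)
            for c, e in product(spans["C"], spans["E"])]

tokens = ["heavy", "rain", "caused", "flooding", "and", "landslides"]
tags   = ["B-C",  "I-C",  "O",      "B-E",      "O",   "B-E"]
print(tag2triplet(tokens, tags))
# one cause span, two effect spans -> two triplets
```

Pairing cause and effect spans by Cartesian product is what lets a single tagged sentence express more than one causal triplet, which is the overlapping-relation case the abstract mentions.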

Key words: causal relationship extraction; tag2triplet algorithm; Bert-BiGRU-CRF; self-attention
