[1] ZHANG L, ZHAO H. Named entity recognition for Chinese microblog with convolutional neural network[C]// 2017 13th International Conference on Natural Computation, Fuzzy Systems and Knowledge Discovery (ICNC-FSKD). IEEE, 2017:87-92.
[2] CHEN Y, ZHENG D Q, ZHAO T J. Chinese named entity relation extraction based on Deep Belief Nets[J]. Journal of Software, 2012, 23(10):2572-2585.
[3] CUCERZAN S, YAROWSKY D. Language independent named entity recognition combining morphological and contextual evidence[C]// 1999 Joint SIGDAT Conference on Empirical Methods in Natural Language Processing and Very Large Corpora. 1999:90-99.
[4] ZHOU G D, SU J. Named entity recognition using an HMM-based chunk tagger[C]// Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics. 2002:473-480.
[5] LIU X H, ZHANG S D, WEI F R, et al. Recognizing named entities in tweets[C]// Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies. 2011:359-367.
[6] LAMPLE G, BALLESTEROS M, SUBRAMANIAN S, et al. Neural architectures for named entity recognition[C]// Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2016:260-270.
[7] BAI B, HOU X, SHI S. Named entity recognition method based on CRF and BI-LSTM[J]. Journal of Beijing Information Science and Technology University (Natural Science Edition), 2018, 33(6):27-33.
[8] MA X Z, HOVY E. End-to-end sequence labeling via bi-directional LSTM-CNNs-CRF[C]// Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. 2016:1064-1074.
[9] LI M Y, KONG F. Named entity recognition for social media incorporating self-attention mechanism[J]. Journal of Tsinghua University (Science and Technology), 2019, 59(6):461-467.
[10] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]// Advances in Neural Information Processing Systems. 2017:5998-6008.
[11] QIN Y, SHEN G W, ZHAO W B, et al. Network security entity recognition method based on deep neural network[J]. Journal of Nanjing University (Natural Science), 2019, 55(1):29-40.
[12] SOCHER R, HUVAL B, MANNING C D, et al. Semantic compositionality through recursive matrix-vector spaces[C]// Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning. 2012:1201-1211.
[13] ZENG D J, LIU K, CHEN Y B, et al. Distant supervision for relation extraction via piecewise convolutional neural networks[C]// Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. 2015:1753-1762.
[14] ZHOU P, SHI W, TIAN J, et al. Attention-based bidirectional long short-term memory networks for relation classification[C]// Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. 2016:207-212.
[15] E H H, ZHANG W J, XIAO S Q, et al. Survey of entity relation extraction based on deep learning[J]. Journal of Software, 2019, 30(6):1793-1818.
[16] MIWA M, BANSAL M. End-to-end relation extraction using LSTMs on sequences and tree structures[C]// Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. 2016:1105-1116.
[17] LI F, ZHANG M S, FU G H, et al. A neural joint model for extracting bacteria and their locations[C]// Pacific-Asia Conference on Knowledge Discovery and Data Mining. Springer, 2017:15-26.
[18] KATIYAR A, CARDIE C. Going out on a limb: Joint extraction of entity mentions and relations without dependency trees[C]// Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. 2017:917-928.
[19] ZHENG S C, WANG F, BAO H Y, et al. Joint extraction of entities and relations based on a novel tagging scheme[C]// Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics. 2017:1227-1236.
[20] ZENG X R, ZENG D J, HE S Z, et al. Extracting relational facts by an end-to-end neural model with copy mechanism[C]// Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. 2018:506-514.
[21] FU T J, LI P H, MA W Y. GraphRel: Modeling text as relational graphs for joint entity and relation extraction[C]// Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. 2019:1409-1418.
[22] ZENG X R, HE S Z, ZENG D J, et al. Learning the extraction order of multiple relational facts in a sentence with reinforcement learning[C]// Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP). 2019:367-377.
[23] HANG T T, FENG J, WU Y R, et al. Joint extraction of entities and overlapping relations using source-target entity labeling[J]. Expert Systems with Applications, 2021, 177:114853.1-114853.15.
[24] YE H B, ZHANG N Y, DENG S M, et al. Contrastive triple extraction with generative transformer[C]// Proceedings of the AAAI Conference on Artificial Intelligence. 2021:14257-14265.
[25] WEI Z P, SU J L, WANG Y, et al. A novel cascade binary tagging framework for relational triple extraction[C]// Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. 2020:1476-1488.
[26] DEVLIN J, CHANG M W, LEE K, et al. BERT: Pre-training of deep bidirectional transformers for language understanding[C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2019:4171-4186.
[27] GOODFELLOW I J, SHLENS J, SZEGEDY C. Explaining and harnessing adversarial examples[J]. arXiv preprint arXiv:1412.6572, 2014.
[28] ZENG X R, ZENG D J, HE S Z, et al. Extracting relational facts by an end-to-end neural model with copy mechanism[C]// Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. 2018:506-514.