[1] LAMPLE G, BALLESTEROS M, SUBRAMANIAN S, et al. Neural architectures for named entity recognition[C]// Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2016:260-270.
[2] BENGIO Y, DUCHARME R, VINCENT P, et al. A neural probabilistic language model[J]. Journal of Machine Learning Research, 2003,3:1137-1155.
[3] MNIH A, HINTON G. A scalable hierarchical distributed language model[C]// Proceedings of the 21st International Conference on Neural Information Processing Systems. 2008:1081-1088.
[4] CHEN W L, ZHANG Y J, ISAHARA H. Chinese named entity recognition with conditional random fields[C]// Proceedings of the 5th SIGHAN Workshop on Chinese Language Processing. 2006:118-121.
[5] HE J Z, WANG H F. Chinese named entity recognition and word segmentation based on character[C]// Proceedings of the 6th SIGHAN Workshop on Chinese Language Processing. 2008:128-132.
[6] ZHANG Y, YANG J. Chinese NER using lattice LSTM[C]// Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. 2018:1554-1564.
[7] ZHU Y Y, WANG G X, KARLSSON B. CAN-NER: Convolutional attention network for Chinese named entity recognition[C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2019:3384-3393.
[8] DING R X, XIE P J, ZHANG X Y, et al. A neural multi-digraph model for Chinese NER with gazetteers[C]// Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. 2019:1462-1467.
[9] LIU W, XU T G, XU Q H, et al. An encoding strategy based word-character LSTM for Chinese NER[C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2019:2379-2389.
[10] HAMMERTON J. Named entity recognition with long short-term memory[C]// Proceedings of the 7th Conference on Natural Language Learning at HLT-NAACL. 2003:172-175.
[11] COLLOBERT R, WESTON J, BOTTOU L, et al. Natural language processing (almost) from scratch[J]. Journal of Machine Learning Research, 2011,12:2493-2537.
[12] FENG Y H, YU H, SUN G, et al. Domain term recognition method based on word embedding and conditional random fields[J]. Journal of Computer Applications, 2016,36(11):3146-3151.
[13] FENG Y H, YU H, SUN G, et al. Named entity recognition method based on BLSTM[J]. Computer Science, 2018,45(2):261-268.
[14] HE K M, ZHANG X Y, REN S Q, et al. Deep residual learning for image recognition[C]// Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition. 2016:770-778.
[15] SCHUSTER M, PALIWAL K K. Bidirectional recurrent neural networks[J]. IEEE Transactions on Signal Processing, 1997,45(11):2673-2681.
[16] GRAVES A, SCHMIDHUBER J. Framewise phoneme classification with bidirectional LSTM and other neural network architectures[J]. Neural Networks, 2005,18(5-6):602-610.
[17] BAHDANAU D, CHO K, BENGIO Y. Neural machine translation by jointly learning to align and translate[J]. arXiv:1409.0473, 2014.
[18] MIKOLOV T, CHEN K, CORRADO G, et al. Efficient estimation of word representations in vector space[J]. arXiv:1301.3781, 2013.
[19] MIKOLOV T, SUTSKEVER I, CHEN K, et al. Distributed representations of words and phrases and their compositionality[C]// Proceedings of the 26th International Conference on Neural Information Processing Systems. 2013:3111-3119.
[20] LING W, LUÍS T, MARUJO L, et al. Finding function in form: Compositional character models for open vocabulary word representation[C]// Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. 2015:1520-1530.
[21] DEVLIN J, CHANG M W, LEE K, et al. BERT: Pre-training of deep bidirectional transformers for language understanding[C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2019:4171-4186.