[1] MILO T, ZOHAR S. Using schema matching to simplify heterogeneous data translation[C]// Proceedings of the 24th VLDB Conference. 1998:24-27.
[2] YIN Zhangzhi, LI Xinzi, HUANG Degen, et al. Chinese named entity recognition with a fused character-word model[J]. Journal of Chinese Information Processing, 2019,33(11):95-100.
[3] RAU L F. Extracting company names from text[C]// Proceedings of the 7th IEEE Conference on Artificial Intelligence Application. 1991:29-32.
[4] COLLINS M, SINGER Y. Unsupervised models for named entity classification[C]// 1999 Joint SIGDAT Conference on Empirical Methods in Natural Language Processing and Very Large Corpora. 1999.
[5] LIU Liu, WANG Dongbo. A survey of named entity recognition research[J]. Journal of the China Society for Scientific and Technical Information, 2018,37(3):329-340.
[6] LI J, SUN A X, MA Y K. Neural named entity boundary detection[J]. IEEE Transactions on Knowledge and Data Engineering, 2020,33(4):1790-1795.
[7] YULITA I N, FANANY M I, ARYMURTHY A M. Bi-directional long short-term memory using quantized data of deep belief networks for sleep stage classification[J]. Procedia Computer Science, 2017,116:530-538.
[8] JIA C, LIANG X B, ZHANG Y. Cross-domain NER using cross-domain language modeling[C]// Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. 2019:2464-2474.
[9] LI Y, LONG G D, SHEN T, et al. Self-attention enhanced selective gate with entity-aware embedding for distantly supervised relation extraction[C]// Proceedings of the AAAI Conference on Artificial Intelligence. 2020,34(5):8269-8276.
[10] CHEN T, XU R F, HE Y L, et al. Improving sentiment analysis via sentence type classification using BiLSTM-CRF and CNN[J]. Expert Systems with Applications, 2017,72:221-230.
[11] SOUZA F, NOGUEIRA R, LOTUFO R. Portuguese named entity recognition using BERT-CRF[J]. arXiv preprint arXiv:1909.10649, 2019.
[12] LI C, LIU Y. Improving named entity recognition in tweets via detecting non-standard words[C]// Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing. 2015:929-938.
[13] AKBIK A, BERGMANN T, VOLLGRAF R. Pooled contextualized embeddings for named entity recognition[C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2019:724-728.
[14] LIU Y J, MENG F D, ZHANG J C, et al. GCDT: A global context enhanced deep transition architecture for sequence labeling[J]. arXiv preprint arXiv:1906.02437, 2019.
[15] LIU Y H, OTT M, GOYAL N, et al. RoBERTa: A robustly optimized BERT pretraining approach[J]. arXiv preprint arXiv:1907.11692, 2019.
[16] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]// Advances in Neural Information Processing Systems. 2017:5998-6008.
[17] DEVLIN J, CHANG M W, LEE K, et al. BERT: Pre-training of deep bidirectional transformers for language understanding[J]. arXiv preprint arXiv:1810.04805, 2018.
[18] ELMAN J L. Finding structure in time[J]. Cognitive Science, 1990,14(2):179-211.
[19] HUANG Z H, XU W, YU K. Bidirectional LSTM-CRF models for sequence tagging[J]. arXiv preprint arXiv:1508.01991, 2015.
[20] KALBFLEISCH J D, LAWLESS J F. The analysis of panel count data under a Markov assumption[J]. Journal of the American Statistical Association, 1985,80(392):863-871.
[21] VITERBI A J, WOLF J K, ZEHAVI E, et al. A pragmatic approach to trellis-coded modulation[J]. IEEE Communications Magazine, 1989,27(7):11-19.
[22] CAO S S, LU W, ZHOU J, et al. cw2vec: Learning Chinese word embeddings with stroke n-gram information[C]// The 32nd AAAI Conference on Artificial Intelligence. 2018.
[23] MIKOLOV T, CHEN K, CORRADO G, et al. Efficient estimation of word representations in vector space[J]. arXiv preprint arXiv:1301.3781, 2013.
[24] RATINOV L, ROTH D. Design challenges and misconceptions in named entity recognition[C]// Proceedings of the 13th Conference on Computational Natural Language Learning (CoNLL-2009). 2009:147-155.
[25] LOSHCHILOV I, HUTTER F. Fixing weight decay regularization in Adam[J]. arXiv preprint arXiv:1711.05101, 2019.