Computer and Modernization ›› 2021, Vol. 0 ›› Issue (07): 71-76.
Online: 2021-08-02
Published: 2021-08-02
ZHENG Xin-yue, REN Jun-chao. Intention Recognition and Classification Based on BERT-FNN[J]. Computer and Modernization, 2021, 0(07): 71-76.
[1] XU Kun, FENG Yan-song, ZHAO Dong-yan. Semantic understanding of Chinese natural language questions oriented to knowledge bases[J]. Journal of Peking University (Natural Science Edition), 2014,50(1):85-92.
[2] WANG X J, ZHANG L, MA W Y. Answer ranking in community question-answering sites: US 8346701[P]. 2013-01-01.
[3] ANGELINO E, LARUS-STONE N, ALABI D, et al. Learning certifiably optimal rule lists for categorical data[J]. The Journal of Machine Learning Research, 2017,18(1):8753-8830.
[4] GREGOROMICHELAKI E, KEMPSON R M, PURVER M, et al. Incrementality and intention-recognition in utterance processing[J]. Dialogue & Discourse, 2011,2(1):199-233.
[5] RAMANAND J, BHAVSAR K, PEDANEKAR N. Wishful thinking: Finding suggestions and 'buy' wishes from product reviews[C]// Proceedings of the NAACL HLT 2010 Workshop on Computational Approaches to Analysis and Generation of Emotion in Text. 2010:54-61.
[6] ZHANG D L, YAO L N, CHEN K X, et al. Making sense of spatio-temporal preserving representations for EEG-based human intention recognition[J]. IEEE Transactions on Cybernetics, 2019,50(7):3033-3044.
[7] CHEN Hao-chen. Mining Consumption Intention from Microblogs[D]. Harbin: Harbin Institute of Technology, 2014.
[8] LI Chao, CHAI Yu-mei, NAN Xiao-fei, et al. Research on question classification method based on deep learning[J]. Computer Science, 2016(12):115-119.
[9] KIM Y. Convolutional neural networks for sentence classification[J]. arXiv preprint arXiv:1408.5882, 2014.
[10] HASHEMI H B, ASIAEE A, KRAFT R. Query intent detection using convolutional neural networks[C]// Proceedings of the International Conference on Web Search and Data Mining, Workshop on Query Understanding. 2016. DOI: 10.1145/1235.
[11] RAVURI S, STOLCKE A. Recurrent neural network and LSTM models for lexical utterance classification[C]// Proceedings of the 16th Annual Conference of the International Speech Communication Association. 2015:1597-1600.
[12] DEY R, SALEM F M. Gate-variants of gated recurrent unit (GRU) neural networks[C]// Proceedings of the 2017 IEEE 60th International Midwest Symposium on Circuits and Systems (MWSCAS). 2017:1597-1600.
[13] RAVURI S, STOLCKE A. A comparative study of recurrent neural network models for lexical domain classification[C]// Proceedings of the 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). 2016:6075-6079.
[14] LIN Z H, FENG M W, SANTOS C N D, et al. A structured self-attentive sentence embedding[J]. arXiv preprint arXiv:1703.03130, 2017.
[15] CAI R, ZHU B, JI L, et al. An CNN-LSTM attention approach to understanding user query intent from online health communities[C]// Proceedings of the 2017 IEEE International Conference on Data Mining Workshops (ICDMW). 2017:430-437.
[16] DEVLIN J, CHANG M W, LEE K, et al. BERT: Pre-training of deep bidirectional transformers for language understanding[J]. arXiv preprint arXiv:1810.04805, 2018.
[17] SUN C, QIU X P, XU Y G, et al. How to fine-tune BERT for text classification?[C]// Proceedings of the China National Conference on Chinese Computational Linguistics. 2019:194-206.
[18] ETHAYARAJH K. How contextual are contextualized word representations? Comparing the geometry of BERT, ELMo, and GPT-2 embeddings[J]. arXiv preprint arXiv:1909.00512, 2019.
[19] TANG M, GANDHI P, KABIR M A, et al. Progress notes classification and keyword extraction using attention-based deep learning models with BERT[J]. arXiv preprint arXiv:1910.05786, 2019.
[20] ALSENTZER E, MURPHY J R, BOAG W, et al. Publicly available clinical BERT embeddings[J]. arXiv preprint arXiv:1904.03323, 2019.
[21] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. 2017:6000-6010.
[22] ZHANG Zhi-chang, ZHANG Zhen-wen, ZHANG Zhi-man. User intent classification based on IndRNN-Attention[J]. Journal of Computer Research and Development, 2019,56(7):1517-1524.
[23] LIU Y H, OTT M, GOYAL N, et al. RoBERTa: A robustly optimized BERT pretraining approach[J]. arXiv preprint arXiv:1907.11692, 2019.
[24] SZE V, CHEN Y H, YANG T J, et al. Efficient processing of deep neural networks: A tutorial and survey[J]. Proceedings of the IEEE, 2017,105(12):2295-2329.