[1] Bengio Y, Ducharme R, Vincent P, et al. A neural probabilistic language model[J]. Journal of Machine Learning Research, 2003,3(6):1137-1155.
[2] Graves A, Mohamed A R, Hinton G. Speech recognition with deep recurrent neural networks[C]//2013 IEEE International Conference on Acoustics, Speech and Signal Processing. 2013:6645-6649.
[3] Sutskever I, Vinyals O, Le Q V. Sequence to sequence learning with neural networks[C]// Proceedings of the 27th International Conference on Neural Information Processing Systems. 2014:3104-3112.
[4] LeCun Y, Bottou L, Bengio Y, et al. Gradient-based learning applied to document recognition[J]. Proceedings of the IEEE, 1998,86(11):2278-2324.
[5] Mikolov T, Sutskever I, Chen Kai, et al. Distributed representations of words and phrases and compositionality[C]// Proceedings of the 26th International Conference on Neural Information Processing Systems. 2013:3111-3119.
[6] Lai Siwei, Xu Liheng, Liu Kang, et al. Recurrent convolutional neural networks for text classification[C]// Proceedings of the 29th AAAI Conference on Artificial Intelligence. 2015:2267-2273.
[7] Kim Y. Convolutional neural networks for sentence classification[J]. arXiv preprint arXiv:1408.5882, 2014.
[8] Mou Lili, Peng Hao, Li Ge, et al. Discriminative neural sentence modeling by tree-based convolution[J]. arXiv preprint arXiv:1504.01106, 2015.
[9] Koutnik J, Greff K, Gomez F, et al. A clockwork RNN[J]. arXiv preprint arXiv:1402.3511, 2014.
[10] Cho K, Van Merrienboer B, Gulcehre C, et al. Learning phrase representations using RNN encoder-decoder for statistical machine translation[J]. arXiv preprint arXiv:1406.1078, 2014.
[11] Luong M T, Pham H, Manning C D. Effective approaches to attention-based neural machine translation[J]. arXiv preprint arXiv:1508.04025, 2015.
[12] Bahdanau D, Cho K, Bengio Y. Neural machine translation by jointly learning to align and translate[J]. arXiv preprint arXiv:1409.0473, 2014.
[13] Mendes A C, Wichert A. From symbolic to sub-symbolic information in question classification[J]. Artificial Intelligence Review, 2011,35(2):137-154.
[14] Socher R, Perelygin A, Wu J, et al. Recursive deep models for semantic compositionality over a sentiment treebank[C]// Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing (EMNLP 2013). 2013:1631-1642.
[15] Collobert R, Weston J, Bottou L, et al. Natural language processing (almost) from scratch[J]. Journal of Machine Learning Research, 2011,12(1):2493-2537.
[16] Zhang Chong. Research on text classification techniques based on an attention-based LSTM model[D]. Nanjing: Nanjing University, 2016.
[17] Yan Yan, Yin Xucheng, Li Sujian, et al. Hybrid deep belief network[J]. Computational Intelligence and Neuroscience, 2015(5):650527.
[18] Zhu Xiaodan, Sobhani P, Guo Hongyu. Learning document semantic representation with long short-term memory over recursive structures[C]// Proceedings of the 32nd International Conference on Machine Learning. 2015:1604-1612.