[1] Wang Hengsheng, Ren Jin, Li Xiyin. Natural spoken instructions understanding for rescue robot navigation based on cascaded conditional random fields[C]// The 9th International Conference on Human System Interactions (HSI). 2016:216-222.
[2] Li Hanqing, Fang Ning, Zhao Qunfei, et al. Instruction intention understanding based on deep learning with deep denoising autoencoders[J]. Journal of Shanghai Jiao Tong University, 2016,50(7):1102-1107.
[3] Yuan Shuming. Vehicle driving instruction extraction based on natural language understanding[D]. Beijing: Beijing University of Posts and Telecommunications, 2013.
[4] Yuan Shuming, Wang Xiaojie. Research on automatic semantic classification of human-interaction instructions[C]// IEEE International Conference on Cloud Computing and Intelligent Systems. 2012:1414-1419.
[5] Gao Shengnan, Kong Lingfu, Wu Peiliang. Autonomous processing of Chinese service instructions for indoor intelligent robots[J]. Robot, 2015(4):424-434.
[6] Hinton G E. Learning distributed representations of concepts[C]// Proceedings of the 8th Annual Conference of the Cognitive Science Society. 1986:1-12.
[7] Xu Wei, Rudnicky A I. Can artificial neural networks learn language models?[C]// Proceedings of International Conference on Speech and Language Processing. 2000:202-205.
[8] Bengio Y, Schwenk H, Senécal J S, et al. Neural probabilistic language models[J]. Journal of Machine Learning Research, 2003,3(6):1137-1155.
[9] Mnih A, Hinton G. A scalable hierarchical distributed language model[C]// Proceedings of the 21st International Conference on Neural Information Processing Systems. 2008:1081-1088.
[10] Mikolov T, Sutskever I, Chen K, et al. Distributed representations of words and phrases and their compositionality[J]. Advances in Neural Information Processing Systems, 2013,26:3111-3119.
[11] Mikolov T, Chen K, Corrado G, et al. Efficient estimation of word representations in vector space[J]. arXiv preprint arXiv:1301.3781, 2013.
[12] Socher R, Pennington J, Huang E H, et al. Semi-supervised recursive autoencoders for predicting sentiment distributions[C]// Proceedings of the Conference on Empirical Methods in Natural Language Processing. 2011:151-161.
[13] Johnson R, Zhang Tong. Effective use of word order for text categorization with convolutional neural networks[J]. arXiv preprint arXiv:1412.1058, 2014.
[14] Lai Siwei, Liu Kang, Xu Liheng, et al. How to generate a good word embedding[J]. IEEE Intelligent Systems, 2016,3(2):1.
[15] Morin F, Bengio Y. Hierarchical probabilistic neural network language model[C]// Proceedings of the 10th International Workshop on Artificial Intelligence and Statistics. 2005:246-252.
[16] Sergienya I, Schütze H. Distributional models and deep learning embeddings: Combining the best of both worlds[J]. arXiv preprint arXiv:1312.5559, 2013.
[17] Bottou L. Stochastic gradient learning in neural networks[C]// Proceedings of Neuro-Nîmes. 1991.
[18] Cohen W W, Schapire R E, Singer Y. Learning to order things[J]. Journal of Artificial Intelligence Research, 1999,10:243-270.
[19] Zhang Ning. Research on semantics-based Chinese text preprocessing[D]. Xi'an: Xidian University, 2011.
[20] Cortes C, Vapnik V. Support-vector networks[J]. Machine Learning, 1995,20(3):273-297.