[1]李沐,刘树杰,张冬冬,等. 机器翻译[M]. 北京:高等教育出版社, 2018.
[2]NAGAO M. A framework of a mechanical translation between Japanese and English by analogy principle[C]// Proceedings of the International NATO Symposium on Artificial and Human Intelligence. 1984:173-180.
[3]BROWN P F, COCKE J, DELLA PIETRA S A, et al. A statistical approach to machine translation[J]. Computational Linguistics, 1990,16(2):79-85.
[4]BROWN P F, DELLA PIETRA S A, DELLA PIETRA V J, et al. The mathematics of statistical machine translation: Parameter estimation[J]. Computational Linguistics, 1993,19(2):263-311.
[5]KOEHN P, OCH F J, MARCU D. Statistical phrase-based translation[C]// Proceedings of the 2003 Conference of the North American Chapter of the Association for Computational Linguistics on Human Language Technology. 2003:48-54.
[6]BAHDANAU D, CHO K, BENGIO Y. Neural machine translation by jointly learning to align and translate[J]. arXiv preprint arXiv:1409.0473, 2014.
[7]SUTSKEVER I, VINYALS O, LE Q V. Sequence to sequence learning with neural networks[C]// Advances in Neural Information Processing Systems. 2014:3104-3112.
[8]冯洋,邵晨泽. 神经机器翻译前沿综述[J]. 中文信息学报, 2020,34(7):1-18.
[9]李亚超,熊德意,张民. 神经机器翻译综述[J]. 计算机学报, 2018,41(12):2734-2755.
[10]肖桐,朱靖波. 机器翻译: 基础与模型[M]. 北京:电子工业出版社, 2021.
[11]GARCIA-MARTINEZ M, BARRAULT L, BOUGARES F. Factored neural machine translation architectures[C]// International Workshop on Spoken Language Translation (IWSLT’16). 2016.
[12]JEAN S, CHO K, MEMISEVIC R, et al. On using very large target vocabulary for neural machine translation[J]. arXiv preprint arXiv:1412.2007, 2014.
[13]SENNRICH R, HADDOW B, BIRCH A. Neural machine translation of rare words with subword units[C]// Proceedings of ACL. 2016. DOI: 10.48550/arXiv.1508.07909.
[14]KUDO T. Subword regularization: Improving neural network translation models with multiple subword candidates[J]. arXiv preprint arXiv:1804.10959, 2018.
[15]GAGE P. A new algorithm for data compression[J]. C Users Journal, 1994,12(2):23-38.
[16]WANG C H, CHO K Y, GU J T. Neural machine translation with byte-level subwords[C]// Proceedings of the AAAI Conference on Artificial Intelligence. 2020:9154-9160.
[17]SCHUSTER M, NAKAJIMA K. Japanese and Korean voice search[C]// 2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). 2012:5149-5152.
[18]KUDO T. Subword regularization: Improving neural network translation models with multiple subword candidates[C]// Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. 2018. DOI: 10.18653/v1/P18-1007.
[19]薛明亚,余正涛,文永华,等. 融合EMD最小化双语词典的汉—越无监督神经机器翻译[J]. 中文信息学报, 2021,35(3):43-50.
[20]XU J J, ZHOU H, GAN C, et al. Vocabulary learning via optimal transport for neural machine translation[C]// Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing. 2021. DOI: 10.18653/v1/2021.acl-long.571.
[21]沙九,冯冲,张天夫,等. 多策略切分粒度的藏汉双向神经机器翻译研究[J]. 厦门大学学报(自然科学版), 2020,59(2):213-219.
[22]WEI J, ZOU K. EDA: Easy data augmentation techniques for boosting performance on text classification tasks[J]. arXiv preprint arXiv:1901.11196, 2019.
[23]LUO R X, XU J J, ZHANG Y, et al. PKUSEG: A toolkit for multi-domain chinese word segmentation[J]. arXiv preprint arXiv:1906.11455, 2019.
[24]李亚超,江静,加羊吉,等. TIP-LAS: 一个开源的藏文分词词性标注系统[J]. 中文信息学报, 2015,29(6):203-207.
[25]VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[C]// Proceedings of the 31st Conference on Neural Information Processing Systems. 2017:6000-6010.
[26]WISEMAN S, RUSH A M. Sequence-to-sequence learning as beam-search optimization[C]// Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing. 2016. DOI: 10.18653/v1/D16-1137.
[27]KLEIN G, KIM Y, DENG Y, et al. OpenNMT: Open-source toolkit for neural machine translation[C]// Proceedings of ACL 2017, System Demonstrations. 2017:67-72.
[28]PAPINENI K, ROUKOS S, WARD T, et al. BLEU: A method for automatic evaluation of machine translation[C]// Proceedings of ACL. 2002:311-318.