[1] 李宇明. "一带一路"需要语言铺路[J]. 中国科技术语, 2015, 17(6):62. LI Y M. "Belt and Road" needs language to pave the way[J]. China Terminology, 2015, 17(6):62.(in Chinese)
[2] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need[EB/OL].[2023-09-20]. http://arxiv.org/abs/1706.03762.
[3] JOSHI P, SANTY S, BUDHIRAJA A, et al. The state and fate of linguistic diversity and inclusion in the NLP world[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA:Association for Computational Linguistics, 2020:6282-6293.
[4] LEE E S, THILLAINATHAN S, NAYAK S, et al. Pre-trained multilingual sequence-to-sequence models:a hope for low-resource language translation?[C]//Proceedings of the Findings of the Association for Computational Linguistics. Stroudsburg, USA:Association for Computational Linguistics, 2022:58-67.
[5] ZENG A H, LIU X, DU Z X, et al. GLM-130B:an open bilingual pre-trained model[EB/OL].[2023-09-20]. http://arxiv.org/abs/2210.02414.
[6] YU Z Q, YU Z T, XIAN Y T, et al. Improving Chinese-Vietnamese neural machine translation with linguistic differences[J]. ACM Transactions on Asian and Low-Resource Language Information Processing, 2022, 21(2):22.
[7] BATSUKH B E, BEGZ C, SANJAA B. English-Mongolian, Mongolian-English neural machine translation[J]. Asian Journal of Social Science Studies, 2022, 7(3):36.
[8] YU Z Q, HUANG Y X, GUO J J. Improving Thai-Lao neural machine translation with similarity lexicon[J]. Journal of Intelligent & Fuzzy Systems, 2022, 42(4):4005-4014.
[9] HAN B, WU Y, HU G, et al. Lan-Bridge MT's participation in the WMT 2022 general translation shared task[C]//Proceedings of the 7th Conference on Machine Translation. Stroudsburg, USA:Association for Computational Linguistics, 2022:268-274.
[10] ZHU J H, XIA Y C, WU L J, et al. Incorporating BERT into neural machine translation[EB/OL].[2023-09-20]. http://arxiv.org/abs/2002.06823.
[11] SUN Z W, WANG M X, LI L. Multilingual translation via grafting pre-trained language models[C]//Proceedings of the Findings of the Association for Computational Linguistics. Stroudsburg, USA:Association for Computational Linguistics, 2021:2735-2747.
[12] CHEN G H, MA S M, CHEN Y, et al. Towards making the most of cross-lingual transfer for zero-shot neural machine translation[C]//Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA:Association for Computational Linguistics, 2022:142-157.
[13] LIU Y H, GU J T, GOYAL N, et al. Multilingual denoising pre-training for neural machine translation[J]. Transactions of the Association for Computational Linguistics, 2020, 8:726-742.
[14] XUE L T, CONSTANT N, ROBERTS A, et al. mT5:a massively multilingual pre-trained text-to-text transformer[C]//Proceedings of the Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies. Stroudsburg, USA:Association for Computational Linguistics, 2021:483-498.
[15] LI P F, LI L Y, ZHANG M, et al. Universal conditional masked language pre-training for neural machine translation[C]//Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA:Association for Computational Linguistics, 2022:6379-6391.
[16] RANATHUNGA S, LEE E S A, SKENDULI M P, et al. Neural machine translation for low-resource languages:a survey[J]. ACM Computing Surveys, 2023, 55(11):229.
[17] 冯笑, 杨雅婷, 董瑞, 等. 基于回译和集成学习的维汉神经机器翻译方法[J]. 兰州理工大学学报, 2022, 48(5):99-106. FENG X, YANG Y T, DONG R, et al. Uyghur-Chinese neural machine translation method based on back translation and ensemble learning[J]. Journal of Lanzhou University of Technology, 2022, 48(5):99-106.(in Chinese)
[18] 宜年, 艾山·吾买尔, 买合木提·买买提, 等. 基于多种数据筛选的维汉神经机器翻译[J]. 厦门大学学报(自然科学版), 2022, 61(4):660-666. YI N, Aishan Wumaier, Maihemuti Maimaiti, et al. Uyghur-Chinese neural machine translation system based on multiple data filtering[J]. Journal of Xiamen University (Natural Science), 2022, 61(4):660-666.(in Chinese)
[19] LIU X E, HE J S, LIU M Z, et al. A scenario-generic neural machine translation data augmentation method[J]. Electronics, 2023, 12(10):2320.
[20] PHAM N L, NGUYEN V V, PHAM T V. A data augmentation method for English-Vietnamese neural machine translation[J]. IEEE Access, 2023, 11:28034-28044.
[21] YİRMİBEŞOĞLU Z, GÜNGÖR T. Morphologically motivated input variations and data augmentation in Turkish-English neural machine translation[J]. ACM Transactions on Asian and Low-Resource Language Information Processing, 2023, 22(3):92.
[22] KARYUKIN V, RAKHIMOVA D, KARIBAYEVA A, et al. The neural machine translation models for the low-resource Kazakh-English language pair[J]. PeerJ Computer Science, 2023, 9:e1224.
[23] LI B, WENG Y X, XIA F, et al. Towards better Chinese-centric neural machine translation for low-resource languages[EB/OL].[2023-09-20]. http://arxiv.org/abs/2204.04344.
[24] LI B, WENG Y X, SUN B, et al. A multi-tasking and multi-stage Chinese minority pre-trained language model[C]//Proceedings of China Conference on Machine Translation. Berlin, Germany:Springer, 2022:93-105.
[25] VAN H P, LE THANH H. Improving Khmer-Vietnamese machine translation with data augmentation methods[C]//Proceedings of the 11th International Symposium on Information and Communication Technology. New York, USA:ACM Press, 2022:276-282.
[26] NLLB TEAM, COSTA-JUSSÀ M R, CROSS J, et al. No language left behind:scaling human-centered machine translation[EB/OL].[2023-09-20]. http://arxiv.org/abs/2207.04672.
[27] WEI J, TAY Y, BOMMASANI R, et al. Emergent abilities of large language models[EB/OL].[2023-09-20]. http://arxiv.org/abs/2206.07682.
[28] BRIAKOU E, CHERRY C, FOSTER G. Searching for needles in a haystack:on the role of incidental bilingualism in PaLM's translation capability[C]//Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA:Association for Computational Linguistics, 2023:9432-9452.
[29] VILAR D, FREITAG M, CHERRY C, et al. Prompting PaLM for translation:assessing strategies and performance[C]//Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA:Association for Computational Linguistics, 2023:15406-15427.
[30] PENG K Q, DING L, ZHONG Q H, et al. Towards making the most of ChatGPT for machine translation[EB/OL].[2023-09-20]. http://arxiv.org/abs/2303.13780.
[31] GAO Y, WANG R L, HOU F. How to design translation prompts for ChatGPT:an empirical study[EB/OL].[2023-09-20]. http://arxiv.org/abs/2304.02182.
[32] MOSLEM Y, HAQUE R, KELLEHER J D, et al. Adaptive machine translation with large language models[EB/OL].[2023-09-20]. http://arxiv.org/abs/2301.13294.
[33] ZHU W H, LIU H Y, DONG Q X, et al. Multilingual machine translation with large language models:empirical results and analysis[EB/OL].[2023-09-20]. http://arxiv.org/abs/2304.04675.
[34] MU Y Y, REHEMAN A, CAO Z Q, et al. Augmenting large language model translators via translation memories[C]//Proceedings of the Findings of the Association for Computational Linguistics:ACL 2023. Stroudsburg, USA:Association for Computational Linguistics, 2023:10287-10299.
[35] WEI J, WANG X Z, SCHUURMANS D, et al. Chain-of-thought prompting elicits reasoning in large language models[EB/OL].[2023-09-20]. http://arxiv.org/abs/2201.11903.
[36] ZHONG Q H, DING L, LIU J H, et al. Can ChatGPT understand too? A comparative study on ChatGPT and fine-tuned BERT[EB/OL].[2023-09-20]. http://arxiv.org/abs/2302.10198.
[37] TAN Z X, ZHANG X W, WANG S, et al. MSP:multi-stage prompting for making pre-trained language models better translators[C]//Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA:Association for Computational Linguistics, 2022:6131-6142.
[38] JIAO W X, WANG W X, HUANG J T, et al. Is ChatGPT a good translator? Yes with GPT-4 as the engine[EB/OL].[2023-09-20]. http://arxiv.org/abs/2301.08745.
[39] HUANG X S, CHEN Y B, WU S, et al. Named entity recognition via noise aware training mechanism with data filter[C]//Proceedings of the Findings of the Association for Computational Linguistics. Stroudsburg, USA:Association for Computational Linguistics, 2021:4791-4803.
[40] SENNRICH R, HADDOW B, BIRCH A. Improving neural machine translation models with monolingual data[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA:Association for Computational Linguistics, 2016:86-96.
[41] HU E J, SHEN Y L, WALLIS P, et al. LoRA:low-rank adaptation of large language models[EB/OL].[2023-09-20]. http://arxiv.org/abs/2106.09685.
[42] PAPINENI K, ROUKOS S, WARD T, et al. BLEU:a method for automatic evaluation of machine translation[C]//Proceedings of the 40th Annual Meeting on Association for Computational Linguistics. New York, USA:ACM Press, 2002:311-318.
[43] POPOVIĆ M. chrF++:words helping character n-grams[C]//Proceedings of the 2nd Conference on Machine Translation. Stroudsburg, USA:Association for Computational Linguistics, 2017:612-618.