1. 李宇明. "一带一路" 需要语言铺路. 中国科技术语, 2015, 17(6): 62.
   LI Y M. "Belt and Road" needs language to pave the way. China Terminology, 2015, 17(6): 62.
2.
3. JOSHI P, SANTY S, BUDHIRAJA A, et al. The state and fate of linguistic diversity and inclusion in the NLP world[C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2020: 6282-6293.
4. LEE E S, THILLAINATHAN S, NAYAK S, et al. Pre-trained multilingual sequence-to-sequence models: a hope for low-resource language translation?[C]//Proceedings of the Findings of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2022: 58-67.
5.
6. YU Z Q, YU Z T, XIAN Y T, et al. Improving Chinese-Vietnamese neural machine translation with linguistic differences. ACM Transactions on Asian and Low-Resource Language Information Processing, 2022, 21(2): 22.
7. BATSUKH B E, BEGZ C, SANJAA B. English-Mongolian, Mongolian-English neural machine translation. Asian Journal of Social Science Studies, 2022, 7(3): 36. doi: 10.20849/ajsss.v7i3.999
8. YU Z Q, HUANG Y X, GUO J J. Improving Thai-Lao neural machine translation with similarity lexicon. Journal of Intelligent & Fuzzy Systems, 2022, 42(4): 4005-4014.
9. HAN B, WU Y, HU G, et al. Lan-Bridge MT's participation in the WMT 2022 general translation shared task[C]//Proceedings of the 7th Conference on Machine Translation. Stroudsburg, USA: Association for Computational Linguistics, 2022: 268-274.
10.
11. SUN Z W, WANG M X, LI L. Multilingual translation via grafting pre-trained language models[C]//Proceedings of the Findings of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2021: 2735-2747.
12. CHEN G H, MA S M, CHEN Y, et al. Towards making the most of cross-lingual transfer for zero-shot neural machine translation[C]//Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2022: 142-157.
13. LIU Y H, GU J T, GOYAL N, et al. Multilingual denoising pre-training for neural machine translation. Transactions of the Association for Computational Linguistics, 2020, 8: 726-742. doi: 10.1162/tacl_a_00343
14. XUE L T, CONSTANT N, ROBERTS A, et al. mT5: a massively multilingual pre-trained text-to-text transformer[C]//Proceedings of the Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Stroudsburg, USA: Association for Computational Linguistics, 2021: 483-498.
15. LI P F, LI L Y, ZHANG M, et al. Universal conditional masked language pre-training for neural machine translation[C]//Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2022: 6379-6391.
16. RANATHUNGA S, LEE E S A, SKENDULI M P, et al. Neural machine translation for low-resource languages: a survey. ACM Computing Surveys, 2023, 55(11): 229.
17. 冯笑, 杨雅婷, 董瑞, 等. 基于回译和集成学习的维汉神经机器翻译方法. 兰州理工大学学报, 2022, 48(5): 99-106.
    FENG X, YANG Y T, DONG R, et al. Uyghur-Chinese neural machine translation method based on back translation and ensemble learning. Journal of Lanzhou University of Technology, 2022, 48(5): 99-106.
18. 宜年, 艾山·吾买尔, 买合木提·买买提, 等. 基于多种数据筛选的维汉神经机器翻译. 厦门大学学报(自然科学版), 2022, 61(4): 660-666.
    YI N, AISHAN WUMAIER, MAIHEMUTI MAIMAITI, et al. Uyghur-Chinese neural machine translation system based on multiple data filtering. Journal of Xiamen University (Natural Science), 2022, 61(4): 660-666.
19. LIU X E, HE J S, LIU M Z, et al. A scenario-generic neural machine translation data augmentation method. Electronics, 2023, 12(10): 2320. doi: 10.3390/electronics12102320
20. PHAM N L, NGUYEN V V, PHAM T V. A data augmentation method for English-Vietnamese neural machine translation. IEEE Access, 2023, 11: 28034-28044.
21. YİRMİBEŞOĞLU Z, GÜNGÖR T. Morphologically motivated input variations and data augmentation in Turkish-English neural machine translation. ACM Transactions on Asian and Low-Resource Language Information Processing, 2023, 22(3): 92.
22. KARYUKIN V, RAKHIMOVA D, KARIBAYEVA A, et al. The neural machine translation models for the low-resource Kazakh-English language pair. PeerJ Computer Science, 2023, 9: e1224.
23. LI B, WENG Y X, XIA F, et al. Towards better Chinese-centric neural machine translation for low-resource languages[EB/OL]. [2023-09-20]. http://arxiv.org/abs/2204.04344.
24. LI B, WENG Y X, SUN B, et al. A multi-tasking and multi-stage Chinese minority pre-trained language model[C]//Proceedings of China Conference on Machine Translation. Berlin, Germany: Springer, 2022: 93-105.
25. VAN H P, LE THANH H. Improving Khmer-Vietnamese machine translation with data augmentation methods[C]//Proceedings of the 11th International Symposium on Information and Communication Technology. New York, USA: ACM Press, 2022: 276-282.
26. NLLB TEAM, COSTA-JUSSÀ M R, CROSS J, et al. No language left behind: scaling human-centered machine translation[EB/OL]. [2023-09-20]. http://arxiv.org/abs/2207.04672.
27.
28. BRIAKOU E, CHERRY C, FOSTER G. Searching for needles in a haystack: on the role of incidental bilingualism in PaLM's translation capability[C]//Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2023: 9432-9452.
29. VILAR D, FREITAG M, CHERRY C, et al. Prompting PaLM for translation: assessing strategies and performance[C]//Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2023: 15406-15427.
30.
31.
32.
33. ZHU W H, LIU H Y, DONG Q X, et al. Multilingual machine translation with large language models: empirical results and analysis[EB/OL]. [2023-09-20]. http://arxiv.org/abs/2304.04675.
34. MU Y Y, REHEMAN A, CAO Z Q, et al. Augmenting large language model translators via translation memories[C]//Proceedings of the Findings of the Association for Computational Linguistics: ACL 2023. Stroudsburg, USA: Association for Computational Linguistics, 2023: 10287-10299.
35.
36. ZHONG Q H, DING L, LIU J H, et al. Can ChatGPT understand too? A comparative study on ChatGPT and fine-tuned BERT[EB/OL]. [2023-09-20]. http://arxiv.org/abs/2302.10198.
37. TAN Z X, ZHANG X W, WANG S, et al. MSP: multi-stage prompting for making pre-trained language models better translators[C]//Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2022: 6131-6142.
38.
39. HUANG X S, CHEN Y B, WU S, et al. Named entity recognition via noise aware training mechanism with data filter[C]//Proceedings of the Findings of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2021: 4791-4803.
40. SENNRICH R, HADDOW B, BIRCH A. Improving neural machine translation models with monolingual data[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, USA: Association for Computational Linguistics, 2016: 86-96.
41.
42. PAPINENI K, ROUKOS S, WARD T, et al. BLEU: a method for automatic evaluation of machine translation[C]//Proceedings of the 40th Annual Meeting on Association for Computational Linguistics. New York, USA: ACM Press, 2002: 311-318.
43. POPOVIĆ M. chrF++: words helping character n-grams[C]//Proceedings of the 2nd Conference on Machine Translation. Stroudsburg, USA: Association for Computational Linguistics, 2017: 612-618.