Enhancing Pretrained Multilingual Machine Translation Model with Code-Switching: A Study on Chinese, English and Malay Language

In the field of multilingual machine translation, many pretrained language models have achieved inspiring results. However, results based on pretrained models remain unsatisfactory for low-resource languages. This paper investigates how to leverage code-switching data to fine-tune pr...


Bibliographic Details
Published in: ICCPR 2024 - Proceedings of the 2024 13th International Conference on Computing and Pattern Recognition
Main Authors: Liu H.; Seman N.
Format: Conference paper
Language: English
Publisher: Association for Computing Machinery, Inc 2025
Online Access: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85218347057&doi=10.1145%2f3704323.3704346&partnerID=40&md5=da2c58dd104d36641e0865c691fbe6df
