Enhancing Pretrained Multilingual Machine Translation Model with Code-Switching: A Study on Chinese, English and Malay Language

In the field of multilingual machine translation, many pretrained language models have achieved inspiring results. However, results based on pretrained models are not yet satisfactory for low-resource languages. This paper investigates how to leverage code-switching data to fine-tune pr...


Bibliographic Details
Published in: ICCPR 2024 - Proceedings of the 2024 13th International Conference on Computing and Pattern Recognition
Main Authors: Liu H.; Seman N.
Format: Conference paper
Language: English
Published: Association for Computing Machinery, Inc, 2025
Online Access: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85218347057&doi=10.1145%2f3704323.3704346&partnerID=40&md5=da2c58dd104d36641e0865c691fbe6df