Enhancing Pretrained Multilingual Machine Translation Model with Code-Switching: A Study on Chinese, English and Malay Language
In the field of multilingual machine translation, many pretrained language models have achieved inspiring results. However, results from pretrained models are not yet satisfactory for low-resource languages. This paper investigates how to leverage code-switching data to fine-tune pr...
Published in: ICCPR 2024 - Proceedings of the 2024 13th International Conference on Computing and Pattern Recognition
First author:
Format: Conference paper
Language: English
Publication details: Association for Computing Machinery, Inc, 2025
Online access: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85218347057&doi=10.1145%2f3704323.3704346&partnerID=40&md5=da2c58dd104d36641e0865c691fbe6df