Automatic post-editing (APE) aims to improve machine translations, thereby reducing human post-editing effort. Training of APE models has made great progress since 2015; however, whether APE models really perform well on domain samples remains an open question, and achieving this is still a hard task. This paper provides a mobile-domain APE corpus with 50.1 TER/37.4 BLEU for the En-Zh language pair. This corpus is much more practical than those provided in the WMT 2021 APE tasks (18.05 TER/71.07 BLEU for En-De, 22.73 TER/69.2 BLEU for En-Zh). To obtain a more comprehensive investigation of the presented corpus, this paper provides two mainstream models as baselines: (1) an Autoregressive Translation APE model (AR-APE) based on HW-TSC APE 2020, which is the SOTA model of the WMT 2020 APE task, and (2) a Non-Autoregressive Translation APE model (NAR-APE) based on the well-known Levenshtein Transformer. Experiments show that both mainstream AR and NAR models can effectively improve the quality of APE. The corpus has been released in the CCMT 2022 APE evaluation task, and the baseline models will be open-sourced.

References

Papineni, K., Roukos, S., Ward, T., Zhu, W.-J.: BLEU: a method for automatic evaluation of machine translation. In: Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics, Philadelphia, Pennsylvania, USA. Association for Computational Linguistics, July 2002

Snover, M., Dorr, B.J., Schwartz, R., Micciulla, L.: A study of translation edit rate with targeted human annotation (2006)

Vaswani, A., et al.: Attention is all you need. In: Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing Systems 2017, Long Beach, CA, USA, 4–9 December 2017, pp. 5998–6008 (2017)

Devlin, J., Chang, M.-W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. In: Proceedings of NAACL-HLT 2019, Minneapolis, MN, USA, 2–7 June 2019, Volume 1 (Long and Short Papers) (2019)

Gu, J., Bradbury, J., Xiong, C., Li, V.O.K., Socher, R.: Non-autoregressive neural machine translation. In: International Conference on Learning Representations (2018)

Gu, J., Wang, C., Zhao, J.: Levenshtein transformer. In: Advances in Neural Information Processing Systems, vol. 32 (2019)

Lopes, A.V., Farajian, M.A., Correia, G.M., Trénous, J., Martins, A.F.: Unbabel's submission to the WMT2019 APE shared task: BERT-based encoder-decoder for automatic post-editing. Association for Computational Linguistics (2019)

Junczys-Dowmunt, M.: Are we experiencing the golden age of automatic post-editing? In: Proceedings of the AMTA 2018 Workshop on Translation Quality Estimation and Automatic Post-Editing, Boston, MA. Association for Machine Translation in the Americas, March 2018

Bojar, O., et al.: Findings of the 2015 workshop on statistical machine translation. In: Proceedings of the Tenth Workshop on Statistical Machine Translation, Lisbon, Portugal. Association for Computational Linguistics, September 2015

Chatterjee, R., Freitag, M., Negri, M., Turchi, M.: Findings of the WMT 2020 shared task on automatic post-editing. In: Proceedings of the Fifth Conference on Machine Translation. Association for Computational Linguistics, November 2020 (Online)

Chatterjee, R., Federmann, C., Negri, M., Turchi, M.: Findings of the WMT 2020 shared task on automatic post-editing. In: Proceedings of the Fifth Conference on Machine Translation, 19–20 November 2020. Association for Computational Linguistics (2020, Online)

Yang, H., et al.: HW-TSC's participation at WMT 2020 automatic post editing shared task. In: Proceedings of the Fifth Conference on Machine Translation. Association for Computational Linguistics, November 2020 (Online)

Akhbardeh, F., et al.: Findings of the 2021 conference on machine translation (WMT21). In: Proceedings of the Sixth Conference on Machine Translation, pp. 1–88. Association for Computational Linguistics, November 2021 (Online)
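The TER figures quoted for the corpus (e.g. 50.1 TER for En-Zh) are built on word-level edit distance, which is also the edit-operation view of translation that the Levenshtein Transformer models. A minimal sketch in plain Python of a simplified TER-style score, assuming whitespace tokenization; note that real TER additionally allows block shifts, which this sketch omits, and the example sentences `mt`/`pe` are invented for illustration:

```python
def edit_distance(hyp, ref):
    """Word-level Levenshtein distance between two token lists."""
    m, n = len(hyp), len(ref)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i  # delete all remaining hypothesis tokens
    for j in range(n + 1):
        dp[0][j] = j  # insert all remaining reference tokens
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if hyp[i - 1] == ref[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution/match
    return dp[m][n]


def ter_like(hyp_sent, ref_sent):
    """Simplified TER: edits divided by reference length (no block shifts)."""
    hyp, ref = hyp_sent.split(), ref_sent.split()
    return edit_distance(hyp, ref) / max(len(ref), 1)


# Hypothetical MT output vs. its human post-edit: one substitution over
# five reference words gives a score of 0.2 (i.e. 20 in TER-percentage terms).
mt = "the phone screen are broken"
pe = "the phone screen is broken"
print(ter_like(mt, pe))  # → 0.2
```

A lower score means the MT output needs fewer edits to match the post-edit, which is why the 50.1 TER corpus leaves APE models far more room to act than the WMT 2021 data at 18.05–22.73 TER.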