
Low Resource Machine Translation

Chapter in Low Resource Social Media Text Mining

Abstract

We discuss the burgeoning field of unsupervised machine translation, in which words and phrases are translated between languages without any parallel corpora. We survey popular methods and their applications to low-resource settings, investigate how polyglot training can be applied to this field, and present promising new directions for unsupervised machine translation.
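As a concrete illustration of the parallel-corpus-free setting, the sketch below translates individual words by nearest-neighbor search in a shared cross-lingual embedding space. The vocabularies and vectors here are toy values invented for illustration; in practice, the mapping that aligns the two monolingual embedding spaces would be learned (for example, adversarially or from identically spelled anchor words) rather than given.

```python
import numpy as np

# Hypothetical 4-d embeddings, assumed already aligned in a shared space.
# In real unsupervised MT, only monolingual corpora exist and a linear
# map aligning the two spaces must be learned without supervision.
src = {"chien": np.array([0.9, 0.1, 0.0, 0.2]),
       "chat":  np.array([0.1, 0.9, 0.1, 0.0])}
tgt = {"dog":   np.array([0.88, 0.12, 0.05, 0.18]),
       "cat":   np.array([0.12, 0.91, 0.08, 0.02]),
       "tree":  np.array([0.02, 0.10, 0.90, 0.40])}

def cosine(a, b):
    # Cosine similarity between two vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def translate(word):
    # Translate by picking the target word closest in the shared space.
    v = src[word]
    return max(tgt, key=lambda w: cosine(v, tgt[w]))

print(translate("chien"))  # -> dog
print(translate("chat"))   # -> cat
```

Lexicons induced this way can then bootstrap full sentence-level translation, for example by initializing an iterative back-translation loop.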



Author information


Corresponding author

Correspondence to Shriphani Palakodety.


Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this chapter


Cite this chapter

Palakodety, S., KhudaBukhsh, A.R., Jayachandran, G. (2021). Low Resource Machine Translation. In: Low Resource Social Media Text Mining. SpringerBriefs in Computer Science. Springer, Singapore. https://doi.org/10.1007/978-981-16-5625-5_5


  • DOI: https://doi.org/10.1007/978-981-16-5625-5_5

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-16-5624-8

  • Online ISBN: 978-981-16-5625-5

  • eBook Packages: Computer Science, Computer Science (R0)
