
Authors: Andargachew Mekonnen Gezmu and Andreas Nürnberger

Affiliation: Faculty of Computer Science, Otto von Guericke Universität Magdeburg, Universitätsplatz 2, Magdeburg, Germany

Keyword(s): Neural Machine Translation, Transformer, Less-resourced Language, Polysynthetic Language.

Abstract: Recent advances have made neural machine translation the state of the art. However, while it has improved significantly for a few high-resource languages, its performance remains low for less-resourced languages, since the amount of training data strongly affects the quality of machine translation models. Identifying a neural machine translation architecture that trains the best models under low-data conditions is therefore essential for less-resourced languages. This research modified Transformer-based neural machine translation architectures for low-resource polysynthetic languages. The proposed system outperformed a strong baseline in the automatic evaluation of experiments on public benchmark datasets.
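The paper itself details the architectural modifications; as a hedged illustration only (the exact hyperparameters below are assumptions, not taken from the paper), a common way to adapt the Transformer to low-data conditions is to shrink its capacity and raise regularization, as in this PyTorch sketch:

```python
import torch
import torch.nn as nn

# Hypothetical configuration: a scaled-down Transformer for low-resource
# training. All values here (d_model=256, 3 layers, dropout=0.3) are
# illustrative assumptions, not the settings reported in the paper.
small_transformer = nn.Transformer(
    d_model=256,          # smaller than the base model's 512
    nhead=4,              # fewer attention heads
    num_encoder_layers=3, # shallower encoder
    num_decoder_layers=3, # shallower decoder
    dim_feedforward=1024,
    dropout=0.3,          # heavier dropout to curb overfitting on small data
    batch_first=True,
)

# Dummy embedded source/target sequences: (batch, length, d_model)
src = torch.rand(2, 10, 256)
tgt = torch.rand(2, 7, 256)
out = small_transformer(src, tgt)
print(out.shape)  # decoder output keeps the target shape: (2, 7, 256)
```

A smaller, more regularized model of this kind is one standard response to limited parallel data, since a full-size Transformer tends to overfit on small corpora.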

CC BY-NC-ND 4.0


Paper citation in several formats:
Gezmu, A. and Nürnberger, A. (2022). Transformers for Low-resource Neural Machine Translation. In Proceedings of the 14th International Conference on Agents and Artificial Intelligence - Volume 1: NLPinAI; ISBN 978-989-758-547-0; ISSN 2184-433X, SciTePress, pages 459-466. DOI: 10.5220/0010971500003116

@conference{nlpinai22,
author={Andargachew Mekonnen Gezmu and Andreas Nürnberger},
title={Transformers for Low-resource Neural Machine Translation},
booktitle={Proceedings of the 14th International Conference on Agents and Artificial Intelligence - Volume 1: NLPinAI},
year={2022},
pages={459-466},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0010971500003116},
isbn={978-989-758-547-0},
issn={2184-433X},
}

TY - CONF

JO - Proceedings of the 14th International Conference on Agents and Artificial Intelligence - Volume 1: NLPinAI
TI - Transformers for Low-resource Neural Machine Translation
SN - 978-989-758-547-0
IS - 2184-433X
AU - Gezmu, A.
AU - Nürnberger, A.
PY - 2022
SP - 459
EP - 466
DO - 10.5220/0010971500003116
PB - SciTePress
ER -