
Comparative Analysis of Neural Translation Models based on Transformers Architecture

Authors: Alexander Smirnov 1 ; Nikolay Teslya 1 ; Nikolay Shilov 1 ; Diethard Frank 2 ; Elena Minina 2 and Martin Kovacs 2

Affiliations: 1 SPIIRAS, SPC RAS, 14th line 39, St. Petersburg, Russia ; 2 Festo SE & Co. KG, Ruiter Str. 82, Esslingen, Germany

Keyword(s): Machine Translation, DNN Translation, Comparison, Training, Transformers, Fine-tuning.

Abstract: When processing customer feedback for an industrial company, one important task is the classification of customer inquiries. This task becomes difficult when messages may be written in any of a large number of languages. One solution is to detect the language of the text and translate it into a base language for which the classifier is developed. This paper compares open models for automatic text translation. The following models based on the Transformer architecture were selected for comparison: M2M100, mBART, and OPUS-MT (Helsinki-NLP). A test data set containing texts specific to the subject area was formed. Microsoft Azure Translation was chosen as the reference translation. Translations produced by each model were compared with the reference using two metrics: BLEU and METEOR. The possibility of fast fine-tuning of the models to improve translation quality in the problem domain was also investigated. Among the reviewed models, M2M100 produced the best translation quality, but it is also the most difficult to fine-tune.
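The comparison above scores each model's output against a reference translation using BLEU and METEOR. A minimal sketch of how such a sentence-level comparison can be computed with NLTK (the sentences below are illustrative placeholders, not from the paper's data set; METEOR, via nltk.translate.meteor_score, would additionally require the WordNet data and is omitted here):

```python
# Score one candidate translation against one reference translation
# using sentence-level BLEU from NLTK. Smoothing avoids zero scores
# on short segments with missing higher-order n-gram matches.
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

reference = "the valve must be replaced before restarting the unit".split()
candidate = "the valve must be replaced before the unit is restarted".split()

smooth = SmoothingFunction().method1
bleu = sentence_bleu([reference], candidate, smoothing_function=smooth)
print(f"BLEU: {bleu:.3f}")
```

In a full evaluation, this score would be averaged (or computed corpus-level) over every segment of the domain-specific test set, once per model, with the Azure output as the reference.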

CC BY-NC-ND 4.0


Paper citation in several formats:
Smirnov, A.; Teslya, N.; Shilov, N.; Frank, D.; Minina, E. and Kovacs, M. (2022). Comparative Analysis of Neural Translation Models based on Transformers Architecture. In Proceedings of the 24th International Conference on Enterprise Information Systems - Volume 1: ICEIS; ISBN 978-989-758-569-2; ISSN 2184-4992, SciTePress, pages 586-593. DOI: 10.5220/0011083600003179

@conference{iceis22,
author={Alexander Smirnov and Nikolay Teslya and Nikolay Shilov and Diethard Frank and Elena Minina and Martin Kovacs},
title={Comparative Analysis of Neural Translation Models based on Transformers Architecture},
booktitle={Proceedings of the 24th International Conference on Enterprise Information Systems - Volume 1: ICEIS},
year={2022},
pages={586-593},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0011083600003179},
isbn={978-989-758-569-2},
issn={2184-4992},
}

TY - CONF

JO - Proceedings of the 24th International Conference on Enterprise Information Systems - Volume 1: ICEIS
TI - Comparative Analysis of Neural Translation Models based on Transformers Architecture
SN - 978-989-758-569-2
IS - 2184-4992
AU - Smirnov, A.
AU - Teslya, N.
AU - Shilov, N.
AU - Frank, D.
AU - Minina, E.
AU - Kovacs, M.
PY - 2022
SP - 586
EP - 593
DO - 10.5220/0011083600003179
PB - SciTePress
ER -