A Study of Levenshtein Transformer and Editor Transformer Models for Under-Resourced Languages

dc.contributor.author: Mya Ei San
dc.contributor.author: Ye Kyaw Thu
dc.contributor.author: Zar Zar Hlaing
dc.contributor.author: Hlaing Myat Nwe
dc.contributor.author: Thepchai Supnithi
dc.contributor.author: Sasiporn Usanavasin
dc.date.accessioned: 2026-05-08T19:21:55Z
dc.date.issued: 2021-12-21
dc.description.abstract: Transformers are the current state-of-the-art type of neural network model for dealing with sequences. Their most prominent application is in text processing tasks, above all machine translation. Recently, transformer-based models such as the Edit-Based Transformer with Repositioning (EDITOR) and the Levenshtein Transformer (LevT) have become popular in neural machine translation. To the best of our knowledge, no experiments with these two models on under-resourced languages have been reported. In this paper, we compared the performance and decoding time of the EDITOR model and the LevT model. We conducted experiments on under-resourced language pairs, namely Thai-English, Thai-Myanmar, and English-Myanmar, in both directions. The experimental results showed that the EDITOR model outperforms the LevT model on the English-Thai, Thai-English, and English-Myanmar language pairs, whereas LevT achieves better scores than EDITOR on the Thai-Myanmar, Myanmar-Thai, and Myanmar-English language pairs. Regarding decoding time, the EDITOR model is generally faster than the LevT model across the language pairs, although for the English-Myanmar and Myanmar-English pairs its decoding is slightly slower than the LevT model's. Finally, we investigated the system-level performance of both models by means of compare-mt and word error rate (WER).
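The abstract evaluates both models with word error rate (WER), and the LevT model itself is built around Levenshtein edit operations (insertions, deletions, substitutions). As an illustration only (not the authors' evaluation code), WER is the word-level Levenshtein distance between a hypothesis and a reference, normalized by the reference length; a minimal sketch:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length.

    Illustrative sketch, not the paper's implementation; tools such as
    compare-mt compute this (and richer diagnostics) in practice.
    """
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming table: d[i][j] = edit distance between the
    # first i reference words and the first j hypothesis words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # delete all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # insert all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,        # deletion
                d[i][j - 1] + 1,        # insertion
                d[i - 1][j - 1] + cost,  # substitution or match
            )
    return d[len(ref)][len(hyp)] / len(ref)
```

For example, `wer("a b c", "a x c")` is 1/3: one substitution against a three-word reference.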
dc.identifier.doi: 10.1109/isai-nlp54397.2021.9678159
dc.identifier.uri: https://dspace.kmitl.ac.th/handle/123456789/18284
dc.subject: Natural Language Processing Techniques
dc.subject: Topic Modeling
dc.subject: Handwritten Text Recognition Techniques
dc.title: A Study of Levenshtein Transformer and Editor Transformer Models for Under-Resourced Languages
dc.type: Article