myContradict: Semi-supervised Contradictory Sentence Generation for Myanmar language
Abstract
In this paper, we propose a semi-supervised, low-resource approach for generating contradictory sentences in the Myanmar language from a self-training perspective. The method leverages Bidirectional Long Short-Term Memory (BiLSTM) and Transformer encoder-decoder models. All experiments were conducted with both syllable- and word-level tokens. We also investigated the importance of Myanmar linguistic features for generating contradictory sentences. Our approach improved the BiLSTM models both with and without POS-tag features under both tokenization schemes. The best contradictory sentence generation model is the syllable-level Transformer trained with a single iteration of self-training, which achieved a chrF++ score of 0.75, ROUGE-L scores of 0.86 Precision, 0.85 Recall, and 0.85 F1, and BERTScores of 0.94 Precision, 0.94 Recall, and 0.94 F1.
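The self-training procedure referred to above can be sketched as follows. This is a minimal illustration only: the `train`, `generate`, and `confidence` functions are toy stand-ins (assumptions for illustration), not the paper's BiLSTM or Transformer models, and the confidence threshold is hypothetical.

```python
def train(labeled_pairs):
    # Toy stand-in "model": memorize source -> contradiction mappings.
    return dict(labeled_pairs)

def generate(model, sentence):
    # Fall back to a trivial negation marker for unseen inputs.
    return model.get(sentence, sentence + " [NEG]")

def confidence(model, sentence):
    # Toy confidence: high for memorized inputs, low otherwise.
    return 1.0 if sentence in model else 0.5

def self_train(labeled_pairs, unlabeled, iterations=1, threshold=0.9):
    """Rounds of pseudo-labeling; the paper's best model used one iteration."""
    data = list(labeled_pairs)
    for _ in range(iterations):
        model = train(data)
        # Pseudo-label unlabeled sentences; keep only confident outputs.
        for src in unlabeled:
            if confidence(model, src) >= threshold:
                data.append((src, generate(model, src)))
    return train(data)

labeled = [("the sky is blue", "the sky is not blue")]
model = self_train(labeled, ["the sky is blue", "grass is green"])
print(generate(model, "the sky is blue"))
```

The key design choice, as in the paper, is that pseudo-labeled pairs from one round are folded back into the training data before retraining.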