myContradict: Semi-supervised Contradictory Sentence Generation for Myanmar language
| dc.contributor.author | Ye Kyaw Thu | |
| dc.contributor.author | Ei Myat Nwe | |
| dc.contributor.author | Thura Aung | |
| dc.date.accessioned | 2026-05-08T19:24:23Z | |
| dc.date.issued | 2024-11-11 | |
| dc.description.abstract | In this paper, we propose a semi-supervised, low-resource approach for generating contradictory sentences in the Myanmar language from a self-training perspective. This semi-supervised learning method leverages Bidirectional Long Short-Term Memory (BiLSTM) and Transformer encoder-decoder models. All experiments were conducted with both syllable- and word-level tokens. We also investigated the importance of linguistic features in the Myanmar language for generating contradictory sentences. Our approach improved BiLSTM models both with and without POS-tag features under both tokenization schemes. The best contradictory sentence generation model is the syllable-level Transformer trained with the self-training approach for a single iteration, which achieved a chrF++ score of 0.75. It also achieved ROUGE-L scores of 0.86 Precision, 0.85 Recall, and 0.85 F1, and BERTScore values of 0.94 Precision, 0.94 Recall, and 0.94 F1. | |
| dc.identifier.doi | 10.1109/isai-nlp64410.2024.10799350 | |
| dc.identifier.uri | https://dspace.kmitl.ac.th/handle/123456789/19566 | |
| dc.subject | Topic Modeling | |
| dc.subject | Natural Language Processing Techniques | |
| dc.subject | Hate Speech and Cyberbullying Detection | |
| dc.title | myContradict: Semi-supervised Contradictory Sentence Generation for Myanmar language | |
| dc.type | Article |