Multilingual Neural Machine Translation for Asian Language Treebank
Published 2026 in Journal of Technical Education Science
ABSTRACT
This study examines multilingual neural machine translation (MNMT) for a diverse group of low-resource Asian languages (Bengali, Filipino, Indonesian, Japanese, Khmer, Malay, and Vietnamese) that differ substantially in language family, writing system, and typology. It evaluates state-of-the-art MNMT systems and introduces a Compact & Language-Sensitive MNMT model designed to improve translation quality while reducing computational cost. The proposed approach shares parameters through a compact multilingual representation and enhances language discrimination with language-sensitive embeddings, a language-sensitive discriminator, and an adaptive cross-attention mechanism that selects attention parameters according to the language pair. Combined with a multi-stage fine-tuning strategy, the model strengthens cross-lingual transfer while maintaining robust language-specific representations. Experiments on the ALT multi-parallel corpus and the KFTT English-Japanese dataset show that multilingual models significantly outperform single-language NMT baselines. Despite its smaller size, the proposed Compact & Language-Sensitive MNMT achieves competitive or superior BLEU scores compared with Google's MNMT, confirming the effectiveness of guided parameter sharing and language-sensitive training. These results highlight the value of compact multilingual architectures and multi-parallel datasets for advancing low-resource Asian machine translation.
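The adaptive cross-attention described in the abstract selects attention parameters per language pair. A minimal sketch of that idea is below; the class name, the per-pair parameter dictionary, and the shared fallback are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class LanguageSensitiveAttention:
    """Hypothetical sketch: cross-attention whose Q/K/V projections are
    looked up by language pair, falling back to a shared parameter set."""

    def __init__(self, d_model, pairs, seed=0):
        rng = np.random.default_rng(seed)
        # one (W_q, W_k, W_v) triple per language pair, plus a shared fallback
        self.params = {
            p: [rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                for _ in range(3)]
            for p in list(pairs) + ["shared"]
        }

    def __call__(self, query, memory, pair):
        # select language-pair-specific projections if available
        W_q, W_k, W_v = self.params.get(pair, self.params["shared"])
        q, k, v = query @ W_q, memory @ W_k, memory @ W_v
        scores = softmax(q @ k.T / np.sqrt(q.shape[-1]))  # scaled dot-product
        return scores @ v  # one context vector per target position

attn = LanguageSensitiveAttention(8, ["en-ja", "en-vi"])
tgt_states = np.zeros((3, 8))   # 3 decoder positions
src_states = np.ones((5, 8))    # 5 encoder positions
context = attn(tgt_states, src_states, "en-ja")  # shape (3, 8)
```

An unseen pair (e.g. "en-km" here) simply routes through the shared fallback parameters, which is one plausible way such a mechanism could balance parameter sharing against language-specific modeling.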
PUBLICATION RECORD
- Publication year
2026
- Venue
Journal of Technical Education Science
- Publication date
2026-02-28
- Source metadata
Semantic Scholar