Enhancing Neural Arabic Machine Translation using Character-Level CNN-BiLSTM and Hybrid Attention
Published 2024 in Engineering, Technology & Applied Science Research

ABSTRACT
Neural Machine Translation (NMT) has made significant strides in recent years, especially with the advent of deep learning, which has greatly enhanced performance across various Natural Language Processing (NLP) tasks. Despite these advances, NMT still falls short of perfect translation, facing ongoing challenges such as limited training data, handling rare words, and managing syntactic and semantic dependencies. This study introduces a multichannel character-level NMT model with hybrid attention for Arabic-English translation. The proposed approach addresses issues such as rare words and word alignment by encoding characters, incorporating Arabic word segmentation as handcrafted features, and using part-of-speech tagging in a multichannel CNN-BiLSTM encoder. The model then uses a BiLSTM decoder with hybrid attention to generate target-language sentences. The proposed model was tested on a subset of the OPUS-100 dataset, achieving promising results.
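The abstract names a "hybrid attention" mechanism over the encoder states without spelling out its form here. A minimal NumPy sketch, assuming "hybrid" means blending content-based (dot-product) scores with additive (Bahdanau-style) scores before the softmax — an assumption, since the paper's exact formulation may differ — could look like:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def hybrid_attention(s, H, W, v):
    """Blend dot-product and additive attention over encoder states.

    s : (d,)   current decoder hidden state
    H : (T, d) encoder hidden states, one row per source position
    W : (2d, d), v : (d,)  learned additive-attention parameters
    (All names here are illustrative, not from the paper.)
    """
    dot_scores = H @ s                                   # content-based scores, (T,)
    # Additive scores: v^T tanh(W [h_i; s]) for each encoder position i
    s_tiled = np.tile(s, (H.shape[0], 1))                # (T, d)
    add_scores = np.tanh(np.concatenate([H, s_tiled], axis=1) @ W) @ v  # (T,)
    weights = softmax(dot_scores + add_scores)           # attention distribution, (T,)
    context = weights @ H                                # context vector, (d,)
    return weights, context

# Toy usage with random parameters: 5 source positions, hidden size 8
rng = np.random.default_rng(0)
T, d = 5, 8
H = rng.standard_normal((T, d))
s = rng.standard_normal(d)
W = rng.standard_normal((2 * d, d))
v = rng.standard_normal(d)
weights, context = hybrid_attention(s, H, W, v)
```

In a full model the context vector would be concatenated with the decoder state to predict the next target character; this sketch only shows how the two score types can be combined into one attention distribution.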
PUBLICATION RECORD
- Publication year: 2024
- Publication date: 2024-10-09
- Venue: Engineering, Technology & Applied Science Research
- Fields of study: not labeled
- Source metadata: Semantic Scholar
REFERENCES
9 references (list not included in this record)
CITED BY
6 citing papers (list not included in this record)