Effective Approaches to Attention-based Neural Machine Translation
Thang Luong, Hieu Pham, Christopher D. Manning
Published 2015 in Conference on Empirical Methods in Natural Language Processing
ABSTRACT
An attentional mechanism has lately been used to improve neural machine translation (NMT) by selectively focusing on parts of the source sentence during translation. However, there has been little work exploring useful architectures for attention-based NMT. This paper examines two simple and effective classes of attentional mechanism: a global approach which always attends to all source words and a local one that only looks at a subset of source words at a time. We demonstrate the effectiveness of both approaches on the WMT translation tasks between English and German in both directions. With local attention, we achieve a significant gain of 5.0 BLEU points over non-attentional systems that already incorporate known techniques such as dropout. Our ensemble model using different attention architectures yields a new state-of-the-art result in the WMT'15 English-to-German translation task with 25.9 BLEU points, an improvement of 1.0 BLEU points over the existing best system backed by NMT and an n-gram reranker.
PUBLICATION RECORD
- Publication year
2015
- Venue
Conference on Empirical Methods in Natural Language Processing
- Publication date
2015-08-17
- Fields of study
Linguistics, Computer Science
- Source metadata
Semantic Scholar
CONCEPTS
- BLEU score
An automatic metric used here to evaluate translation quality.
Aliases: BLEU
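As a rough illustration of how the metric works, here is a minimal sentence-level sketch (geometric mean of modified n-gram precisions times a brevity penalty); the paper's evaluation uses corpus-level BLEU with standardized tokenization, and the function name and 1e-9 floor (a stand-in for proper smoothing of zero-match n-gram orders) are illustrative:

```python
from collections import Counter
import math

def sentence_bleu(candidate, reference, max_n=4):
    """Simplified sentence-level BLEU over token lists."""
    log_p = 0.0
    for n in range(1, max_n + 1):
        # Modified n-gram precision: candidate counts clipped by reference counts.
        cand = Counter(tuple(candidate[i:i + n]) for i in range(len(candidate) - n + 1))
        ref = Counter(tuple(reference[i:i + n]) for i in range(len(reference) - n + 1))
        clipped = sum(min(count, ref[g]) for g, count in cand.items())
        log_p += math.log(max(clipped, 1e-9) / max(sum(cand.values()), 1))
    # Brevity penalty: penalize candidates shorter than the reference.
    bp = min(1.0, math.exp(1 - len(reference) / max(len(candidate), 1)))
    return bp * math.exp(log_p / max_n)
```

For instance, a candidate identical to its reference scores 1.0, while a shorter candidate is discounted by the brevity penalty even when all its n-grams match.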
- dropout
A regularization technique used in the baseline translation systems described in the abstract.
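For concreteness, a minimal numpy sketch of inverted dropout, the common formulation of the technique; nothing here is specific to the paper's training setup:

```python
import numpy as np

def dropout(x, p=0.5, training=True):
    # Inverted dropout: zero each activation with probability p during training
    # and rescale the survivors by 1/(1-p), so inference needs no extra scaling.
    if not training or p == 0.0:
        return x
    mask = (np.random.rand(*x.shape) >= p) / (1.0 - p)
    return x * mask
```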
- ensemble model
A combined model that uses multiple attention architectures together for translation.
Aliases: ensemble
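A sketch of one common way to combine such models at decoding time, by averaging their per-step output distributions over the target vocabulary; the `models` list and its `next_word_probs` method are hypothetical stand-ins, not an API from the paper:

```python
import numpy as np

def ensemble_next_word_probs(models, state):
    # Each member model proposes a distribution over the target vocabulary for
    # the next word; the ensemble prediction is their elementwise average.
    probs = np.stack([m.next_word_probs(state) for m in models])
    return probs.mean(axis=0)
```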
- global attention
An attentional mechanism that always attends to all source words during translation.
Aliases: global attentional mechanism, global approach
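A minimal numpy sketch of the idea, using a simple dot-product alignment score (the paper also explores other score functions); the helper names are illustrative, not the authors' code:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def global_attention(h_t, source_states):
    # h_t: (d,) decoder hidden state; source_states: (src_len, d) encoder states.
    scores = source_states @ h_t      # dot-score alignment for EVERY source position
    a_t = softmax(scores)             # attention weights over all source words
    c_t = a_t @ source_states         # context vector: weighted average of sources
    return c_t, a_t
```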
- local attention
An attentional mechanism that focuses on a subset of source words at each translation step.
Aliases: local attentional mechanism, local approach
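A companion numpy sketch, assuming the aligned position p_t is already given (the paper's predictive variant learns it with a small network) and a window of half-width D with Gaussian weighting of standard deviation D/2, as the paper describes:

```python
import numpy as np

def local_attention(h_t, source_states, p_t, D=10):
    # Attend only within the window [p_t - D, p_t + D], clipped to the sentence.
    S = source_states.shape[0]
    lo, hi = max(0, int(round(p_t)) - D), min(S, int(round(p_t)) + D + 1)
    window = source_states[lo:hi]
    scores = window @ h_t
    a_t = np.exp(scores - scores.max())
    a_t /= a_t.sum()                  # softmax over the window only
    # Gaussian centered at p_t favors positions near the predicted alignment;
    # following the paper, the product is not renormalized.
    positions = np.arange(lo, hi)
    a_t = a_t * np.exp(-((positions - p_t) ** 2) / (2 * (D / 2.0) ** 2))
    c_t = a_t @ window
    return c_t, a_t
```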
- n-gram reranker
An existing reranking component used in the previously best system cited for comparison.
- non-attentional system
A neural machine translation baseline that does not use an attention mechanism but includes other known techniques.
Aliases: non-attentional systems
- WMT English-German translation task
The machine translation evaluation setting involving translation between English and German in the WMT benchmark.
Aliases: WMT'15 English-to-German translation task, English-German translation task, WMT translation tasks between English and German