Effective Approaches to Attention-based Neural Machine Translation

Thang Luong, Hieu Pham, Christopher D. Manning

Published in 2015 at the Conference on Empirical Methods in Natural Language Processing (EMNLP)

ABSTRACT

An attentional mechanism has lately been used to improve neural machine translation (NMT) by selectively focusing on parts of the source sentence during translation. However, there has been little work exploring useful architectures for attention-based NMT. This paper examines two simple and effective classes of attentional mechanism: a global approach which always attends to all source words and a local one that only looks at a subset of source words at a time. We demonstrate the effectiveness of both approaches on the WMT translation tasks between English and German in both directions. With local attention, we achieve a significant gain of 5.0 BLEU points over non-attentional systems that already incorporate known techniques such as dropout. Our ensemble model using different attention architectures yields a new state-of-the-art result in the WMT’15 English to German translation task with 25.9 BLEU points, an improvement of 1.0 BLEU points over the existing best system backed by NMT and an n-gram reranker.
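
To make the global/local contrast concrete, below is a minimal NumPy sketch of the two mechanisms as the paper defines them: the "general" alignment score score(h_t, h_s) = h_t^T W_a h_s, and the local-p variant, which predicts a source position p_t = S · sigmoid(v_p^T tanh(W_p h_t)) and suppresses alignment weights outside a window of half-width D via a Gaussian with sigma = D/2. The dimensions, the value of D, and all variable names are illustrative choices for this sketch, not the authors' implementation.

import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def global_attention(h_t, h_s, W_a):
    # Global attention: attend to ALL S source states h_s (S x d),
    # using the paper's "general" score h_t^T W_a h_s.
    scores = h_s @ (W_a @ h_t)          # (S,) one score per source word
    a_t = softmax(scores)               # alignment weights over all S words
    c_t = a_t @ h_s                     # (d,) context = weighted source average
    return c_t, a_t

def local_attention(h_t, h_s, W_a, W_p, v_p, D=2):
    # Local-p attention: predict a source position p_t, then damp weights
    # far from it with a Gaussian of sigma = D / 2 (the paper additionally
    # truncates to the window [p_t - D, p_t + D]; the Gaussian alone is
    # used here for simplicity).
    S = h_s.shape[0]
    p_t = S / (1.0 + np.exp(-(v_p @ np.tanh(W_p @ h_t))))  # position in [0, S]
    a_t = softmax(h_s @ (W_a @ h_t))
    positions = np.arange(S)
    a_t = a_t * np.exp(-((positions - p_t) ** 2) / (2 * (D / 2) ** 2))
    c_t = a_t @ h_s
    return c_t, a_t

# Toy usage: 6 source states, hidden size 4 (shapes chosen for illustration).
rng = np.random.default_rng(0)
h_s = rng.normal(size=(6, 4))
h_t = rng.normal(size=4)
W_a = rng.normal(size=(4, 4))
W_p = rng.normal(size=(4, 4))
v_p = rng.normal(size=4)
print(global_attention(h_t, h_s, W_a)[1])          # weights over all 6 words
print(local_attention(h_t, h_s, W_a, W_p, v_p)[1]) # weights concentrated near p_t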

LINKED PAPERS

  • Neural Machine Translation of Rare Words with Subword Units (2015): 2 semantic links, 2 concept links, 0 claim links
    • WMT 15 (part of): the WMT English-German translation task is an evaluation task within the WMT 15 benchmark.
    • BLEU (related to): BLEU and BLEU score refer to the same automatic machine-translation evaluation metric for translation quality.

CITED BY

Cited by 8320 papers.