Zero-Resource Translation with Multi-Lingual Neural Machine Translation

Orhan Firat, B. Sankaran, Yaser Al-Onaizan, F. Yarman-Vural, Kyunghyun Cho

Published 2016 in the Conference on Empirical Methods in Natural Language Processing (EMNLP)

ABSTRACT

In this paper, we propose a novel finetuning algorithm for the recently introduced multi-way, multilingual neural machine translation model that enables zero-resource machine translation. When used together with novel many-to-one translation strategies, we empirically show that this finetuning algorithm allows the multi-way, multilingual model to translate a zero-resource language pair (1) as well as a single-pair neural translation model trained with up to 1M direct parallel sentences of the same language pair and (2) better than a pivot-based translation strategy, while keeping only one additional copy of attention-related parameters.

