Semi-Supervised Seq2seq Joint-Stochastic-Approximation Autoencoders With Applications to Semantic Parsing

Yunfu Song, Zhijian Ou

Published 2020 in IEEE Signal Processing Letters

ABSTRACT

Developing Semi-Supervised Seq2Seq (<inline-formula><tex-math notation="LaTeX">$S^4$</tex-math></inline-formula>) learning for sequence transduction tasks in natural language processing (NLP), e.g. semantic parsing, is challenging, since both the input and the output sequences are discrete. This discrete nature poses difficulties for methods that need gradients either from the input space or from the output space. Recently, a new learning method called joint stochastic approximation was developed for unsupervised learning of fixed-dimensional autoencoders; it theoretically avoids propagating gradients through discrete latent variables, a problem that afflicts Variational Auto-Encoders (VAEs). In this letter, we propose seq2seq Joint-stochastic-approximation Auto-Encoders (JAEs) and apply them to <inline-formula><tex-math notation="LaTeX">$S^4$</tex-math></inline-formula> learning for NLP sequence transduction tasks. Further, we propose bi-directional JAEs (called bi-JAEs) to leverage not only unpaired input sequences (the most commonly studied setting) but also unpaired output sequences. Experiments on two benchmark datasets for semantic parsing show that JAEs consistently outperform VAEs in <inline-formula><tex-math notation="LaTeX">$S^4$</tex-math></inline-formula> learning and that bi-JAEs yield further improvements.
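The key mechanism the abstract alludes to can be illustrated on a toy discrete-latent autoencoder. The sketch below is not the paper's seq2seq architecture: it uses simple categorical logit tables for the prior, decoder, and inference model, and all names are illustrative. It shows the joint-stochastic-approximation idea of drawing a concrete latent sample via a Metropolis independence sampler (with the inference model as proposal) and then taking ordinary gradient steps at that sample, so no gradient ever needs to flow through the discrete sampling operation, in contrast to VAE-style training.

```python
import numpy as np

rng = np.random.default_rng(0)
K, V = 3, 5  # number of latent categories, observation vocabulary size

# Illustrative parameters (logit tables, not neural networks):
theta_z = np.zeros(K)       # prior p(z) logits
theta_x = np.zeros((K, V))  # decoder p(x|z) logits, one row per latent
phi = np.zeros((V, K))      # inference model q(z|x) logits, one row per x

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

def log_joint(x, z):
    """log p(z) + log p(x|z) at a concrete (x, z) pair."""
    return np.log(softmax(theta_z)[z]) + np.log(softmax(theta_x[z])[x])

def log_q(x, z):
    """log q(z|x) at a concrete (x, z) pair."""
    return np.log(softmax(phi[x])[z])

# Toy unlabeled data and a persistent Markov-chain latent state per input.
data = rng.integers(0, V, size=200)
z_cache = {int(x): 0 for x in data}

lr = 0.1
for step in range(500):
    x = int(rng.choice(data))
    z_old = z_cache[x]
    # Metropolis independence sampler: propose from q(z|x),
    # accept using importance weights w(z) = p(x,z) / q(z|x).
    z_prop = int(rng.choice(K, p=softmax(phi[x])))
    log_w_new = log_joint(x, z_prop) - log_q(x, z_prop)
    log_w_old = log_joint(x, z_old) - log_q(x, z_old)
    z = z_prop if np.log(rng.random()) < log_w_new - log_w_old else z_old
    z_cache[x] = z
    # Gradient steps evaluated at the sampled z (analytic log-softmax
    # gradients); the discrete sample itself is never differentiated.
    theta_z += lr * (np.eye(K)[z] - softmax(theta_z))
    theta_x[z] += lr * (np.eye(V)[x] - softmax(theta_x[z]))
    phi[x] += lr * (np.eye(K)[z] - softmax(phi[x]))
```

Because both model and inference-model updates are computed at a concrete accepted sample, the scheme sidesteps the reparameterization or relaxation tricks a VAE would need for a discrete latent.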
