Latent Sequence Decompositions

William Chan, Yu Zhang, Quoc V. Le, N. Jaitly

Published 2016 at the International Conference on Learning Representations

ABSTRACT

We present the Latent Sequence Decompositions (LSD) framework. LSD learns to decompose output sequences into variable-length units as a function of both the input sequence and the output sequence. We present a training algorithm that samples valid extensions of a partial decomposition, and an approximate decoding algorithm. We experiment with the Wall Street Journal speech recognition task. Our LSD model achieves 12.9% WER compared to a character baseline of 14.8% WER. When combined with a convolutional network on the encoder, we achieve 9.6% WER.
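The idea of sampling valid extensions can be illustrated with a small sketch: given a target string, a valid extension at any position is a vocabulary token that matches the remaining target. The helper names below are hypothetical, and uniform sampling stands in for the model-based sampling distribution used in the paper.

```python
import random

def valid_extensions(target, pos, vocab):
    """Tokens from vocab that match the target string starting at pos."""
    return [t for t in vocab if target.startswith(t, pos)]

def sample_decomposition(target, vocab, rng=None):
    """Sample one left-to-right decomposition of `target` into vocab tokens.

    Assumes every single character of `target` is in `vocab`, so a valid
    extension always exists. Uniform sampling is a simplification; the
    paper samples extensions under the model's own distribution.
    """
    rng = rng or random.Random(0)
    pos, tokens = 0, []
    while pos < len(target):
        choices = valid_extensions(target, pos, vocab)
        tok = rng.choice(choices)
        tokens.append(tok)
        pos += len(tok)
    return tokens

# Example: "cats" may decompose as ["c","at","s"], ["ca","t","s"], etc.
vocab = ["c", "a", "t", "s", "ca", "at", "ats"]
print(sample_decomposition("cats", vocab))
```

Any sampled decomposition concatenates back to the original target, which is the invariant the training algorithm relies on.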

PUBLICATION RECORD

  • Publication year

    2016

  • Venue

    International Conference on Learning Representations

  • Publication date

    2016-10-10

  • Fields of study

    Mathematics, Computer Science


  • Source metadata

    Semantic Scholar

REFERENCES

39 references

CITED BY

62 citing papers