Generating Sequences With Recurrent Neural Networks

Alex Graves

Published 2013 on arXiv.org

ABSTRACT

This paper shows how Long Short-term Memory recurrent neural networks can be used to generate complex sequences with long-range structure, simply by predicting one data point at a time. The approach is demonstrated for text (where the data are discrete) and online handwriting (where the data are real-valued). It is then extended to handwriting synthesis by allowing the network to condition its predictions on a text sequence. The resulting system is able to generate highly realistic cursive handwriting in a wide variety of styles.
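The one-data-point-at-a-time generation the abstract describes is an autoregressive sampling loop: sample the next symbol from the network's predictive distribution, then feed it back in as the next input. The sketch below is illustrative only; `toy_step` is a hypothetical stand-in for the LSTM's per-step softmax output, not the paper's actual model.

```python
import random

def sample_sequence(step_fn, seed, length, rng=None):
    """Generate a sequence one symbol at a time (autoregressive sampling).

    step_fn(history) returns a dict mapping each candidate next symbol to
    its probability; in the paper this role is played by the network's
    softmax over the next data point. Here it is a stand-in.
    """
    rng = rng or random.Random(0)
    out = list(seed)
    for _ in range(length):
        probs = step_fn(out)
        symbols, weights = zip(*probs.items())
        # Sample the next symbol and feed it back as input.
        out.append(rng.choices(symbols, weights=weights)[0])
    return "".join(out)

# Hypothetical toy "model": deterministically alternates 'a' and 'b'.
def toy_step(history):
    return {"b": 1.0} if history[-1] == "a" else {"a": 1.0}

print(sample_sequence(toy_step, "a", 6))  # → abababa
```

In the paper the same loop drives both the discrete (text) and real-valued (handwriting) cases; only the form of the predictive distribution changes.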

PUBLICATION RECORD

  • Publication year

    2013

  • Venue

    arXiv.org

  • Publication date

    2013-08-04

  • Fields of study

    Computer Science


  • Source metadata

    Semantic Scholar


REFERENCES

33 references listed on Semantic Scholar.

CITED BY

4,252 citing papers listed on Semantic Scholar.