Feed-Forward Networks with Attention Can Solve Some Long-Term Memory Problems

Colin Raffel, D. Ellis

Published 2015 on arXiv.org

ABSTRACT

We propose a simplified model of attention which is applicable to feed-forward neural networks and demonstrate that the resulting model can solve the synthetic "addition" and "multiplication" long-term memory problems for sequence lengths which are both longer and more widely varying than the best published results for these tasks.

PUBLICATION RECORD

  • Publication year: 2015

  • Venue: arXiv.org

  • Publication date: 2015-12-29

  • Fields of study: Mathematics, Computer Science

  • External record: Semantic Scholar

  • Source metadata: Semantic Scholar

