Feed-Forward Networks with Attention Can Solve Some Long-Term Memory Problems
Published 2015 in arXiv.org
ABSTRACT
We propose a simplified model of attention which is applicable to feed-forward neural networks and demonstrate that the resulting model can solve the synthetic "addition" and "multiplication" long-term memory problems for sequence lengths which are both longer and more widely varying than the best published results for these tasks.
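The simplified attention mechanism described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: it assumes the common additive scoring form, where each state vector receives a score e_t = v · tanh(W h_t + b), the scores are normalized with a softmax, and the sequence is collapsed into a single weighted average. All parameter names (`W`, `b`, `v`) are hypothetical.

```python
import numpy as np

def feed_forward_attention(h, W, b, v):
    """Collapse a sequence of state vectors h (shape T x D) into one vector.

    Sketch of a simplified attention mechanism: score each time step with
    e_t = v . tanh(W h_t + b), normalize the scores with a softmax, and
    return the attention-weighted average of the states.
    """
    e = np.tanh(h @ W + b) @ v            # (T,) unnormalized scores
    e = e - e.max()                       # shift for numerical stability
    alpha = np.exp(e) / np.exp(e).sum()   # softmax over time steps
    c = alpha @ h                         # (D,) weighted average of states
    return c, alpha

# Toy usage: a sequence of T=5 state vectors of dimension D=4.
rng = np.random.default_rng(0)
h = rng.normal(size=(5, 4))
W = rng.normal(size=(4, 3))
b = np.zeros(3)
v = rng.normal(size=3)
c, alpha = feed_forward_attention(h, W, b, v)
```

Because the weighted average discards ordering, a feed-forward network equipped with this mechanism can integrate information over variable-length sequences without recurrence, which is what makes the addition and multiplication tasks tractable.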
PUBLICATION RECORD
- Publication year
2015
- Venue
arXiv.org
- Publication date
2015-12-29
- Fields of study
Mathematics, Computer Science
- Source metadata
Semantic Scholar