Using Fast Weights to Attend to the Recent Past

Jimmy Ba, Geoffrey E. Hinton, Volodymyr Mnih, Joel Z. Leibo, Catalin Ionescu

Published 2016 in Neural Information Processing Systems

ABSTRACT

Until recently, research on artificial neural networks was largely restricted to systems with only two types of variable: neural activities that represent the current or recent input, and weights that learn to capture regularities among inputs, outputs and payoffs. There is no good reason for this restriction. Synapses have dynamics at many different time-scales, which suggests that artificial neural networks might benefit from variables that change more slowly than activities but much faster than the standard weights. These "fast weights" can be used to store temporary memories of the recent past, and they provide a neurally plausible way of implementing the type of attention to the past that has recently proven helpful in sequence-to-sequence models. By using fast weights we can avoid the need to store copies of neural activity patterns.
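
The mechanism summarized in the abstract can be made concrete with a few lines of NumPy. The sketch below follows the paper's decay-and-store rule for the fast-weight matrix, A(t) = lambda * A(t-1) + eta * h(t) h(t)^T, together with an inner loop in which the next hidden state repeatedly queries A while the slow-weight input is held fixed. The dimensions, the decay rate lam, the fast learning rate eta, the number of inner iterations S, and the omission of the layer normalization used in the paper are illustrative assumptions, not the authors' exact configuration.

    import numpy as np

    rng = np.random.default_rng(0)
    d = 16                    # hidden dimension (illustrative)
    lam, eta = 0.95, 0.5      # decay rate and fast learning rate (assumed values)
    S = 3                     # number of inner "attention" iterations (assumed)

    A = np.zeros((d, d))                           # fast-weight memory, initially empty
    W = rng.normal(0.0, 1.0 / np.sqrt(d), (d, d))  # slow recurrent weights
    C = rng.normal(0.0, 1.0 / np.sqrt(d), (d, d))  # slow input weights

    h = np.zeros(d)
    for x in rng.normal(size=(5, d)):     # a short toy input sequence
        # Decay the old memory and store the current state as an outer product.
        A = lam * A + eta * np.outer(h, h)
        # The slow-weight drive is held fixed as a sustained boundary condition.
        drive = W @ h + C @ x
        h_s = np.tanh(drive)              # preliminary next state
        for _ in range(S):
            # Each iteration retrieves from A a decayed, similarity-weighted
            # blend of recently stored states (layer normalization omitted here).
            h_s = np.tanh(drive + A @ h_s)
        h = h_s

Because A is a sum of decayed outer products, the term A @ h_s is a weighted sum of recent hidden vectors, with weights given by their decayed inner products with the current state; this is the sense in which fast weights implement attention to the recent past without storing explicit copies of activity patterns.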

PUBLICATION RECORD

  • Publication year

    2016

  • Venue

    Neural Information Processing Systems

  • Publication date

    2016-10-20

  • Fields of study

    Mathematics, Computer Science

  • Source

    Semantic Scholar
