Local Learning in RRAM Neural Networks with Sparse Direct Feedback Alignment

Brian Crafton, Matthew P. West, Pradip Basnet, E. Vogel, A. Raychowdhury

Published 2019 in International Symposium on Low Power Electronics and Design

ABSTRACT

Neural networks built with non-volatile random access memory (NVM) offer substantial power reduction over traditional CMOS implementations. Resistive random access memory (RRAM) is one such emerging memory technology, offering low energy, good endurance, and a large analog conductance window. When arranged in a crossbar architecture, these networks bypass the von Neumann bottleneck by performing compute in-memory. This architecture works well for inference; training the network, however, is far more challenging. RRAM-based networks can be trained on-chip with gradient descent or off-chip with subsequent weight transfer. Backpropagation, while effective for von Neumann architectures, is inefficient when memory and compute are co-located: each neuron's update depends on the weights and errors of deeper layers, so the weights of every layer must be read before the error can be computed and applied. This dependence, commonly referred to as the weight transport problem, is a key obstacle to efficient on-chip training in non-von Neumann architectures. In this work we demonstrate an alternative to backpropagation, sparse direct feedback alignment, which bypasses the weight transport problem. We simulate crossbars of HfOx RRAM based on experimental data to explore the performance, area, and energy trade-offs of using bio-plausible algorithms on the MNIST and EMNIST datasets.
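The abstract's key idea — replacing backpropagation's transposed-weight feedback with a fixed random (and sparse) feedback matrix, so no layer ever needs to read downstream weights — can be sketched in a few lines of NumPy. This is an illustrative sketch only, not the paper's implementation: the layer sizes, learning rate, initialization scales, and 10% feedback density are assumptions chosen for a toy two-layer network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network, 784 -> 100 -> 10 (MNIST-shaped; sizes are assumptions)
n_in, n_hid, n_out = 784, 100, 10
W1 = rng.normal(0.0, 0.1, (n_hid, n_in))
W2 = rng.normal(0.0, 0.1, (n_out, n_hid))

# Fixed random feedback matrix B1 stands in for W2.T during training.
# Zeroing 90% of its entries gives the "sparse" variant (density is an assumption).
B1 = rng.normal(0.0, 0.1, (n_hid, n_out))
B1 *= rng.random(B1.shape) < 0.1

def relu(z):
    return np.maximum(z, 0.0)

def dfa_step(x, target, lr=1e-3):
    """One sparse-DFA training step on a single sample (squared-error loss)."""
    global W1, W2
    # Forward pass
    a1 = W1 @ x
    h1 = relu(a1)
    y = W2 @ h1                      # linear output layer
    e = y - target                   # output error

    # Feedback: project the output error through the FIXED sparse matrix B1.
    # Note W2 is never read here -- this is what sidesteps weight transport.
    dh1 = (B1 @ e) * (a1 > 0)        # relu derivative gates the feedback

    # Local, layer-wise weight updates (outer products, as in a crossbar)
    W2 -= lr * np.outer(e, h1)
    W1 -= lr * np.outer(dh1, x)
    return 0.5 * np.sum(e ** 2)
```

Repeatedly calling `dfa_step` on a sample drives the loss down even though the hidden layer never sees `W2`; in a crossbar this means each layer's update needs only its own stored weights plus the broadcast output error.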

PUBLICATION RECORD

  • Publication year

    2019

  • Venue

    International Symposium on Low Power Electronics and Design

  • Publication date

    2019-07-01

  • Fields of study

    Computer Science, Engineering


  • Source metadata

    Semantic Scholar


