On the Expressive Power of Deep Neural Networks

Maithra Raghu, Ben Poole, Jon Kleinberg, Surya Ganguli, Jascha Sohl-Dickstein

Published 2016 in International Conference on Machine Learning

ABSTRACT

We propose a new approach to the problem of neural network expressivity, which seeks to characterize how structural properties of a neural network family affect the functions it is able to compute. Our approach is based on an interrelated set of measures of expressivity, unified by the novel notion of trajectory length, which measures how the output of a network changes as the input sweeps along a one-dimensional path. Our findings can be summarized as follows: (1) The complexity of the computed function grows exponentially with depth. (2) All weights are not equal: trained networks are more sensitive to their lower (initial) layer weights. (3) Regularizing on trajectory length (trajectory regularization) is a simpler alternative to batch normalization, with the same performance.
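As an illustrative sketch (not code from the paper), trajectory length can be estimated numerically by discretizing a one-dimensional input path and summing the distances between consecutive network outputs. The tanh architecture, width, weight scale, and circular input path below are assumptions chosen for illustration:

```python
import numpy as np

def trajectory_length(layers, t_samples=1000):
    """Estimate the arc length of a network's output as the input
    sweeps a 1-D circular path, discretized into t_samples points."""
    # 1-D circular trajectory embedded in the first two input dimensions.
    t = np.linspace(0.0, 2.0 * np.pi, t_samples)
    width = layers[0].shape[0]
    x = np.zeros((t_samples, width))
    x[:, 0], x[:, 1] = np.cos(t), np.sin(t)

    # Forward pass through a tanh MLP with the given weight matrices.
    h = x
    for W in layers:
        h = np.tanh(h @ W)

    # Arc length = sum of distances between consecutive outputs.
    return float(np.sum(np.linalg.norm(np.diff(h, axis=0), axis=1)))

# Random networks of increasing depth; sigma_w is a hypothetical weight
# scale chosen large enough that the trajectory stretches with depth.
rng = np.random.default_rng(0)
width, sigma_w = 100, 4.0
for depth in (1, 2, 4, 8):
    layers = [rng.normal(0.0, sigma_w / np.sqrt(width), (width, width))
              for _ in range(depth)]
    print(depth, trajectory_length(layers))
```

Under these assumed settings the printed lengths grow with depth, giving a numerical view of finding (1): deeper random networks map the same short input path to an increasingly long output curve.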

PUBLICATION RECORD

  • Publication year: 2016
  • Publication date: 2016-06-16
  • Venue: International Conference on Machine Learning
  • Fields of study: Mathematics, Computer Science
  • Source metadata: Semantic Scholar


REFERENCES

  41 references

CITED BY

  864 citing papers