Multi-Scale Dense Convolutional Networks for Efficient Prediction

Gao Huang, Tianhong Li, Felix Wu, Laurens van der Maaten, Kilian Q. Weinberger

Published 2017 on arXiv.org

ABSTRACT

We introduce a new convolutional neural network architecture with the ability to adapt dynamically to computational resource limits at test time. Our network architecture uses progressively growing multi-scale convolutions and dense connectivity, which allows for the training of multiple classifiers at intermediate layers of the network. We evaluate our approach in two settings: (1) anytime classification, where the network's prediction for a test example is progressively updated, facilitating the output of a prediction at any time; and (2) budgeted batch classification, where a fixed amount of computation is available to classify a set of examples and can be spent unevenly across "easier" and "harder" inputs. Experiments on three image-classification datasets demonstrate that our proposed framework substantially improves the state-of-the-art in both settings.
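The anytime setting described in the abstract relies on early-exit inference: each intermediate classifier produces a prediction, and evaluation stops as soon as one is confident enough. The following is a minimal illustrative sketch of that control flow, not the authors' implementation; the logit lists, the confidence threshold of 0.9, and the function names are assumptions for illustration.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over a 1-D logit vector."""
    z = np.asarray(logits, dtype=float)
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def anytime_predict(classifier_logits, threshold=0.9):
    """Early-exit sketch: walk the intermediate classifiers in order of
    increasing cost and return the first prediction whose top-class
    probability reaches `threshold`; otherwise fall back to the final
    classifier. `classifier_logits` stands in for running the network
    stage by stage. Returns (predicted_class, exit_index)."""
    for i, logits in enumerate(classifier_logits):
        probs = softmax(logits)
        if probs.max() >= threshold:
            return int(probs.argmax()), i
    probs = softmax(classifier_logits[-1])
    return int(probs.argmax()), len(classifier_logits) - 1

# An "easy" input: the first (cheap) classifier is already confident.
easy = [[4.0, 0.1, 0.2], [5.0, 0.1, 0.1]]
# A "hard" input: only the deeper classifier is confident.
hard = [[1.0, 0.9, 0.8], [6.0, 0.2, 0.1]]
```

In the budgeted-batch setting the same confidence scores let a fixed compute budget be spent unevenly: easy inputs exit at cheap classifiers, freeing computation for hard ones.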

PUBLICATION RECORD

  • Publication year

    2017

  • Venue

    arXiv.org

  • Publication date

    2017-03-29

  • Fields of study

    Computer Science


  • Source metadata

    Semantic Scholar

REFERENCES

45 references.

CITED BY

147 citing papers.