Reconciling Feature-Reuse and Overfitting in DenseNet with Specialized Dropout

Kun Wan, Boyuan Feng, Lingwei Xie, Yufei Ding

Published 2018 in IEEE International Conference on Tools with Artificial Intelligence

ABSTRACT

Recently, convolutional neural networks (CNNs) have achieved great accuracy in visual recognition tasks. DenseNet has become one of the most popular CNN models due to its effectiveness in feature reuse. However, like other CNN models, DenseNets also face the overfitting problem, if not more severely. Existing dropout methods can be applied but are not effective: in particular, the feature-reuse property of DenseNets is impeded, and the dropout effect is weakened by the spatial correlation inside feature maps. To address these problems, we craft the design of a specialized dropout method from three aspects: the dropout location, the dropout granularity, and the dropout probability. The insights attained here could potentially be applied as a general approach for boosting the accuracy of other CNN models with similar shortcut connections. Experimental results show that DenseNets with our specialized dropout method yield better accuracies compared to vanilla DenseNets and state-of-the-art CNN models, and such accuracy boost increases with the model depth.
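The abstract notes that ordinary element-wise dropout is weakened by the spatial correlation inside feature maps: neighboring pixels are correlated, so a dropped unit's information survives in its neighbors. A coarser dropout granularity that removes whole feature maps avoids this. The sketch below illustrates that idea in NumPy; the function name, interface, and scaling scheme are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def channel_dropout(x, p, rng=None):
    """Channel-wise dropout sketch: drops entire feature maps rather than
    individual pixels, so spatially correlated activations cannot leak
    a dropped unit's information through its neighbors.
    `x` has shape (N, C, H, W). Illustrative only, not the paper's API."""
    if rng is None:
        rng = np.random.default_rng()
    if not 0.0 <= p < 1.0:
        raise ValueError("p must be in [0, 1)")
    n, c = x.shape[:2]
    # One Bernoulli draw per (sample, channel); broadcast over H and W,
    # so a channel is either kept whole or zeroed whole.
    keep = rng.random((n, c, 1, 1)) >= p
    # Inverted dropout: rescale kept channels so the expected activation
    # matches the no-dropout value at test time.
    return x * keep / (1.0 - p)

# Usage: every channel of y is either all zeros or uniformly rescaled.
x = np.ones((2, 4, 8, 8))
y = channel_dropout(x, p=0.5, rng=np.random.default_rng(0))
```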
