Deep, Big, Simple Neural Nets for Handwritten Digit Recognition
D. Ciresan, U. Meier, L. Gambardella, J. Schmidhuber
Published 2010 in Neural Computation
ABSTRACT
Good old online backpropagation for plain multilayer perceptrons yields a very low 0.35% error rate on the MNIST handwritten digits benchmark. All we need to achieve this best result so far are many hidden layers, many neurons per layer, numerous deformed training images to avoid overfitting, and graphics cards to greatly speed up learning.
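The abstract's core technique, online (per-pattern) backpropagation for a plain multilayer perceptron, can be sketched as below. This is a minimal illustrative sketch on toy XOR data, not the paper's setup: the authors trained far larger nets (many hidden layers, thousands of neurons) on elastically deformed MNIST images using GPUs. The layer sizes, learning rate, and epoch count here are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_layer(n_in, n_out):
    # Small random weights, zero biases.
    return rng.normal(0, 0.5, (n_in, n_out)), np.zeros(n_out)

# Tiny 2-8-1 MLP with tanh units (illustrative sizes only).
W1, b1 = init_layer(2, 8)
W2, b2 = init_layer(8, 1)

# Toy XOR dataset standing in for MNIST images.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.0], [1.0], [1.0], [0.0]])

lr = 0.1
for epoch in range(5000):
    # Online learning: update the weights after every single pattern,
    # visiting the patterns in a fresh random order each epoch.
    for i in rng.permutation(len(X)):
        x, t = X[i], y[i]
        # Forward pass.
        h = np.tanh(x @ W1 + b1)
        o = np.tanh(h @ W2 + b2)
        # Backward pass: squared-error loss, tanh derivative is 1 - a^2.
        delta_o = (o - t) * (1 - o**2)
        delta_h = (delta_o @ W2.T) * (1 - h**2)
        # Immediate gradient-descent update for this one pattern.
        W2 -= lr * np.outer(h, delta_o); b2 -= lr * delta_o
        W1 -= lr * np.outer(x, delta_h); b1 -= lr * delta_h

preds = np.tanh(np.tanh(X @ W1 + b1) @ W2 + b2).ravel()
print(np.round(preds, 2))
```

The per-pattern update is what distinguishes online backpropagation from batch gradient descent; combined with continually deformed training images, each update in the paper effectively sees a fresh example, which is what keeps the very large nets from overfitting.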
PUBLICATION RECORD
- Publication year: 2010
- Venue: Neural Computation
- Publication date: 2010-03-01
- Fields of study: Mathematics, Computer Science, Medicine
- Source metadata: Semantic Scholar, PubMed