Progressive Growing of GANs for Improved Quality, Stability, and Variation

Tero Karras, Timo Aila, Samuli Laine, Jaakko Lehtinen

Published in 2017 at the International Conference on Learning Representations

ABSTRACT

We describe a new training methodology for generative adversarial networks. The key idea is to grow both the generator and discriminator progressively: starting from a low resolution, we add new layers that model increasingly fine details as training progresses. This both speeds the training up and greatly stabilizes it, allowing us to produce images of unprecedented quality, e.g., CelebA images at 1024^2. We also propose a simple way to increase the variation in generated images, and achieve a record inception score of 8.80 in unsupervised CIFAR10. Additionally, we describe several implementation details that are important for discouraging unhealthy competition between the generator and discriminator. Finally, we suggest a new metric for evaluating GAN results, both in terms of image quality and variation. As an additional contribution, we construct a higher-quality version of the CelebA dataset.
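
The progressive fade-in described in the abstract can be sketched in code. The snippet below is only an illustrative sketch in PyTorch, not the authors' implementation; the class name, layer widths, and the alpha value are assumptions. It shows the core idea: when a higher-resolution block is added, its RGB output is blended with the upsampled RGB output of the previous stage using a weight alpha that ramps from 0 to 1 as training progresses, so the new layer is introduced smoothly.

    # Minimal sketch (not the paper's code) of fading in a newly added
    # higher-resolution generator block. `alpha` ramps from 0 to 1 over training.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class FadeInStage(nn.Module):
        def __init__(self, in_ch: int, out_ch: int):
            super().__init__()
            self.new_block = nn.Sequential(            # block being grown in
                nn.Conv2d(in_ch, out_ch, 3, padding=1),
                nn.LeakyReLU(0.2),
            )
            self.to_rgb_old = nn.Conv2d(in_ch, 3, 1)   # RGB head of the old stage
            self.to_rgb_new = nn.Conv2d(out_ch, 3, 1)  # RGB head of the new stage

        def forward(self, x: torch.Tensor, alpha: float) -> torch.Tensor:
            up = F.interpolate(x, scale_factor=2, mode="nearest")   # 2x upsample
            old_rgb = self.to_rgb_old(up)                  # image via the old path
            new_rgb = self.to_rgb_new(self.new_block(up))  # image via the new path
            # blend: alpha = 0 keeps the old output, alpha = 1 uses only the new block
            return (1.0 - alpha) * old_rgb + alpha * new_rgb

    # Example: fading in a 32x32 stage on top of 16x16 feature maps.
    stage = FadeInStage(in_ch=128, out_ch=64)
    features = torch.randn(4, 128, 16, 16)
    images = stage(features, alpha=0.3)   # shape: (4, 3, 32, 32)

Once alpha reaches 1, the old RGB head can be dropped and the next, higher-resolution stage can be grown in the same way, continuing up to the target resolution (1024^2 for CelebA in the paper).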

PUBLICATION RECORD

  • Publication year: 2017
  • Venue: International Conference on Learning Representations
  • Publication date: 2017-10-27
  • Fields of study: Mathematics, Computer Science
  • Source metadata: Semantic Scholar

REFERENCES

  • 55 references

CITED BY

  • 8,294 citing papers