DVAE#: Discrete Variational Autoencoders with Relaxed Boltzmann Priors

Arash Vahdat, E. Andriyash, W. Macready

Published 2018 in Neural Information Processing Systems

ABSTRACT

Boltzmann machines are powerful distributions that have been shown to be an effective prior over binary latent variables in variational autoencoders (VAEs). However, previous methods for training discrete VAEs have used the evidence lower bound and not the tighter importance-weighted bound. We propose two approaches for relaxing Boltzmann machines to continuous distributions that permit training with importance-weighted bounds. These relaxations are based on generalized overlapping transformations and the Gaussian integral trick. Experiments on the MNIST and OMNIGLOT datasets show that these relaxations outperform previous discrete VAEs with Boltzmann priors. An implementation which reproduces these results is available at this https URL.
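
For context, the abstract invokes two standard tools without stating them. The sketch below (in LaTeX, using generic notation rather than the paper's own) gives the k-sample importance-weighted bound of Burda et al. and the Gaussian integral (Hubbard-Stratonovich) identity that underlies the second relaxation; the positive-definiteness condition on A is the usual requirement for the identity, not a claim about the paper's specific construction.

    % k-sample importance-weighted bound (tighter than the ELBO for k > 1):
    \log p(x) \;\ge\; \mathcal{L}_k
      = \mathbb{E}_{z_1,\dots,z_k \sim q(z \mid x)}
        \left[ \log \frac{1}{k} \sum_{i=1}^{k} \frac{p(x, z_i)}{q(z_i \mid x)} \right]

    % Gaussian integral trick: linearizes the quadratic coupling of a Boltzmann
    % machine by introducing continuous auxiliary variables \zeta, assuming the
    % (diagonally shifted) coupling matrix A is positive definite:
    \exp\!\left( \tfrac{1}{2}\, z^\top A z \right)
      \;\propto\; \int \exp\!\left( -\tfrac{1}{2}\, \zeta^\top A^{-1} \zeta + \zeta^\top z \right) \mathrm{d}\zeta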

PUBLICATION RECORD

  • Publication year

    2018

  • Venue

    Neural Information Processing Systems

  • Publication date

    2018-05-01

  • Fields of study

    Mathematics, Computer Science

  • Source metadata

    Semantic Scholar
