A Recurrent Latent Variable Model for Sequential Data
Junyoung Chung, Kyle Kastner, Laurent Dinh, Kratarth Goel, Aaron C. Courville, Yoshua Bengio
Published 2015 in Neural Information Processing Systems
ABSTRACT
In this paper, we explore the inclusion of latent random variables into the hidden state of a recurrent neural network (RNN) by combining elements of the variational autoencoder. We argue that through the use of high-level latent random variables, the variational RNN (VRNN) can model the kind of variability observed in highly structured sequential data such as natural speech. We empirically evaluate the proposed model against other related sequential models on four speech datasets and one handwriting dataset. Our results show the important roles that latent random variables can play in RNN dynamics.
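The abstract describes a model in which a VAE-style latent variable is injected into the RNN recurrence at every timestep: a prior and an approximate posterior over z_t are conditioned on the previous hidden state, and the sampled z_t feeds back into the state update. The sketch below illustrates that structure only; it is not the authors' implementation, and the module names, dimensions, and the choice of PyTorch and a GRU recurrence are assumptions made for illustration.

```python
# Minimal, illustrative VRNN-style cell (a sketch, not the authors' code).
# At each step, a VAE-like prior/encoder/decoder is conditioned on the RNN
# hidden state h, and the sampled latent z_t feeds back into the recurrence.
import torch
import torch.nn as nn


class VRNNCellSketch(nn.Module):
    def __init__(self, x_dim, z_dim, h_dim):
        super().__init__()
        # prior p(z_t | h_{t-1}) -> mean and log-variance
        self.prior = nn.Linear(h_dim, 2 * z_dim)
        # approximate posterior q(z_t | x_t, h_{t-1})
        self.enc = nn.Linear(x_dim + h_dim, 2 * z_dim)
        # decoder p(x_t | z_t, h_{t-1}) -- here it just predicts a mean
        self.dec = nn.Linear(z_dim + h_dim, x_dim)
        # recurrence: h_t = f(x_t, z_t, h_{t-1})
        self.rnn = nn.GRUCell(x_dim + z_dim, h_dim)

    def forward(self, x_t, h):
        prior_mu, prior_logvar = self.prior(h).chunk(2, dim=-1)
        enc_mu, enc_logvar = self.enc(torch.cat([x_t, h], dim=-1)).chunk(2, dim=-1)
        # reparameterization trick, as in a standard VAE
        z_t = enc_mu + torch.randn_like(enc_mu) * (0.5 * enc_logvar).exp()
        x_recon = self.dec(torch.cat([z_t, h], dim=-1))
        h_next = self.rnn(torch.cat([x_t, z_t], dim=-1), h)
        # the (mu, logvar) pairs would enter a KL term in the training objective
        return x_recon, h_next, (enc_mu, enc_logvar, prior_mu, prior_logvar)
```

Unrolling this cell over a sequence and summing a reconstruction term with the per-step KL divergence between the posterior and the prior gives a timestep-wise evidence lower bound, which is the kind of training objective such latent-variable RNNs optimize.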
PUBLICATION RECORD
- Publication year: 2015
- Venue: Neural Information Processing Systems
- Publication date: 2015-06-07
- Fields of study: Computer Science
- Source metadata: Semantic Scholar