Temporal Convolution Machines for Sequence Learning (2009)
The Temporal Convolution Machine (TCM) is a neural architecture for learning temporal sequences that generalizes the Temporal Restricted Boltzmann Machine (TRBM). A convolution function is used to provide a trainable envelope of time sensitivity in the bias terms. Gaussian and multi-Gaussian envelopes with trainable means and variances are evaluated as particular instances of the TCM architecture. First, Gaussian and multi-Gaussian TCMs are shown to learn a class of multi-modal distributions over synthetic binary spatiotemporal data better than comparable TRBM models. Second, these networks are trained to recall digitized versions of baroque sonatas. In this task, a multi-Gaussian TCM performs effective sequence mapping when the input sequence is partially hidden. The TCM is therefore a promising approach to learning more complex temporal data than was previously possible.
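The core idea of the abstract — a trainable Gaussian envelope convolved with past activity to produce a time-sensitive bias — can be sketched as follows. This is a minimal illustration under assumed conventions (discrete delays, one envelope per unit); the function and parameter names are hypothetical and not taken from the report.

```python
import numpy as np

def gaussian_envelope(delays, mu, sigma):
    """Gaussian kernel over discrete time delays; mu and sigma would be
    trainable in a TCM (here they are fixed for illustration)."""
    return np.exp(-((delays - mu) ** 2) / (2.0 * sigma ** 2))

def dynamic_bias(history, base_bias, mu, sigma):
    """Form a time-sensitive bias by convolving past activations with a
    Gaussian envelope. Hypothetical sketch, not the report's exact math.

    history: (T, n_units) array of past activations, oldest frame first.
    """
    T = history.shape[0]
    delays = np.arange(T, 0, -1)                          # delay of each frame
    env = gaussian_envelope(delays[:, None], mu, sigma)   # (T, n_units)
    return base_bias + (env * history).sum(axis=0)

# Toy usage: 5 past frames, 3 hidden units, envelope centered at delay 2.
rng = np.random.default_rng(0)
hist = rng.random((5, 3))
b = dynamic_bias(hist, base_bias=np.zeros(3),
                 mu=np.full(3, 2.0), sigma=np.ones(3))
print(b.shape)  # (3,)
```

A multi-Gaussian envelope, as evaluated in the report, would simply sum several such kernels with separate means and variances per unit.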
Citation:
Technical Report AI-09-04, Department of Computer Sciences, The University of Texas at Austin, 2009.
Alan J. Lockett (Ph.D. alumnus), alan lockett [at] gmail com
Risto Miikkulainen (Faculty), risto [at] cs utexas edu