Learning Sequential Tasks by Incrementally Adding Higher Orders, from Advances in Neural Information Processing Systems 5 (NIPS5), 1992.
An incremental, higher-order, non-recurrent network combines two
properties found to be useful for learning sequential tasks:
higher-order connections and incremental introduction of new
units. The network adds higher orders when needed by adding new
units that dynamically modify connection weights. Since the new
units modify the weights at the next time step with information from
the previous step, temporal tasks can be learned without the use of
feedback, thereby greatly simplifying training. Furthermore, a
theoretically unlimited number of units can be added to reach into the
arbitrarily distant past. Experiments with the Reber grammar have
demonstrated speedups of two orders of magnitude over recurrent
networks.
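The sketch below is an illustrative reading of the core idea, not the paper's algorithm: a feedforward layer whose connection weights are modulated at each step by higher-order units that see only the previous time step's input, so no recurrent feedback is needed. The class name HigherOrderNet and the arrays W (base weights) and H (higher-order weights) are assumed names for illustration.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class HigherOrderNet:
    """Minimal sketch of weight modulation by higher-order units.

    The higher-order terms use the input from the previous time step to
    adjust the effective weights at the current step, so temporal context
    is carried forward without any recurrent feedback connections.
    """

    def __init__(self, n_in, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.1, size=(n_out, n_in))  # base weights
        # H[j, i, k]: how strongly input k at time t-1 modifies
        # the connection W[j, i] at time t (all zero until "added").
        self.H = np.zeros((n_out, n_in, n_in))
        self.prev_x = np.zeros(n_in)

    def forward(self, x):
        # Effective weights = base weights plus the higher-order units'
        # contribution, computed from the previous input only.
        W_eff = self.W + self.H @ self.prev_x      # shape (n_out, n_in)
        y = sigmoid(W_eff @ x)
        self.prev_x = x.copy()                     # context for the next step
        return y

# Usage: feed a short one-hot symbol sequence through the network.
net = HigherOrderNet(n_in=4, n_out=3)
for x in np.eye(4)[[0, 2, 1, 3]]:
    print(net.forward(x))

In the paper's scheme the higher-order units are introduced incrementally, only for connections that need them; here H is simply allocated up front to keep the sketch short.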