Extension of Backpropagation through Time for Segmented-memory Recurrent Neural Networks

Stefan Glüge, Ronald Böck, Andreas Wendemuth

2012

Abstract

We introduce an extended Backpropagation Through Time (eBPTT) learning algorithm for Segmented-Memory Recurrent Neural Networks. The algorithm was compared to an extension of the Real-Time Recurrent Learning algorithm (eRTRL) for this kind of network. Using the information latching problem as a benchmark task, we tested the algorithms' ability to cope with the learning of long-term dependencies. eRTRL was generally better able to latch information over longer periods of time. On the other hand, eBPTT guaranteed better generalisation when training was successful. Further, due to its computational complexity, eRTRL becomes impractical as network size increases, making eBPTT the only viable choice in such cases.
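The information latching problem used as the benchmark can be illustrated with a small data generator. The sketch below is an assumption about the task setup, not the authors' exact protocol: the class label is determined only by the first few symbols of a sequence, so a recurrent network must retain ("latch") that early information across an arbitrarily long distractor tail. All names (`make_latching_sample`, `key_len`, `vocab`) are hypothetical.

```python
import random

def make_latching_sample(seq_len=100, key_len=3, vocab=8, rng=random):
    """Generate one information-latching sample (hypothetical setup).

    The binary class label depends only on the first `key_len` symbols;
    the remaining symbols are distractor noise. A learner must carry the
    early information over the whole sequence to classify correctly.
    """
    label = rng.randrange(2)
    # Class-defining prefix: two disjoint symbol pools, one per class.
    half = vocab // 2
    pool = list(range(0, half)) if label == 0 else list(range(half, vocab))
    prefix = [rng.choice(pool) for _ in range(key_len)]
    # Distractor tail drawn from the full vocabulary.
    tail = [rng.randrange(vocab) for _ in range(seq_len - key_len)]
    return prefix + tail, label
```

Increasing `seq_len` while keeping `key_len` fixed lengthens the temporal gap between the class-defining information and the end of the sequence, which is what makes the task a probe of long-term dependency learning.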



Paper Citation


in Harvard Style

Glüge S., Böck R. and Wendemuth A. (2012). Extension of Backpropagation through Time for Segmented-memory Recurrent Neural Networks. In Proceedings of the 4th International Joint Conference on Computational Intelligence - Volume 1: NCTA, (IJCCI 2012) ISBN 978-989-8565-33-4, pages 451-456. DOI: 10.5220/0004103804510456


in Bibtex Style

@conference{ncta12,
author={Stefan Glüge and Ronald Böck and Andreas Wendemuth},
title={Extension of Backpropagation through Time for Segmented-memory Recurrent Neural Networks},
booktitle={Proceedings of the 4th International Joint Conference on Computational Intelligence - Volume 1: NCTA, (IJCCI 2012)},
year={2012},
pages={451-456},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004103804510456},
isbn={978-989-8565-33-4},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 4th International Joint Conference on Computational Intelligence - Volume 1: NCTA, (IJCCI 2012)
TI - Extension of Backpropagation through Time for Segmented-memory Recurrent Neural Networks
SN - 978-989-8565-33-4
AU - Glüge S.
AU - Böck R.
AU - Wendemuth A.
PY - 2012
SP - 451
EP - 456
DO - 10.5220/0004103804510456