USING THE GNG-M ALGORITHM TO DEAL WITH THE PROBLEM OF CATASTROPHIC FORGETTING IN INCREMENTAL MODELLING

Héctor F. Satizábal M., Andres Perez-Uribe

2011

Abstract

Creating computational models from large and growing datasets is an important issue in current machine learning research, because most modelling approaches can require prohibitive computational resources. This work presents the use of incremental learning algorithms within the framework of an incremental modelling approach. In particular, it presents the GNG-m algorithm, an adaptation of the Growing Neural Gas (GNG) algorithm capable of circumventing the problem of catastrophic forgetting when modelling large datasets in a sequential manner. We illustrate this by comparing the performance of GNG-m with that of the original GNG algorithm on a vector quantization task. Last but not least, we present the use of GNG-m in an incremental modelling task using a real-world temperature database from a geographic information system (GIS). The dataset of more than one million multidimensional observations is split into seven parts and then reduced by vector quantization to a codebook of only a few thousand prototypes.
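The abstract describes splitting a large dataset into parts and quantizing them sequentially with a GNG-derived algorithm. As a rough illustration of that workflow, the sketch below applies a plain Growing Neural Gas quantizer to synthetic data processed in seven chunks. It is not the authors' GNG-m: the specific safeguard against catastrophic forgetting is not reproduced here, and all parameter values, the three-dimensional toy data, and the class name GNG are illustrative assumptions rather than details taken from the paper.

import numpy as np

class GNG:
    """Plain Growing Neural Gas quantizer (illustrative sketch, not GNG-m)."""
    def __init__(self, dim, max_nodes=100, eps_b=0.05, eps_n=0.005,
                 age_max=50, lam=100, alpha=0.5, d=0.995):
        rng = np.random.default_rng(0)
        self.w = [rng.normal(size=dim), rng.normal(size=dim)]   # prototype vectors
        self.err = [0.0, 0.0]                                   # accumulated errors
        self.edges = {}                                         # (i, j) -> age, with i < j
        self.max_nodes, self.eps_b, self.eps_n = max_nodes, eps_b, eps_n
        self.age_max, self.lam, self.alpha, self.d = age_max, lam, alpha, d
        self.t = 0

    def _neighbors(self, i):
        return [b if a == i else a for (a, b) in self.edges if i in (a, b)]

    def fit_chunk(self, X):
        for x in X:
            self.t += 1
            dist = [float(np.sum((x - w) ** 2)) for w in self.w]
            s1, s2 = (int(i) for i in np.argsort(dist)[:2])     # winner and runner-up
            self.err[s1] += dist[s1]
            self.w[s1] += self.eps_b * (x - self.w[s1])         # move the winner
            for n in self._neighbors(s1):                       # move its neighbours, age its edges
                self.w[n] += self.eps_n * (x - self.w[n])
                self.edges[tuple(sorted((s1, n)))] += 1
            self.edges[tuple(sorted((s1, s2)))] = 0             # create/refresh winner-runner-up edge
            self.edges = {e: a for e, a in self.edges.items() if a <= self.age_max}
            if self.t % self.lam == 0 and len(self.w) < self.max_nodes:
                q = int(np.argmax(self.err))                    # insert a node near the highest-error unit
                nbrs = self._neighbors(q)
                f = max(nbrs, key=lambda n: self.err[n]) if nbrs else q
                self.w.append(0.5 * (self.w[q] + self.w[f]))
                self.err[q] *= self.alpha
                self.err[f] *= self.alpha
                self.err.append(self.err[q])
                r = len(self.w) - 1
                self.edges.pop(tuple(sorted((q, f))), None)
                self.edges[tuple(sorted((q, r)))] = 0
                self.edges[tuple(sorted((f, r)))] = 0
            self.err = [e * self.d for e in self.err]           # global error decay
            # (removal of isolated nodes is omitted for brevity)

# Incremental loop: quantize synthetic data one chunk at a time,
# loosely mirroring the paper's seven-part split of the temperature dataset.
gng = GNG(dim=3, max_nodes=200)
for chunk in np.array_split(np.random.default_rng(1).random((7000, 3)), 7):
    gng.fit_chunk(chunk)
print(len(gng.w), "prototypes in the codebook")

Processing one chunk at a time mirrors the sequential setting in which catastrophic forgetting becomes an issue for the standard algorithm: prototypes fitted to earlier chunks can drift toward later ones, which is the behaviour GNG-m is designed to limit.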



Paper Citation


in Harvard Style

Satizábal M., H. F. and Perez-Uribe, A. (2011). USING THE GNG-M ALGORITHM TO DEAL WITH THE PROBLEM OF CATASTROPHIC FORGETTING IN INCREMENTAL MODELLING. In Proceedings of the International Conference on Neural Computation Theory and Applications - Volume 1: NCTA, (IJCCI 2011) ISBN 978-989-8425-84-3, pages 267-276. DOI: 10.5220/0003683702670276


in Bibtex Style

@conference{ncta11,
author={Héctor F. Satizábal M. and Andres Perez-Uribe},
title={USING THE GNG-M ALGORITHM TO DEAL WITH THE PROBLEM OF CATASTROPHIC FORGETTING IN INCREMENTAL MODELLING},
booktitle={Proceedings of the International Conference on Neural Computation Theory and Applications - Volume 1: NCTA, (IJCCI 2011)},
year={2011},
pages={267-276},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0003683702670276},
isbn={978-989-8425-84-3},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Neural Computation Theory and Applications - Volume 1: NCTA, (IJCCI 2011)
TI - USING THE GNG-M ALGORITHM TO DEAL WITH THE PROBLEM OF CATASTROPHIC FORGETTING IN INCREMENTAL MODELLING
SN - 978-989-8425-84-3
AU - Satizábal M., H. F.
AU - Perez-Uribe, A.
PY - 2011
SP - 267
EP - 276
DO - 10.5220/0003683702670276