BIDIRECTIONAL HIERARCHICAL NEURAL NETWORKS - Hebbian Learning Improves Generalization

Mohammad Saifullah, Rita Kovordanyi, Chandan Roy

2010

Abstract

Visual pattern recognition is a complex problem, and it has proven difficult to achieve satisfactory performance with standard three-layer feed-forward artificial neural networks. For this reason, an increasing number of researchers are using networks whose architecture resembles that of the human visual system. These biologically based networks are bidirectionally connected, use receptive fields, and have a hierarchical structure, with the input layer being the largest and consecutive layers becoming progressively smaller. Such networks are large and complex, and therefore run the risk of overfitting during learning, especially if small training sets are used and the input patterns are noisy. Many data sets, such as handwritten characters, are intrinsically noisy. The problem of overfitting is aggravated by the tendency of error-driven learning in large networks to treat all variation in the noisy input as significant. One way to counterbalance this tendency to overfit is to use a mixture of learning algorithms. In this study, we ran systematic tests on handwritten character recognition, comparing the generalization performance of a network trained with a mixture of Hebbian and error-driven learning against that of a network trained with pure error-driven learning. Our results indicate that injecting even a small amount of Hebbian learning, 0.01 %, significantly improves the generalization performance of the network.
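The core idea of the abstract, mixing a Hebbian term into an otherwise error-driven weight update, can be illustrated with a minimal sketch. This is not the paper's implementation: the CPCA-style Hebbian rule, the delta-rule error term, and the learning rate and mixing fraction below are illustrative assumptions chosen only to show how a small Hebbian component can be blended into the update for one weight matrix.

```python
import numpy as np

def mixed_update(w, x, y, y_target, lr=0.01, hebb_frac=0.01):
    """One weight update mixing a Hebbian and an error-driven term.

    w        : (n_in, n_out) weight matrix
    x        : (n_in,)  presynaptic (sending) activations
    y        : (n_out,) postsynaptic (receiving) activations
    y_target : (n_out,) target activations for the error-driven term
    """
    # Hebbian term (CPCA-style, assumed here): each weight moves toward
    # the sending activation, gated by the receiving activation,
    # dw_hebb[i, j] = y[j] * (x[i] - w[i, j]).
    dw_hebb = np.outer(x, y) - y[np.newaxis, :] * w

    # Error-driven term (simple delta rule, assumed here),
    # dw_err[i, j] = x[i] * (y_target[j] - y[j]).
    dw_err = np.outer(x, y_target - y)

    # Convex mixture: even a small Hebbian fraction acts as a
    # regularizer on the purely error-driven update.
    return w + lr * (hebb_frac * dw_hebb + (1.0 - hebb_frac) * dw_err)

# Example: one update step on random data (hypothetical values).
rng = np.random.default_rng(0)
w = rng.uniform(0.0, 1.0, size=(4, 2))
x = rng.uniform(0.0, 1.0, size=4)
y = x @ w                      # stand-in for the network's actual activation
t = np.array([1.0, 0.0])
w = mixed_update(w, x, y, t)
```

In this sketch, `hebb_frac` plays the role of the small Hebbian proportion discussed in the abstract; the error-driven term still dominates the update, while the Hebbian term biases weights toward the statistics of the input.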



Paper Citation


in Harvard Style

Saifullah M., Kovordanyi R. and Roy C. (2010). BIDIRECTIONAL HIERARCHICAL NEURAL NETWORKS - Hebbian Learning Improves Generalization. In Proceedings of the International Conference on Computer Vision Theory and Applications - Volume 1: VISAPP, (VISIGRAPP 2010) ISBN 978-989-674-028-3, pages 105-111. DOI: 10.5220/0002835501050111


in Bibtex Style

@conference{visapp10,
author={Mohammad Saifullah and Rita Kovordanyi and Chandan Roy},
title={BIDIRECTIONAL HIERARCHICAL NEURAL NETWORKS - Hebbian Learning Improves Generalization},
booktitle={Proceedings of the International Conference on Computer Vision Theory and Applications - Volume 1: VISAPP, (VISIGRAPP 2010)},
year={2010},
pages={105-111},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0002835501050111},
isbn={978-989-674-028-3},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Computer Vision Theory and Applications - Volume 1: VISAPP, (VISIGRAPP 2010)
TI - BIDIRECTIONAL HIERARCHICAL NEURAL NETWORKS - Hebbian Learning Improves Generalization
SN - 978-989-674-028-3
AU - Saifullah M.
AU - Kovordanyi R.
AU - Roy C.
PY - 2010
SP - 105
EP - 111
DO - 10.5220/0002835501050111