KINETIC MORPHOGENESIS OF A MULTILAYER PERCEPTRON

Bruno Apolloni, Simone Bassis, Lorenzo Valerio

2011

Abstract

We introduce a morphogenesis paradigm for a neural network in which neurons move autonomously through a topological space, seeking reciprocal positions that are suitable from an informational perspective. To this end, each neuron is attracted by the mates that are most informative and repelled by those that are most similar to it. We govern the neuron motion with Newtonian dynamics in a subspace of a framework where the topological coordinates coincide with the neuron connection weights. The result is a synergistic plasticity of the network, ruled by an extended Lagrangian in which physical components merge with the usual error terms. Focusing on a multilayer perceptron, this plasticity is implemented by an extension of the standard back-propagation algorithm that proves robust even for deep architectures. We use two classic benchmarks to gain insight into the morphology and plasticity we propose.
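
To make the abstract's idea concrete, below is a minimal sketch, not the authors' algorithm: it assumes a single-hidden-layer perceptron in which each hidden neuron's "position" is its incoming weight column, uses activation variance over the batch as a stand-in for informativeness, applies an inverse-square repulsion between nearby mates, and simply blends the resulting force with the ordinary back-propagation update. The informativeness proxy, the force law, and the coupling constant kappa are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Toy data: a 2-D, XOR-like two-class problem.
X = rng.uniform(-1, 1, size=(200, 2))
y = ((X[:, 0] * X[:, 1]) > 0).astype(float)

H = 8                                   # hidden neurons
W1 = rng.normal(0, 0.5, size=(2, H))    # neuron "positions" = weight columns
b1 = np.zeros(H)
w2 = rng.normal(0, 0.5, size=H)
b2 = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

eta, kappa = 0.5, 0.05                  # learning rate, kinetic coupling (assumed values)

for epoch in range(2000):
    # --- forward pass ---
    a1 = np.tanh(X @ W1 + b1)           # hidden activations
    out = sigmoid(a1 @ w2 + b2)
    err = out - y                       # dE/dz for sigmoid + cross-entropy

    # --- standard back-propagation gradients ---
    g_w2 = a1.T @ err / len(X)
    g_b2 = err.mean()
    d1 = np.outer(err, w2) * (1 - a1 ** 2)
    g_W1 = X.T @ d1 / len(X)
    g_b1 = d1.mean(axis=0)

    # --- kinetic term: pairwise forces between hidden neurons ---
    # informativeness proxy (assumption): activation variance over the batch
    info = a1.var(axis=0)
    F = np.zeros_like(W1)
    for i in range(H):
        for j in range(H):
            if i == j:
                continue
            diff = W1[:, j] - W1[:, i]
            dist = np.linalg.norm(diff) + 1e-8
            # attraction toward more informative mates,
            # inverse-square repulsion from too-similar (close) mates
            F[:, i] += info[j] * diff / dist - diff / dist ** 3

    # --- combined update: error gradient plus neuron motion ---
    W1 -= eta * g_W1 - kappa * F
    b1 -= eta * g_b1
    w2 -= eta * g_w2
    b2 -= eta * g_b2

acc = ((sigmoid(np.tanh(X @ W1 + b1) @ w2 + b2) > 0.5) == (y > 0.5)).mean()
print(f"training accuracy: {acc:.2f}")

In this reading, kappa is the single knob that balances the physical term against the error gradient, loosely mirroring the extended Lagrangian in which physics components merge with the error terms; the paper's actual dynamics, loss, and force definitions should be taken from the full text.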



Paper Citation


in Harvard Style

Apolloni B., Bassis S. and Valerio L. (2011). KINETIC MORPHOGENESIS OF A MULTILAYER PERCEPTRON. In Proceedings of the International Conference on Neural Computation Theory and Applications - Volume 1: NCTA, (IJCCI 2011) ISBN 978-989-8425-84-3, pages 99-105. DOI: 10.5220/0003642800990105


in Bibtex Style

@conference{ncta11,
author={Bruno Apolloni and Simone Bassis and Lorenzo Valerio},
title={KINETIC MORPHOGENESIS OF A MULTILAYER PERCEPTRON},
booktitle={Proceedings of the International Conference on Neural Computation Theory and Applications - Volume 1: NCTA, (IJCCI 2011)},
year={2011},
pages={99-105},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0003642800990105},
isbn={978-989-8425-84-3},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Neural Computation Theory and Applications - Volume 1: NCTA, (IJCCI 2011)
TI - KINETIC MORPHOGENESIS OF A MULTILAYER PERCEPTRON
SN - 978-989-8425-84-3
AU - Apolloni B.
AU - Bassis S.
AU - Valerio L.
PY - 2011
SP - 99
EP - 105
DO - 10.5220/0003642800990105