A Comparison of Learning Rules for Mixed Order Hyper Networks

Kevin Swingler

2015

Abstract

A mixed order hyper network (MOHN) is a neural network in which a weight can connect any number of neurons, rather than the usual two. MOHNs can be used as content addressable memories with higher capacity than standard Hopfield networks. They can also be used for regression, clustering, and classification, and as fitness models for use in heuristic optimisation. This paper presents a set of methods for estimating the values of the weights in a MOHN from training data. The methods are compared with each other and with a standard MLP trained by back propagation; the MOHN learning rules are found to be faster to train than the MLP and more reliable, as the MOHN error function contains no local minima.
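The core idea above can be sketched in a few lines. A MOHN with ±1 neurons computes a weighted sum of products of neuron states, one product per weight, where each weight attaches to a subset of neurons of any order. Because this sum is linear in the weights, they can be estimated from data by ordinary least squares, which has a single global minimum. The following is a minimal illustrative sketch under those assumptions, not the paper's exact learning rules; the function name `mohn_features` and the toy XOR-style target are choices made here for illustration.

```python
import itertools
import numpy as np

def mohn_features(X, max_order=2):
    """Build the MOHN design matrix: one column per weight.

    Each weight attaches to a subset of neurons; its feature is the
    product of the corresponding +/-1 neuron states (order 0 = bias).
    """
    n = X.shape[1]
    cols = [np.ones(len(X))]  # order-0 (bias) weight
    subsets = [()]
    for order in range(1, max_order + 1):
        for S in itertools.combinations(range(n), order):
            cols.append(np.prod(X[:, S], axis=1))
            subsets.append(S)
    return np.column_stack(cols), subsets

# Toy data: a parity-style target that needs an order-2 weight,
# which a purely first-order model cannot capture.
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
y = X[:, 0] * X[:, 1]  # target equals the pairwise product

Phi, subsets = mohn_features(X, max_order=2)
# Least squares puts all the weight on the {0, 1} interaction term.
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
```

Because the model is linear in its weights, any linear estimator (least squares, lasso, or an online delta rule) applies; the differences between such estimators are what the paper's comparison is about.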

References

  1. Beauchamp, K. (1984). Applications of Walsh and Related Functions. Academic Press, London.
  2. Caparrós, G. J., Ruiz, M. A. A., and Hernández, F. S. (2002). Hopfield neural networks for optimization: study of the different dynamics. Neurocomputing, 43(1-4):219-237.
  3. Dobson, A. J. and Barnett, A. (2011). An introduction to generalized linear models. CRC Press.
  4. Frean, M. (1990). The upstart algorithm: A method for constructing and training feedforward neural networks. Neural computation, 2(2):198-209.
  5. Hastie, T., Tibshirani, R., and Friedman, J. (2009). The elements of statistical learning, volume 2. Springer.
  6. Heckendorn, R. B. and Wright, A. H. (2004). Efficient linkage discovery by limited probing. Evolutionary computation, 12(4):517-545.
  7. Hopfield, J. J. (1982). Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences USA, 79(8):2554-2558.
  8. Hopfield, J. J. and Tank, D. W. (1985). Neural computation of decisions in optimization problems. Biological Cybernetics, 52:141-152.
  9. Kubota, T. (2007). A higher order associative memory with McCulloch-Pitts neurons and plastic synapses. In Neural Networks, 2007. IJCNN 2007. International Joint Conference on, pages 1982-1989.
  10. Pelikan, M., Goldberg, D. E., and Cantú-Paz, E. (2000). Linkage problem, distribution estimation, and Bayesian networks. Evolutionary Computation, 8(3):311-340.
  11. Shakya, S., McCall, J., Brownlee, A., and Owusu, G. (2012). DEUM - distribution estimation using Markov networks. In Shakya, S. and Santana, R., editors, Markov Networks in Evolutionary Computation, volume 14 of Adaptation, Learning, and Optimization, pages 55-71. Springer Berlin Heidelberg.
  12. Swingler, K. (2012). On the capacity of Hopfield neural networks as EDAs for solving combinatorial optimisation problems. In Proc. IJCCI (ECTA), pages 152-157. SciTePress.
  13. Swingler, K. (2014). A Walsh analysis of multilayer perceptron function. In Proc. IJCCI (NCTA).
  14. Swingler, K. and Smith, L. (2014a). Training and making calculations with mixed order hyper-networks. Neurocomputing, 141:65-75.
  15. Swingler, K. and Smith, L. S. (2014b). An analysis of the local optima storage capacity of Hopfield network based fitness function models. Transactions on Computational Collective Intelligence XVII, LNCS 8790, pages 248-271.
  16. Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society. Series B (Methodological), pages 267-288.
  17. Torres-Moreno, J.-M., Aguilar, J., and Gordon, M. (2002). Finding the minimum number of errors in the n-dimensional parity problem with a linear perceptron. Neural Processing Letters, 1:201-210.
  18. Torres-Moreno, J.-M. and Gordon, M. B. (1998). Efficient adaptive learning for classification tasks with binary units. Neural Computation, 10(4):1007-1030.
  19. Venkatesh, S. S. and Baldi, P. (1991). Programmed interactions in higher-order neural networks: Maximal capacity. Journal of Complexity, 7(3):316-337.
  20. Walsh, J. (1923). A closed set of normal orthogonal functions. Amer. J. Math, 45:5-24.
  21. Wilson, G. V. and Pawley, G. S. (1988). On the stability of the travelling salesman problem algorithm of Hopfield and Tank. Biol. Cybern., 58(1):63-70.


Paper Citation


in Harvard Style

Swingler K. (2015). A Comparison of Learning Rules for Mixed Order Hyper Networks. In Proceedings of the 7th International Joint Conference on Computational Intelligence - Volume 3: NCTA, (ECTA 2015) ISBN 978-989-758-157-1, pages 17-27. DOI: 10.5220/0005588000170027


in Bibtex Style

@conference{ncta15,
author={Kevin Swingler},
title={A Comparison of Learning Rules for Mixed Order Hyper Networks},
booktitle={Proceedings of the 7th International Joint Conference on Computational Intelligence - Volume 3: NCTA, (ECTA 2015)},
year={2015},
pages={17-27},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005588000170027},
isbn={978-989-758-157-1},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 7th International Joint Conference on Computational Intelligence - Volume 3: NCTA, (ECTA 2015)
TI - A Comparison of Learning Rules for Mixed Order Hyper Networks
SN - 978-989-758-157-1
AU - Swingler K.
PY - 2015
SP - 17
EP - 27
DO - 10.5220/0005588000170027