PROBLEMS AND FEATURES OF EVOLUTIONARY ALGORITHMS TO BUILD HYBRID TRAINING METHODS FOR RECURRENT NEURAL NETWORKS

M. P. Cuéllar, M. Delgado, M. C. Pegalajar

2007

Abstract

Dynamical recurrent neural networks are models suited to problems in which the input and output data may have dependencies in time, such as grammatical inference or time series prediction. However, traditional training algorithms for these networks sometimes provide unsuitable results because of the vanishing gradient problem. This work focuses on hybrid training algorithms for this type of network. The methods studied are based on the combination of heuristic procedures with gradient-based algorithms. In the experimental section, we show the advantages and disadvantages that may arise when using these training techniques in time series prediction problems, and provide a general discussion of the problems and cases of the different hybridizations based on evolutionary algorithms.
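The hybrid scheme the abstract refers to is, broadly, a memetic one: an evolutionary algorithm explores the space of recurrent-network weights while a gradient-based procedure refines each candidate locally. As a rough, hypothetical sketch only (not the authors' algorithm), the Python fragment below couples a real-coded genetic loop with a cheap finite-difference local step, standing in for backpropagation through time or L-BFGS-B, on a toy one-step-ahead prediction task; the network sizes, operators, and hyperparameters are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
N_IN, N_HID, N_OUT = 1, 4, 1                               # tiny Elman-style network
N_W = N_HID * (N_IN + N_HID + 1) + N_OUT * (N_HID + 1)     # flat genotype length

def unpack(w):
    # Split a flat genotype into the network's weight matrices and biases.
    i = 0
    W_in  = w[i:i + N_HID * N_IN].reshape(N_HID, N_IN);   i += N_HID * N_IN
    W_rec = w[i:i + N_HID * N_HID].reshape(N_HID, N_HID); i += N_HID * N_HID
    b_h   = w[i:i + N_HID];                                i += N_HID
    W_out = w[i:i + N_OUT * N_HID].reshape(N_OUT, N_HID); i += N_OUT * N_HID
    return W_in, W_rec, b_h, W_out, w[i:i + N_OUT]

def mse(w, xs, ys):
    # Fitness: mean squared one-step-ahead prediction error over the sequence.
    W_in, W_rec, b_h, W_out, b_o = unpack(w)
    h, err = np.zeros(N_HID), 0.0
    for x, y in zip(xs, ys):
        h = np.tanh(W_in @ x + W_rec @ h + b_h)
        err += float(np.sum((W_out @ h + b_o - y) ** 2))
    return err / len(xs)

def local_refine(w, xs, ys, steps=3, lr=0.05, eps=1e-4):
    # Gradient-style local search; finite differences stand in here for
    # backpropagation through time or a quasi-Newton method such as L-BFGS-B.
    for _ in range(steps):
        base, g = mse(w, xs, ys), np.zeros_like(w)
        for i in range(len(w)):
            wp = w.copy(); wp[i] += eps
            g[i] = (mse(wp, xs, ys) - base) / eps
        w = w - lr * g
    return w

# Toy task: predict the next value of a sine wave.
t = np.linspace(0, 6 * np.pi, 100)
xs = [np.array([v]) for v in np.sin(t[:-1])]
ys = [np.array([v]) for v in np.sin(t[1:])]

pop = [rng.normal(0.0, 0.5, N_W) for _ in range(12)]
for gen in range(10):
    # Lamarckian hybridization: refined weights overwrite the genotype before
    # selection (a Baldwinian variant would keep the original genes and only
    # use the refined fitness).
    pop = [local_refine(w, xs, ys) for w in pop]
    pop.sort(key=lambda w: mse(w, xs, ys))
    parents, children = pop[:6], []
    while len(children) < 6:
        a, b = rng.choice(len(parents), size=2, replace=False)
        alpha = rng.random()
        child = alpha * parents[a] + (1.0 - alpha) * parents[b]             # blend crossover
        child = child + rng.normal(0.0, 0.1, N_W) * (rng.random(N_W) < 0.1) # sparse mutation
        children.append(child)
    pop = parents + children

print("best training MSE:", mse(min(pop, key=lambda w: mse(w, xs, ys)), xs, ys))

The Lamarckian write-back shown here is only one of the hybridization choices whose trade-offs the paper discusses; swapping it for a Baldwinian evaluation or moving the local step to a different point of the loop changes the behaviour of the hybrid considerably.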

References

  1. Bengio, Y., Simard, P., and Frasconi, P. (1994). Learning long-term dependencies with gradient descent is difficult. IEEE Trans. on Neural Networks, 5(2):157-166.
  2. Blanco, A., Delgado, M., and Pegalajar, M. C. (2001). A real-coded genetic algorithm for training recurrent neural networks. Neural Networks, 14:93-105.
  3. Byrd, R., Lu, P., and Nocedal, J. (1995). A limited memory algorithm for bound constrained optimization. SIAM Journal on Scientific Computing, 16(5):1190-1208.
  4. Cuéllar, M., Delgado, M., and Pegalajar, M. (2005). An application of non-linear programming to train recurrent neural networks in time series prediction problems. In Proc. of International Conference on Enterprise and Information Systems (ICEIS'05), pages 35-42, Miami (USA).
  5. Cuéllar, M., Delgado, M., and Pegalajar, M. (2006). Memetic evolutionary training for recurrent neural networks: an application to time-series prediction. Expert Systems, 23(2):99-117.
  6. Elman, J. (1990). Finding structure in time. Cognitive Science, 14:179-211.
  7. Haykin, S. (1999). Neural Networks: A Comprehensive Foundation. Prentice Hall.
  8. Ku, K. and Mak, M. (1997). Exploring the effects of Lamarckian and Baldwinian learning in evolving recurrent neural networks. In Proc. of the IEEE International Conference on Evolutionary Computation.
  9. Laguna, M. and Martí, R. (2003). Scatter Search. Methodology and Implementations in C. Kluwer Academic Publishers.
  10. Lendasse, A., Oja, E., Simula, O., and Verleysen, M. (2004). Time series prediction competition: The CATS benchmark. In Proc. International Joint Conference on Neural Networks (IJCNN'04), pages 1615-1620, Budapest (Hungary).
  11. Mandic, D. and Chambers, J. (2001). Recurrent Neural Networks for Prediction. John Wiley and sons.
  12. Moscato, P. and Porras, C. C. (2003). An introduction to memetic algorithms. Inteligencia Artificial (Special Issue on Metaheuristics), 2(19):131-148.
  13. Prudencio, R. and Ludermir, T. (2003). Neural network hybrid learning: Genetic algorithms and Levenberg-Marquardt. In Proc. 26th Annual Conference of the Gesellschaft für Klassifikation, pages 464-472.
  14. Zhu, C., Byrd, R., and Nocedal, J. (1997). Algorithm 778: L-BFGS-B, Fortran routines for large scale bound constrained optimization. ACM Transactions on Mathematical Software, 23(4):550-560.


Paper Citation


in Harvard Style

Cuéllar, M. P., Delgado, M. and Pegalajar, M. C. (2007). PROBLEMS AND FEATURES OF EVOLUTIONARY ALGORITHMS TO BUILD HYBRID TRAINING METHODS FOR RECURRENT NEURAL NETWORKS. In Proceedings of the Ninth International Conference on Enterprise Information Systems - Volume 2: ICEIS, ISBN 978-972-8865-89-4, pages 204-211. DOI: 10.5220/0002383502040211


in Bibtex Style

@conference{iceis07,
author={M. P. Cuéllar and M. Delgado and M. C. Pegalajar},
title={PROBLEMS AND FEATURES OF EVOLUTIONARY ALGORITHMS TO BUILD HYBRID TRAINING METHODS FOR RECURRENT NEURAL NETWORKS},
booktitle={Proceedings of the Ninth International Conference on Enterprise Information Systems - Volume 2: ICEIS},
year={2007},
pages={204-211},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0002383502040211},
isbn={978-972-8865-89-4},
}


in EndNote Style

TY - CONF
JO - Proceedings of the Ninth International Conference on Enterprise Information Systems - Volume 2: ICEIS
TI - PROBLEMS AND FEATURES OF EVOLUTIONARY ALGORITHMS TO BUILD HYBRID TRAINING METHODS FOR RECURRENT NEURAL NETWORKS
SN - 978-972-8865-89-4
AU - Cuéllar, M. P.
AU - Delgado, M.
AU - Pegalajar, M. C.
PY - 2007
SP - 204
EP - 211
DO - 10.5220/0002383502040211