Authors: Stefan Glüge; Ronald Böck and Andreas Wendemuth
Affiliation: Otto von Guericke University Magdeburg, Germany
Keyword(s): Implicit sequence learning, Temporal order in associative learning, Elman network, Simple recurrent network.
Related Ontology Subjects/Areas/Topics: Artificial Intelligence; Biomedical Engineering; Biomedical Signal Processing; Computational Intelligence; Data Manipulation; Health Engineering and Technology Applications; Human-Computer Interaction; Learning Paradigms and Algorithms; Methodologies and Methods; Neural Networks; Neurocomputing; Neurotechnology, Electronics and Informatics; Pattern Recognition; Physiological Computing Systems; Sensor Networks; Signal Processing; Soft Computing; Theory and Methods
Abstract: Without doubt, the temporal order inherent in a task is an important factor in human learning. Recurrent neural networks are known to be a useful tool for modelling implicit sequence learning. In terms of the psychology of learning, recurrent networks may therefore be suitable for building a model that reproduces the data obtained from experiments with human subjects. Such a model should not only reproduce the data but also explain it and, beyond that, make verifiable predictions. One basic requirement for this is an understanding of the processes taking place in the network during learning. In this paper, we investigate how (implicitly learned) temporal information is stored and represented in a simple recurrent network. To be able to study these effects in detail, we use a small network and a standard encoding task.
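The simple recurrent (Elman) network mentioned in the abstract feeds a copy of the previous hidden state back in as a "context" input, which is how temporal order is implicitly carried forward. The following is a minimal sketch of that forward dynamics in NumPy; all dimensions and weight scales are illustrative assumptions, not the sizes or training setup used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (hypothetical; the paper's actual network sizes are not given here)
n_in, n_hid, n_out = 4, 8, 4

# Weight matrices: input->hidden, context(previous hidden)->hidden, hidden->output
W_xh = rng.normal(scale=0.5, size=(n_hid, n_in))
W_hh = rng.normal(scale=0.5, size=(n_hid, n_hid))
W_hy = rng.normal(scale=0.5, size=(n_out, n_hid))

def elman_step(x, h_prev):
    """One Elman step: the context layer is simply a copy of the previous hidden state."""
    h = np.tanh(W_xh @ x + W_hh @ h_prev)  # hidden state mixes current input with context
    y = W_hy @ h                           # linear readout
    return y, h

# Present a short sequence of one-hot symbols; temporal information accumulates in h
h = np.zeros(n_hid)
for x in np.eye(n_in):
    y, h = elman_step(x, h)
```

Because `h` at every step depends on all earlier inputs through `W_hh`, inspecting the hidden-state trajectory is the natural place to look for how temporal information is stored, which is the kind of analysis the paper pursues on a small network.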