Authors:
Igor Peric; Alexandru Lesi; Daniel Spies; Stefan Ulbrich; Arne Roennau; Marius Zoellner and Ruediger Dillman
Affiliation:
FZI Forschungszentrum Informatik, Germany
Keyword(s):
Vector Symbolic Architectures, Associative Memories, Symbol Encoding, Symbolic Scripting.
Related Ontology Subjects/Areas/Topics:
Health Engineering and Technology Applications; Information Processing; Learning Systems and Memory; Neurocomputing; Neurotechnology, Electronics and Informatics
Abstract:
Vector Symbolic Architectures (VSAs) define a set of operations for the association, storage, manipulation and retrieval of symbolic concepts, represented as fixed-length vectors in ℝ^n. A specific instance of VSAs, Holographic Reduced Representations (HRRs), has proven to exhibit properties similar to human short-term memory and as such is interesting for computational modelling. In this paper we extend the HRR approach by introducing implicit, topology-preserving encoding and decoding procedures. We propose to replace unique symbolic representations with symbols based on probability density functions. These symbols must be randomly permuted to ensure a uniform distribution of signals across the Fourier space in which embedding takes place. These novel encoding schemes eliminate the need for so-called clean-up modules (e.g., self-organizing maps) after memory retrieval: each encoding implicitly represents its scalar symbol, so no further lookup is needed. We further show that our encoding scheme improves memory capacity relative to the original capacity benchmark for HRRs (Plate, 1995). We also evaluate our memories in two different robotics tasks: visual scene memory and state machine scripting (holographic controllers).
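The abstract combines two ingredients: HRR binding and unbinding via circular convolution (Plate, 1995), and scalar symbols encoded as randomly permuted probability density functions that can be decoded without a separate clean-up module. The sketch below is a minimal illustration of that combination under assumptions of ours, not the authors' implementation: Gaussian-shaped PDFs, a single shared random permutation, unitary random keys, and the names `encode_scalar`, `decode_scalar` and `random_unitary_vector` are all illustrative choices.

```python
# Minimal HRR sketch: bind/unbind via circular convolution, plus a scalar symbol
# encoded as a randomly permuted Gaussian PDF. All parameter values (n, sigma)
# and helper names are assumptions for illustration only.
import numpy as np

def bind(a, b):
    """Circular convolution: associates two HRR vectors."""
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def unbind(trace, cue):
    """Correlation (conjugate in the Fourier domain): approximate inverse of bind."""
    return np.real(np.fft.ifft(np.fft.fft(trace) * np.conj(np.fft.fft(cue))))

def random_unitary_vector(n, rng):
    """Real random vector whose Fourier components all have unit magnitude,
    so correlation with it inverts binding exactly (a common choice for keys)."""
    half = n // 2 + 1
    phases = rng.uniform(0.0, 2.0 * np.pi, half)
    phases[0] = 0.0                 # DC component must be real
    if n % 2 == 0:
        phases[-1] = 0.0            # Nyquist component must be real
    return np.fft.irfft(np.exp(1j * phases), n)

def encode_scalar(x, n=1024, sigma=0.02):
    """Encode a scalar in [0, 1] as a sampled Gaussian PDF, then apply a fixed
    random permutation so the signal is spread over the Fourier components."""
    grid = np.linspace(0.0, 1.0, n)
    pdf = np.exp(-0.5 * ((grid - x) / sigma) ** 2)
    pdf /= np.linalg.norm(pdf)
    perm = np.random.default_rng(0).permutation(n)   # shared by all symbols
    return pdf[perm]

def decode_scalar(v, n=1024):
    """Undo the permutation and read off the peak position: the encoding itself
    identifies the scalar, so no clean-up lookup is needed."""
    perm = np.random.default_rng(0).permutation(n)
    unpermuted = np.empty(n)
    unpermuted[perm] = v
    return np.argmax(unpermuted) / (n - 1)

# Usage: store one key/value pair in a memory trace and retrieve the value.
rng = np.random.default_rng(42)
key = random_unitary_vector(1024, rng)
trace = bind(key, encode_scalar(0.3))
print(decode_scalar(unbind(trace, key)))   # approximately 0.3
```

Unitary keys are used here so that unbinding is exact in this toy setting; with several superposed key/value pairs the retrieved vector becomes noisy, and the peaked, permuted PDF is what lets the scalar still be read off directly.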