Patterns storage and recall in quantum associative memories
2006, MATHMOD'2006: 5th IMACS International Symposium on Mathematical Modeling, Vienna, Austria
Abstract
Quantum associative memories are derived from the Hopfield memory model by assuming that the elements of the weight matrix W are stochastic variables computed from the solution of Schrödinger's diffusion equation. Simulation results are provided to study the storage and recall of patterns in quantum associative memories.
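For orientation, the sketch below shows the classical Hopfield storage (Hebbian outer-product weights) and recall (iterated sign updates) steps that the quantum model above generalizes. It is a minimal illustration in standard NumPy, not the paper's quantum formulation; the example patterns and the n_steps parameter are arbitrary choices for the demonstration.

```python
import numpy as np

def store(patterns):
    """Hebbian outer-product rule: W = (1/n) * sum_m x_m x_m^T, with zero diagonal."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for x in patterns:
        W += np.outer(x, x)
    np.fill_diagonal(W, 0.0)
    return W / n

def recall(W, probe, n_steps=50):
    """Asynchronous sign updates until the probe settles into a stored attractor."""
    x = probe.copy()
    for _ in range(n_steps):
        for i in np.random.permutation(len(x)):
            x[i] = 1 if W[i] @ x >= 0 else -1
    return x

# Two +/-1 patterns of length 8 (arbitrary example data).
patterns = np.array([[1, -1, 1, 1, -1, -1, 1, -1],
                     [-1, 1, 1, -1, 1, -1, -1, 1]])
W = store(patterns)
noisy = patterns[0].copy()
noisy[:2] *= -1              # flip two bits of the first pattern
print(recall(W, noisy))      # should match patterns[0]
```

In the quantum variant described in the abstract, the entries of W would instead be treated as stochastic variables obtained from the Schrödinger diffusion equation, rather than the fixed outer-product sums used in this classical baseline.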
Related papers
IOP Conference Series: Materials Science and Engineering, 2021
In this research paper, the storage and retrieval of 1-D/2-D/3-D information using Hopfield-type Associative Memories (AMs) is discussed. Various Artificial Neural Network (ANN) architectures are proposed, and implementation issues associated with those Associative Memories are discussed. A cascade connection of an AM and a Convolutional Neural Network is proposed for noise immunity.
The 2012 International Joint Conference on Neural Networks (IJCNN), 2012
As computers approach the physical limits of information storable in memory, new methods will be needed to further improve information storage and retrieval. We propose a quantum inspired vector based approach, which offers a contextually dependent mapping from the subsymbolic to the symbolic representations of information. If implemented computationally, this approach would provide exceptionally high density of information storage, without the traditionally required physical increase in storage capacity. The approach is inspired by the structure of human memory and incorporates elements of Gärdenfors' Conceptual Space approach and Humphreys et al.'s matrix model of memory. Kitto, K., Bruza, P., & Gabora, L. (2012). A quantum information retrieval approach to memory. Proc International Joint Conf on Neural Networks, (pp. 932-939). June 10-15, Brisbane, Australia, IEEE Computational Intelligence Soc.
Cognitive Science, 2016
Recent evidence suggests that experienced events are often mapped to too many episodic states, including those that are logically or experimentally incompatible with one another. For example, episodic over-distribution patterns show that the probability of accepting an item under different mutually exclusive conditions violates the disjunction rule. A related example, called subadditivity, occurs when the probability of accepting an item under mutually exclusive and exhaustive instruction conditions sums to a number >1. Both the over-distribution effect and subadditivity have been widely observed in item and source-memory paradigms. These phenomena are difficult to explain using standard memory frameworks, such as signal-detection theory. A dual-trace model called the over-distribution (OD) model (Brainerd & Reyna, 2008) can explain the episodic over-distribution effect, but not subadditivity. Our goal is to develop a model that can explain both effects. In this paper, we propose the Generalized Quantum Episodic Memory (GQEM) model, which extends the Quantum Episodic Memory (QEM) model developed by Brainerd, Wang, and Reyna (2013). We test GQEM by comparing it to the OD model using data from a novel item-memory experiment and a previously published source-memory experiment (Kellen, Singmann, & Klauer, 2014) examining the over-distribution effect. Using the best-fit parameters from the over-distribution experiments, we conclude by showing that the GQEM model can also account for subadditivity. Overall these results add to a growing body of evidence suggesting that quantum probability theory is a valuable tool in modeling recognition memory.
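As a concrete reading of the subadditivity effect described above, the toy check below uses invented acceptance probabilities (not data from the cited experiments) for three mutually exclusive and exhaustive instruction conditions; when they sum to more than 1, the classical disjunction rule is violated.

```python
# Hypothetical acceptance probabilities for one item under three
# mutually exclusive and exhaustive instruction conditions (illustrative values only).
p_conditions = [0.55, 0.40, 0.30]

total = sum(p_conditions)
print(f"sum of acceptance probabilities = {total:.2f}")
print("subadditivity: classically, probabilities over a partition cannot exceed 1"
      if total > 1.0 else "consistent with a classical partition")
```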
Fortschritte der Physik, 2016
The model of quantum associative memories proposed here is quite similar to that of Rigui Zhou et al.
The European Physical …, 2010
We perform a review of various approaches to the implementation of quantum memories, with an emphasis on activities within the quantum memory sub-project of the EU Integrated Project "Qubit Applications". We begin with a brief overview of different applications for quantum memories and different types of quantum memories. We discuss the most important criteria for assessing quantum memory performance and the most important physical requirements. Then we review the different approaches represented in "Qubit Applications" in some detail. They include solid-state atomic ensembles, NV centers, quantum dots, single atoms, atomic gases and optical phonons in diamond. We compare the different approaches using the discussed criteria. PACS: 03.67.-a Quantum information; 03.67.Hk Quantum communication; 03.67.Lx Quantum computation architectures and implementations; 42.50.Ct Quantum description of interaction of light and matter, related experiments; 42.50.Md Optical transient phenomena: quantum beats, photon echo, free-induction decay, dephasings and revivals, optical nutation, and self-induced transparency.
In the neural network theory, content-addressable memories are defined by patterns that are attractors of the dynamical rule of the system. This paper develops a quantum neural network starting from a classical neural network Hamiltonian and using a Schrödinger-like equation. It then shows that such a system exhibits probabilistic memory storage characteristics analogous to those of the dynamical attractors of classical systems.
2018
In source memory studies, a decision-maker is concerned with identifying the context in which a given episodic experience occurred. A common paradigm for studying source memory is the 'three-list' experimental paradigm, where a subject studies three lists of words and is later asked whether a given word appeared on one or more of the studied lists. Surprisingly, the sum total of the acceptance probabilities generated by asking for the source of a word separately for each list ('list 1?', 'list 2?', 'list 3?') exceeds the acceptance probability generated by asking whether that word occurred on the union of the lists ('list 1 or 2 or 3?'). The episodic memory for a given word therefore appears over-distributed on the disjoint contexts of the lists. A quantum episodic memory model [QEM] was proposed by Brainerd, Wang and Reyna (2013) to explain this type of result. In this paper, we apply a Hamiltonian dynamical extension of QEM for over-distribution of source memor...
IEEE Transactions on Information Theory, 1987
Techniques from coding theory are applied to study rigorously the capacity of the Hopfield associative memory. Such a memory stores n-tuples of ±1's. The components change depending on a hard-limited version of linear functions of all other components. With symmetric connections between components, a stable state is ultimately reached. By building up the connection matrix as a sum of outer products of m fundamental memories, one hopes to be able to recover a certain one of the m memories by using an initial n-tuple probe vector less than a Hamming distance n/2 away from the fundamental memory. If m fundamental memories are chosen at random, the maximum asymptotic value of m in order that most of the m original memories are exactly recoverable is n/(2 log n). With the added restriction that every one of the m fundamental memories be recoverable exactly, m can be no more than n/(4 log n) asymptotically as n approaches infinity. Extensions are also considered, in particular to capacity under quantization of the outer-product connection matrix. This quantized memory capacity problem is closely related to the capacity of the quantized Gaussian channel.
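For a rough sense of scale, the snippet below evaluates the two capacity bounds quoted in the abstract, n/(2 log n) and n/(4 log n), for a few network sizes. Treating log as the natural logarithm is an assumption here; the cited paper should be consulted for the exact convention.

```python
import math

for n in (1_000, 10_000, 100_000):                 # number of +/-1 components
    most_ok = n / (2 * math.log(n))                # bound when most memories must be recoverable
    all_ok = n / (4 * math.log(n))                 # bound when every memory must be recoverable
    print(f"n={n:>7}: ~{most_ok:,.0f} patterns (most), ~{all_ok:,.0f} patterns (all)")
```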
2011 IEEE International Symposium on Information Theory Proceedings, 2011
We consider the problem of neural association, which deals with the retrieval of a previously memorized pattern from its noisy version. The performance of various neural networks developed for this task may be judged in terms of their pattern retrieval capacities (the number of patterns that can be stored), and their error-correction (noise tolerance) capabilities. While significant progress has been made, most prior works in this area show poor performance with regard to pattern retrieval capacity and/or error correction.
2016
We analyzed a Hopfield-like model of artificial memory that reproduces some features of human memory: a) the ability to absorb new information while working; b) the memorized patterns are only a small part of the set of patterns written down in the connection matrix; c) the more often a pattern was shown during the learning process, the better the quality of its recognition. We used the Hebb rule, but each pattern was supplied with its own weight. Earlier, some modifications of the Hebb matrix were proposed to eliminate memory destruction [3]-[7]. As a result of such modifications, an unlimited number of random patterns can be written into the matrix elements one by one; however, the memory of the network remains restricted.
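A minimal sketch of the weighted Hebb rule described above, assuming that a pattern's weight simply scales its outer-product contribution to the connection matrix; the pattern data and weight values are invented for illustration and are not taken from the cited work.

```python
import numpy as np

def weighted_hebb(patterns, weights):
    """Connection matrix W = sum_m r_m * x_m x_m^T with zero diagonal.

    Patterns shown more often during learning receive a larger weight r_m,
    so their attractors are recognized with better quality.
    """
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for x, r in zip(patterns, weights):
        W += r * np.outer(x, x)
    np.fill_diagonal(W, 0.0)
    return W

patterns = np.where(np.random.rand(5, 16) > 0.5, 1, -1)   # five random +/-1 patterns
weights = np.array([5.0, 3.0, 1.0, 1.0, 0.5])             # frequency-derived weights (illustrative)
W = weighted_hebb(patterns, weights)
```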
