Associative memories, a classical model of the brain's long-term memory, suffer from interference between old and new memories. Usually, the only remedy is to enlarge the network so as to retain more memories without collisions: this is the network's size–diversity trade-off. We propose a novel way of representing data in these networks that provides another means of extending diversity without resizing the network. Our analysis and simulations show that this method is a viable alternative, well suited to cases where the network's size is constrained, such as neuromorphic FPGA boards implementing associative memories.
Tagged network (colored clique network) COGNITIVE 2015 by Stephen Larroque
1. Using Tags to Improve Diversity of Sparse Associative Memories
Stephen Larroque
with Ehsan Sedgh Gooya, Vincent Gripon, Dominique Pastor
An alternative to the size-diversity trade-off for neuromorphic devices
23rd March 2015
COGNITIVE 2015
23. Let’s color this graph!
[Figure: tagged adjacency matrix — rows and columns indexed by part (1, 2, 3, …) and unit (B, G, W, …); entries hold tag values (200, 304, 102, 403, 103, 400, …); the matrix is symmetric.]
24. Let’s color this graph!
[Figure: the same tagged adjacency matrix as on the previous slide.]
25. Let’s color this graph!
[Figure: the same tagged adjacency matrix.]
Fake memory avoided by tag disambiguation
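The disambiguation idea can be sketched in code: every stored clique stamps its edges with its own tag, and retrieval only accepts a clique whose edges all carry one common tag. A minimal illustration with invented node names and last-writer-wins tags, not the authors' implementation:

```python
from itertools import combinations

class TaggedCliqueMemory:
    """Toy associative memory: each stored message is a clique whose
    edges all carry the same tag; mismatched tags reveal fake memories."""
    def __init__(self):
        self.edges = {}    # (node_a, node_b) -> tag of the last writer
        self.next_tag = 1

    def store(self, nodes):
        tag = self.next_tag
        self.next_tag += 1
        for a, b in combinations(sorted(nodes), 2):
            self.edges[(a, b)] = tag  # newer cliques overwrite older tags
        return tag

    def is_clique(self, nodes):
        """All edges present, regardless of tags (plain clique network)."""
        return all((a, b) in self.edges
                   for a, b in combinations(sorted(nodes), 2))

    def is_tagged_clique(self, nodes):
        """All edges present AND carrying one common tag (tagged network)."""
        tags = {self.edges.get((a, b))
                for a, b in combinations(sorted(nodes), 2)}
        return len(tags) == 1 and None not in tags

mem = TaggedCliqueMemory()
mem.store(["A1", "B2", "C3"])
mem.store(["A1", "B2", "D4"])
mem.store(["C3", "D4", "E5"])

# Spurious combination: all three edges exist, but from different cliques
fake = ["B2", "C3", "D4"]
print(mem.is_clique(fake))          # True  -> fake memory in plain network
print(mem.is_tagged_clique(fake))   # False -> rejected by tag check
```

The tag check costs one comparison per edge at retrieval, which is what makes it attractive when enlarging the network is not an option.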
26. Performance
[Plot: diversity/density against the error rate for various tag limits; χ = 16, c = 8, L = 64, erasure rate = 0.5 (half of the query is erased).]
27. Conclusion and future work
• Tags: a viable alternative to the diversity-size trade-off for fixed-size networks (e.g., neuromorphic devices)
• A tentative explanation of synapse heterogeneity: the brain may use an affinity system to co-sustain synapses with similar parameters.
"Neurons that fire together, wire together, and with a strong affinity."
• Next:
– Noisy scenario (unreliable tags)
– Pertinence of memories, variable resiliency: not all items need to be stored with equal resiliency. Try refreshing tags on access? ("spacing effect"?)
28. Thanks and a few references
– D. J. Willshaw, O. P. Buneman, and H. C. Longuet-Higgins, "Non-holographic associative memory," Nature, vol. 222, no. 5197, June 1969, pp. 960–962.
– J. J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities," Proceedings of the National Academy of Sciences, vol. 79, no. 8, 1982, pp. 2554–2558.
– V. Gripon and C. Berrou, "Sparse neural networks with large learning diversity," IEEE Transactions on Neural Networks, vol. 22, no. 7, 2011, pp. 1087–1096.
Vincent GRIPON, Dominique PASTOR, Ehsan SEDGH GOOYA
ERC grant agreement n° 290901
slideshare.net/LRQ3000
31. Neural constraints
• Energetic parsimony
• Parsimony of material resources
• Noise robustness
• Simple processing rules (analog?)
=> Feed-forward ANNs: synaptic weights stored as floats are too sensitive (learning = adjusting weights), and capacity is sub-linear in the number of nodes.
(Aiello & Wheeler, 1995)
32. Cliques network
• Brain = information encoder
• Fully graphical model
• Associative, recurrent network with binary weights, integer outputs, and capacity ~ n²:
– Network: set of clusters
– Cluster: set of fanals
– Fanal: a graph node (a microcolumn?)
(Gripon-Berrou Neural Network, 2011)
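The structure above (network → clusters → fanals, binary symmetric connections) can be sketched as a 4-D binary weight tensor, one fanal active per cluster for each stored message. A minimal sketch with my own parameter values, not the authors' code:

```python
import numpy as np
from itertools import combinations

chi, L = 4, 8          # chi clusters, L fanals per cluster
# Binary weights between (cluster, fanal) pairs: W[c1, f1, c2, f2]
W = np.zeros((chi, L, chi, L), dtype=bool)

def store(message):
    """message[i] = index of the active fanal in cluster i; the message
    becomes a clique over its (cluster, fanal) nodes."""
    for (c1, f1), (c2, f2) in combinations(list(enumerate(message)), 2):
        W[c1, f1, c2, f2] = W[c2, f2, c1, f1] = True  # symmetric, binary

def score(message):
    """Integer score per node (sum of clique edges reaching it): an
    intact stored message scores chi - 1 on every one of its nodes."""
    return [int(sum(W[c1, f1, c2, message[c2]] for c2 in range(chi) if c2 != c1))
            for c1, f1 in enumerate(message)]

store([0, 3, 5, 2])
print(score([0, 3, 5, 2]))   # [3, 3, 3, 3]
```

With binary weights the material cost is one bit per potential connection, which is why the capacity scales with the number of connections (~ n²) rather than with a float precision.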
33. Capacity vs Diversity
• Diversity = number of messages that can be learnt/stored
• Capacity = total amount of learnt information
• "From a cognitive point of view, it is better to learn (and possibly combine) 1000 messages of 10 characters than to learn 10 messages of 1000 characters"
(C. Berrou & V. Gripon, 2010)
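The distinction can be made concrete with a small computation: both setups in the quote carry the same total information (capacity), but very different numbers of addressable items (diversity). Illustrative figures only, assuming ~log2(26) bits per character:

```python
from math import log2

bits_per_char = log2(26)            # ~4.7 bits for one letter
small = dict(messages=1000, length=10)
large = dict(messages=10,   length=1000)

for name, cfg in [("1000 x 10", small), ("10 x 1000", large)]:
    capacity  = cfg["messages"] * cfg["length"] * bits_per_char  # learnt info
    diversity = cfg["messages"]                                  # stored items
    print(f"{name}: capacity = {capacity:.0f} bits, diversity = {diversity}")
```

Equal capacity, a 100-fold difference in diversity: the point of the quote is that diversity, not raw capacity, is the cognitively useful quantity.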
43. Analysis of error rate
• New error type: the lost unit error
[Figure: a unit lost for the red clique.]
44. Analysis of error rate
• New error type: the lost unit error
=> When a clique loses a node, because all of that node's edges have been overwritten by the tags of newer cliques, the clique becomes unretrievable.
=> This error depends only on the storage process!
[Figure: a unit lost for the red clique.]
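Because the lost unit error depends only on the storage process, it can be estimated without running retrieval at all: store random tagged cliques with last-writer-wins edges, then count messages in which some node had every one of its edges overwritten. A Monte-Carlo sketch under that reading, with my own function names:

```python
import random
from itertools import combinations

def lost_unit_rate(M, chi=16, c=8, L=64, seed=0):
    """Fraction of stored messages that lose at least one unit because
    every edge of that unit was overwritten by a newer message's tag."""
    rng = random.Random(seed)
    edge_tag = {}
    messages = []
    for tag in range(M):
        clusters = rng.sample(range(chi), c)
        nodes = tuple(sorted((cl, rng.randrange(L)) for cl in clusters))
        messages.append((tag, nodes))
        for e in combinations(nodes, 2):
            edge_tag[e] = tag            # the newest clique wins the edge
    lost = 0
    for tag, nodes in messages:
        for u in nodes:
            if all(edge_tag[tuple(sorted((u, v)))] != tag
                   for v in nodes if v != u):
                lost += 1                # all c-1 edges of u overwritten
                break
    return lost / M

print(lost_unit_rate(5000))  # rate grows with the number of stored messages
```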
45. Theoretical lost unit error
• Approximation: messages are i.i.d. variables
• With M = total messages; c = clique order; χ = total graph parts; L = units per part
• Storing one new clique overwrites a given edge with probability 1/Q (one chance over the network's size), repeated for the c(c−1)/2 edges of each of the M new messages; a unit is lost when all of its c−1 clique edges have been overwritten:

P_lost ≈ ( 1 − (1 − 1/Q)^(M·c(c−1)/2) )^(c−1),  with Q = χ(χ−1)L²/2
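The annotations above (one chance over the network's size per edge, repeated for every edge of each new message, a unit lost once all of its edges are overwritten) yield a closed form that is cheap to evaluate. A sketch under that reading, taking the network's size as Q = χ(χ−1)L²/2 potential edges (my assumption):

```python
def lost_unit_proba(M, chi=16, c=8, L=64):
    """Approximate probability that a given unit of a stored clique is
    lost, assuming i.i.d. messages."""
    Q = chi * (chi - 1) * L * L / 2      # potential edges in the network
    p_edge_kept = (1 - 1 / Q) ** (M * c * (c - 1) / 2)
    return (1 - p_edge_kept) ** (c - 1)  # all c-1 edges overwritten

for M in (1000, 10000, 100000):
    print(M, lost_unit_proba(M))         # increases with M
```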
46. What about other error types?
[Plot: the real error rate (red) against a composition of errors of every possible type (green): the lost unit error is a good predictor.]
47. What about other error types?
[Plot: the theoretical lost unit error (black) against the simulated error rate (blue).]
48. Efficiency?
• Efficiency = B / Q, where B = amount of information stored and Q = amount of material used
• Clique network: …
• Tagged network: …
=> Tagged networks use more material, proportionally to the number of tags!
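The comparison can be made numerical under two assumptions of mine (not stated on the slide): a message carries log2(C(χ, c)) + c·log2(L) bits, and a connection costs 1 bit in the plain clique network versus ceil(log2(tags + 1)) bits in the tagged one:

```python
from math import comb, log2, ceil

def efficiency(M, chi=16, c=8, L=64, tags=1):
    """Stored information over material used, in bits (illustrative)."""
    B = M * (log2(comb(chi, c)) + c * log2(L))  # information stored
    Q = chi * (chi - 1) // 2 * L * L            # potential connections
    bits_per_conn = ceil(log2(tags + 1))        # 1 bit when tags == 1
    return B / (Q * bits_per_conn)

M = 5000
print(efficiency(M, tags=1))    # plain clique network
print(efficiency(M, tags=63))   # tagged: 6 bits per connection, 6x material
```

Under these assumptions the tagged network's efficiency is divided by the tag bit-width, which is the trade: tags buy diversity at a fixed node count by spending more bits per connection.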