The document summarizes research on modeling multiple sequence processing using an unsupervised neural network approach based on the Hypermap Model. Key points:
- The researcher extends previous models to handle complex sequences with repeating subsequences and multiple sequences occurring together without interference.
- Modifications include incorporating short-term memory to dynamically encode time-varying sequence context and inhibitory links to enable competitive queuing during recall.
- Experimental evaluation shows the network can correctly recall sequences through context and from partial cues, and that it reports a conflict rather than guessing when a cue matches more than one stored sequence.
- Future work aims to model the transition from single-word to two-word child speech and incorporate temporal processing of multimodal inputs like gestures.
1. A Hypermap Model for Multiple Sequence Processing. Abel Nyamapfene, 30 April 2007.
2. Research Motivation: I am investigating complex sequence processing and multiple sequence processing using an unsupervised neural network paradigm based on Kohonen's Hypermap model.
8. Barreto-Araujo Extended Hypermap Model (1)
[Architecture diagram: sensory stimuli together with a tapped-delay-line (z^-1) context vector enter through feedforward weights W, while lateral weights M link the previous activations a(t-1), y(t-1) to the current step a(t), y(t).]
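The hypermap idea underlying this architecture is two-phase matching: the context part of the input first narrows the search to a subset of neurons, and the pattern part then selects the winner within that subset. Below is a minimal numpy sketch of that matching step; the function name, the distance metric, and the ctx_radius threshold are illustrative assumptions, not details taken from the slides.

```python
import numpy as np

def hypermap_winner(x_ctx, x_pat, W_ctx, W_pat, ctx_radius=0.5):
    """Two-phase hypermap matching (after Kohonen's Hypermap).

    Phase 1: keep only neurons whose context weights lie within
    ctx_radius of the current context vector.
    Phase 2: among those, pick the best match on the pattern part.
    """
    ctx_dist = np.linalg.norm(W_ctx - x_ctx, axis=1)  # context distances
    active = np.flatnonzero(ctx_dist <= ctx_radius)   # context-selected subset
    if active.size == 0:                              # no context match: search all
        active = np.arange(W_ctx.shape[0])
    pat_dist = np.linalg.norm(W_pat[active] - x_pat, axis=1)
    return active[np.argmin(pat_dist)]                # index of the winning neuron
```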
10. Barreto-Araujo Extended Hypermap Model (2)
[Architecture diagram: as in Model (1), but the context is split into a fixed context and a time-varying context built from a tapped delay line (z^-1); sensorimotor stimuli enter through feedforward weights W, and lateral weights M link successive activations a(t-1), y(t-1) to a(t), y(t).]
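The z^-1 chain in this figure is a tapped delay line: the time-varying context is simply the last few network outputs concatenated behind the fixed context. A small sketch of that bookkeeping, assuming vector-valued outputs (the class and method names are mine):

```python
from collections import deque
import numpy as np

class DelayLineContext:
    """Tapped delay line (the z^-1 chain): holds the last d outputs."""

    def __init__(self, d, out_dim):
        self.taps = deque([np.zeros(out_dim)] * d, maxlen=d)

    def push(self, y_t):
        # shift the line: y(t) becomes y(t-1) on the next call
        self.taps.appendleft(np.asarray(y_t, dtype=float))

    def vector(self, fixed_ctx):
        # fixed context followed by y(t-1), ..., y(t-d)
        return np.concatenate([np.asarray(fixed_ctx, dtype=float), *self.taps])

# e.g. a 3-tap line over 2-dimensional outputs
ctx = DelayLineContext(d=3, out_dim=2)
ctx.push([1.0, 0.0])
print(ctx.vector(fixed_ctx=[0.5]))  # [0.5 1. 0. 0. 0. 0. 0.]
```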
13. Temporal Hypermap Neuron
[Diagram: the j-th neuron holds a pattern vector and a context vector gated by a threshold unit; a chain of delay units D_j^0, D_j^1, ..., D_j^(d-1) feeds Hebbian links to the following neurons (the (j+1)-th neuron carries D_(j+1)^0, ..., D_(j+1)^(d-2), and so on), while inhibitory links run between neighbouring neurons.]
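The inhibitory links support competitive queuing during recall: the most strongly primed neuron fires first and is then suppressed so that the next one can fire. A toy illustration of that firing discipline (the activation gradient here is invented for the example):

```python
import numpy as np

def competitive_queuing(activations, items):
    """Recall by competitive queuing: fire the most active neuron,
    then inhibit it so the next strongest neuron can fire."""
    a = np.asarray(activations, dtype=float).copy()
    order = []
    for _ in items:
        winner = int(np.argmax(a))
        order.append(items[winner])
        a[winner] = -np.inf  # inhibitory link silences the fired neuron
    return order

# a primacy gradient over three stored items recalls them in sequence
print(competitive_queuing([0.9, 0.6, 0.3], ["c", "a", "t"]))  # ['c', 'a', 't']
```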
18. Experimental Evaluation 1: Evaluation Data

No. Sequence
1   Learning and Memory II
2   Intelligent Control II
3   Pattern Recognition II
4   Hybrid Systems III
5   Probabilistic Neural Networks and Radial Basis Functions
6   Artificially Intelligent Neural Networks II
7   Time Series Prediction
8   Neural Systems Hardware
9   Image Processing
10  Applications of Neural Networks to Power Systems
11  Supervised Learning
19. The network correctly recalls sequences through context and when partial sequences are applied to the network.

Partial Sequence     Recalled Sequence
Hybrid               Hybrid Systems III
Artificially         Artificially Intelligent Neural Networks II
cog                  cognition II
Neural Networks and  Neural Networks and Radial Basis Functions
Intelligent          No choice due to conflict between sequences 2 and 6
Series               Series Prediction
Time                 Time Series Prediction
Proc                 Processing
Pro                  No choice due to conflict between sequences 5 and 10
Radial               Radial Basis Functions
Learning and         Learning and Memory II
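The behaviour in this table can be mimicked by a toy partial-cue lookup over the stored character sequences: a cue recalls the unique continuation containing it, and a cue shared by two sequences produces no choice. This stand-in reproduces the observable input-output behaviour only, not the network itself, and uses the sequence numbering reconstructed in slide 18:

```python
SEQUENCES = {
    1: "Learning and Memory II",
    2: "Intelligent Control II",
    3: "Pattern Recognition II",
    4: "Hybrid Systems III",
    5: "Probabilistic Neural Networks and Radial Basis Functions",
    6: "Artificially Intelligent Neural Networks II",
    7: "Time Series Prediction",
    8: "Neural Systems Hardware",
    9: "Image Processing",
    10: "Applications of Neural Networks to Power Systems",
    11: "Supervised Learning",
}

def recall(cue):
    """Return the continuation of the unique sequence containing the cue."""
    hits = {n: s for n, s in SEQUENCES.items() if cue in s}
    if len(hits) == 1:
        (seq,) = hits.values()
        return seq[seq.index(cue):]  # recalled continuation, cue included
    if not hits:
        return "no match"
    return f"no choice: conflict between sequences {sorted(hits)}"

print(recall("Hybrid"))       # Hybrid Systems III
print(recall("cog"))          # cognition II
print(recall("Intelligent"))  # no choice: conflict between sequences [2, 6]
```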
20. Case Study: Two-Word Child Language. “there cookie” instead of “There is a cookie”; “more juice” instead of “Can I have some more juice”; “baby gone” instead of “The baby has disappeared”.
23. Temporal Hypermap Segment
[Diagram: the (j-1)-th, j-th and (j+1)-th neurons each combine a perceptual entity vector, a word vector and a conceptual relation vector through a threshold logic unit; delay line elements (z^-1) chain the neurons together, and an inhibitory link runs between them.]
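One way to read this segment: each neuron binds a word vector to the perceptual entity and conceptual relation context under which it was stored, and the delay line releases the matched words one time step apart. A toy sketch under that reading, where the class names, the distance threshold, and the matching rule are my assumptions:

```python
import numpy as np

class SegmentNeuron:
    """One neuron of the segment: a word tagged with its two contexts."""

    def __init__(self, word, entity_vec, relation_vec):
        self.word = word
        self.entity = np.asarray(entity_vec, dtype=float)
        self.relation = np.asarray(relation_vec, dtype=float)

    def distance(self, entity_vec, relation_vec):
        # joint mismatch on the perceptual entity and conceptual relation parts
        return (np.linalg.norm(self.entity - entity_vec)
                + np.linalg.norm(self.relation - relation_vec))

def recall_segment(segment, entity_vec, relation_vec, threshold=0.5):
    """Emit the segment's words in delay-line order when the context fits."""
    if all(n.distance(entity_vec, relation_vec) <= threshold for n in segment):
        return [n.word for n in segment]
    return []

# a two-neuron segment encoding the utterance "there cookie"
segment = [SegmentNeuron("there", [1, 0], [0, 1]),
           SegmentNeuron("cookie", [1, 0], [0, 1])]
print(recall_segment(segment, np.array([1, 0]), np.array([0, 1])))  # ['there', 'cookie']
```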
25. Simulating the Transition from One-Word to Two-Word Speech. From saying “cookie”, “more”, “down” to saying “there cookie”, “more juice”, “sit down”.
28. One-Word to Two-Word Model
[Diagram: a static CP network and a temporal hypermap joined by an inhibitory link; the perceptual entity vector and conceptual relationship vector feed the modules, the word vector is the output, and z^-1 delay elements supply the temporal context.]
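The static CP (counterpropagation) half of this model can be sketched as a simple lookup: a Kohonen layer finds the prototype nearest the input, and the winner's Grossberg outstar weights supply the associated word. A minimal version, with made-up prototypes and words for the usage example:

```python
import numpy as np

class StaticCPNetwork:
    """Minimal counterpropagation lookup: Kohonen layer in, outstar out."""

    def __init__(self, prototypes, words):
        self.prototypes = np.asarray(prototypes, dtype=float)  # Kohonen layer
        self.words = words                                     # outstar outputs

    def recall(self, x):
        # winner-take-all on the Kohonen layer, then read out its word
        dists = np.linalg.norm(self.prototypes - np.asarray(x, dtype=float), axis=1)
        return self.words[int(np.argmin(dists))]

# e.g. two prototypes standing for two perceptual-entity/relation inputs
cp = StaticCPNetwork([[1.0, 0.0], [0.0, 1.0]], ["cookie", "juice"])
print(cp.recall([0.9, 0.1]))  # cookie
```

In the full model the temporal hypermap presumably supplies the other word of the pair, with the inhibitory link arbitrating between the two modules; the diagram indicates that wiring without spelling it out.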
30. Future Work. The majority of multimodal models of early child language are static.
[Diagrams: the Plunkett et al. model (1992), with image and label each appearing as both input and output; the Nyamapfene & Ahmad model (2007), with perceptual input and conceptual relation input mapping to a one-word output.]
32. We Will Make These Modifications: (1) include pattern items from other sequences; (2) include feedback from concurrent sequences.
[Diagram: the Temporal Hypermap neuron of slide 13, with its pattern vector, context vector, threshold unit, delay units D_j^0, D_j^1, D_j^2 and Hebbian links, annotated with the two planned modifications.]