Neurocomputing: Fundamentals of Computational Neuroscience (Fall 2003)
Assignment 3 due November 20 in class (14 points)
1. Network Construction (4)
a) Visualize a recurrent network with the following weight matrix
        ( 0 1 2 )
  w =   ( 3 0 3 )
        ( 2 1 4 )
b) Can this weight matrix result from Hebbian learning? Explain!
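A quick way to approach part b) numerically: Hebbian outer-product learning with a symmetric rule always produces a symmetric weight matrix (w_ij = w_ji), so a symmetry check settles the question. A minimal sketch; the matrix layout is garbled in this copy of the handout, so the row/column ordering below is an assumption (symmetry is invariant under transposition, so the conclusion does not depend on it):

```python
import numpy as np

# Weight matrix from the handout (row/column order assumed from the extraction)
w = np.array([[0, 1, 2],
              [3, 0, 3],
              [2, 1, 4]])

# Hebbian outer-product learning, w_ij ~ sum_mu x_i^mu * x_j^mu,
# always yields a symmetric matrix with w_ij = w_ji.
print("symmetric:", np.array_equal(w, w.T))
```

Asymmetric pairs such as w[0,1] != w[1,0] are enough to rule out a plain Hebbian origin; the nonzero diagonal entry is also worth commenting on in the written answer.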
2. Hebbian Learning (6)
Given are the patterns x1 = (1, 1, 1, 0)' and x2 = (0, 1, 1, 1)' that we want to imprint into an attractor network with a Hebbian learning rule.
The network uses binary units with the threshold-linear activation (or gain) function
         { 0 if x ≤ 0
  g(x) = {
         { 1 otherwise
a) Give the weight matrix after learning.
b) Show that the training patterns are memory states under the network dynamics.
c) Which memory state is retrieved with the initial pattern (1, 1, 1, 1)’?
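Parts a)–c) can be cross-checked numerically by building the weight matrix and iterating the update x ← g(wx). The sketch below assumes the outer-product Hebbian rule on the ±1 versions of the patterns, w_ij = Σ_μ (2x_i^μ − 1)(2x_j^μ − 1), with self-connections kept; the rule intended in class may be a variant (e.g. with w_ii = 0), which changes the fixed points:

```python
import numpy as np

def g(a):
    """Gain function from the handout: 0 if a <= 0, 1 otherwise."""
    return (a > 0).astype(int)

x1 = np.array([1, 1, 1, 0])
x2 = np.array([0, 1, 1, 1])

# Assumed Hebbian outer-product rule on the +/-1 versions of the patterns
s1, s2 = 2 * x1 - 1, 2 * x2 - 1
w = np.outer(s1, s1) + np.outer(s2, s2)

# b) the training patterns should be fixed points of x <- g(w x)
print(g(w @ x1))   # -> [1 1 1 0]
print(g(w @ x2))   # -> [0 1 1 1]

# c) iterate from (1,1,1,1)' until the state stops changing
x = np.array([1, 1, 1, 1])
for _ in range(10):
    x_new = g(w @ x)
    if np.array_equal(x_new, x):
        break
    x = x_new
print(x)
```

Note that under this particular (assumed) rule the synchronous update from (1, 1, 1, 1)' settles into the overlap of the two patterns rather than one of them; a different variant of the rule or asynchronous updating can give a different answer, so treat the script only as a check of your hand calculation.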
3. Reciprocal pattern (4)
Given is an auto-associative recurrent attractor network that is trained on one binary pattern with the Hebbian auto-association rule. Show that the inverse pattern is a stationary state (attractor) of the network.
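The written proof can be sanity-checked numerically. Again assuming the outer-product rule w = (2x − 1)(2x − 1)' for a single binary pattern x, and the 0/1 gain function g from problem 2, flipping every bit of x gives a state that maps to itself under x ← g(wx):

```python
import numpy as np

def g(a):
    # 0 if a <= 0, 1 otherwise
    return (a > 0).astype(int)

x = np.array([1, 0, 1, 1, 0])        # an arbitrary binary training pattern (example choice)
w = np.outer(2 * x - 1, 2 * x - 1)   # assumed Hebbian auto-association rule

x_inv = 1 - x                        # inverse (bit-flipped) pattern

# Why it works: w @ x_inv = (2x-1) * ((2x-1).x_inv) = -k * (2x-1), where k is
# the number of zeros in x, so g maps the result back to exactly x_inv.
print(np.array_equal(g(w @ x_inv), x_inv))   # -> True
```

The comment in the code is essentially the argument the problem asks for: the recurrent input to the inverse pattern is a negative multiple of (2x − 1), and thresholding that at zero reproduces the inverse pattern.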