Additional exercises for Fundamentals of Computational Neuroscience as a warm-up for the final

 

 

  1. Network Construction

 

a) Visualize a feedforward network and a recurrent network with the following weight matrix:

 

 

        ( 2  4  3 )
    w = ( 1  1  0 )
        ( 3  0  2 )
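As a sketch of how such a matrix can be read off, the snippet below lists all connections, assuming the common convention that w[i][j] is the weight of the connection from unit j to unit i (if the textbook uses the transposed convention, swap source and target). Nonzero diagonal entries are self-connections, which already make the network recurrent.

```python
import numpy as np

# Weight matrix from the exercise; assumed convention: w[i][j] is the
# weight of the connection from unit j to unit i.
w = np.array([[2, 4, 3],
              [1, 1, 0],
              [3, 0, 2]])

# Collect all connections as (source, target, weight) triples,
# numbering units from 1 as in the exercise.
edges = [(j + 1, i + 1, w[i, j])
         for i in range(w.shape[0])
         for j in range(w.shape[1])
         if w[i, j] != 0]

for src, dst, weight in edges:
    print(f"node {src} -> node {dst}  (weight {weight})")

# Self-connections on the diagonal indicate recurrence
print("recurrent:", any(src == dst for src, dst, _ in edges))
```

A drawing of the two networks then amounts to placing the nodes and adding one arrow per listed edge.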

 

 

 

 

  2. Function approximation with multilayer perceptrons

 

Can a multilayer feedforward mapping network approximate an exponential function?
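As a numerical illustration (not a proof), a single hidden layer of tanh units with a linear output can be trained by gradient descent to fit exp(x) on an interval; the interval, layer size, and learning rate below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: approximate exp(x) on [-1, 1]
x = np.linspace(-1, 1, 64).reshape(-1, 1)
y = np.exp(x)

# One hidden layer of tanh units, linear output unit
n_hidden = 16
W1 = rng.normal(0, 1.0, (1, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 1.0, (n_hidden, 1))
b2 = np.zeros(1)

lr = 0.05
for epoch in range(5000):
    h = np.tanh(x @ W1 + b1)          # hidden activations
    out = h @ W2 + b2                 # network output
    err = out - y
    # gradients of the mean squared error
    dW2 = h.T @ err / len(x)
    db2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)    # backpropagated hidden error
    dW1 = x.T @ dh / len(x)
    db1 = dh.mean(0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

mse = float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2))
print("final MSE:", mse)
```

A small final error on the training interval is consistent with the universal approximation property of such networks, which is the theoretical answer to the question.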

 

  3. Boolean function

 

A certain Boolean function is given by the following table:

 

x1   x2   x3  |  y1   y2
--------------+---------
 1    1    0  |   1    1
 1    0    0  |   0    1
 0    1    0  |   1    0
 0    0    1  |   1    1

where x1, x2, and x3 represent the input and y1 and y2 the outputs.

 

a)      Is this function linearly separable?

b)      Draw a network architecture that can, in principle, implement this function.

c)      Specify the weights and thresholds that represent the above function.
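For part (c), one hand-chosen set of weights and thresholds (an illustrative solution, not the unique one, assuming threshold units with output y = 1 iff w·x > θ) can be checked directly against the four rows of the table:

```python
import numpy as np

# Table rows: inputs (x1, x2, x3) and target outputs (y1, y2)
inputs  = [(1, 1, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
targets = [(1, 1), (0, 1), (1, 0), (1, 1)]

# Hand-chosen weights and thresholds for two independent threshold
# units, y = 1 iff w . x > theta (one possible solution)
w1, th1 = np.array([-1.0, 1.0, 1.0]), -0.5   # unit for y1
w2, th2 = np.array([1.0, -0.5, 1.0]), 0.0    # unit for y2

results = []
for x in inputs:
    x = np.array(x)
    results.append((int(x @ w1 > th1), int(x @ w2 > th2)))

print(results)
```

Since a single layer of threshold units reproduces the table, each output is linearly separable on the four specified input patterns, which answers part (a) as well.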

 

 

  4. Supervised learning

 

Specify a supervised learning rule for feedforward mapping networks that does not use error backpropagation.
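One such rule is the perceptron (delta) learning rule for a single-layer network, w ← w + η(t − y)x, which uses only the directly observable output error and no backpropagated gradients. A minimal sketch, using the AND function as an assumed example task:

```python
import numpy as np

# Perceptron learning rule: adjust weights in proportion to the
# output error, w <- w + eta * (target - output) * x.
# Example task (an assumption for illustration): the AND function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
t = np.array([0, 0, 0, 1])

w = np.zeros(2)
theta = 0.0   # firing threshold, learned like a bias
eta = 0.5

for _ in range(20):                      # a few sweeps suffice here
    for x_i, t_i in zip(X, t):
        out = 1 if x_i @ w > theta else 0
        w += eta * (t_i - out) * x_i
        theta -= eta * (t_i - out)

pred = [1 if x_i @ w > theta else 0 for x_i in X]
print(pred)   # matches the targets [0, 0, 0, 1]
```

This rule is guaranteed to converge for linearly separable problems, but, unlike backpropagation, it cannot train hidden layers.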

 

  5. Hebbian Learning

 

Given are the patterns x1 = (1, 1, -2)' and x2 = (-1.5, 2, -0.5)', which we want to imprint into an attractor network with Hebbian learning. Give the weight matrix after learning.
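A quick numerical sketch of the Hebbian outer-product rule, W = Σ_μ x^μ (x^μ)'. The snippet below zeroes the diagonal (no self-connections), following the common Hopfield convention; whether to keep the diagonal, or to normalize by the number of units, depends on the convention used in the course.

```python
import numpy as np

# Patterns to imprint (column vectors in the exercise)
x1 = np.array([1.0, 1.0, -2.0])
x2 = np.array([-1.5, 2.0, -0.5])

# Hebbian outer-product rule: sum of x x' over the patterns
W = np.outer(x1, x1) + np.outer(x2, x2)

# Hopfield convention (assumed): no self-connections
np.fill_diagonal(W, 0.0)

print(W)
```

The resulting matrix is symmetric, as expected for an attractor network trained with this rule.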