Neurocomputing: Fundamentals of Computational Neuroscience (Fall 2003)

Assignment 2, due October 16 in class (20 points)

1. Write a program that implements a mapping network (perceptron) with a single layer (not counting the input layer). Train the network on the pattern (letter) recognition task of tutorial 1. Each letter should be represented by a single active node at the output layer, one node for each of the 26 letters (local coding). Train the network using error-correction learning with an appropriate delta rule (e.g. the Widrow-Hoff rule); a minimal implementation sketch is given after part b.

a. Plot the learning curve, i.e. the performance of the network (error) versus the number of training steps.

b. Evaluate the robustness of the network in recognizing noisy versions of the letter patterns. Plot a curve that shows the average recognition rate versus the noise level. The plot should include error bars!
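
To make the task concrete, here is a minimal sketch in Python/NumPy of how parts 1a and 1b could be set up. The letter patterns of tutorial 1 are not reproduced here, so the sketch assumes each letter is a binary pixel vector (e.g. a flattened 5x7 bitmap, 35 inputs) and uses random placeholder patterns; the learning rate, number of training steps, and the pixel-flip noise model are likewise free choices, not prescriptions.

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)

    # Assumed representation: each letter is a flattened binary bitmap
    # (here 5x7 = 35 pixels); replace the random placeholders below with
    # the actual letter patterns from tutorial 1.
    n_in, n_out = 35, 26
    letters = rng.integers(0, 2, size=(n_out, n_in)).astype(float)
    targets = np.eye(n_out)                 # local coding: one active node per letter

    W = rng.normal(0.0, 0.1, size=(n_out, n_in))   # weights
    b = np.zeros(n_out)                            # biases
    eta = 0.1                                      # learning rate (assumed)

    errors = []                                    # learning curve (part a)
    for step in range(500):
        y = letters @ W.T + b                      # linear outputs (Widrow-Hoff)
        delta = targets - y                        # error signal
        W += eta * delta.T @ letters / len(letters)  # delta-rule weight update
        b += eta * delta.mean(axis=0)
        errors.append(np.mean(delta ** 2))         # mean squared error

    def recognition_rate(noise_level, trials=20):
        """Average fraction of correctly recognized letters when a fraction
        'noise_level' of the pixels is flipped (assumed noise model)."""
        rates = []
        for _ in range(trials):
            noisy = letters.copy()
            flip = rng.random(noisy.shape) < noise_level
            noisy[flip] = 1.0 - noisy[flip]
            pred = np.argmax(noisy @ W.T + b, axis=1)
            rates.append(np.mean(pred == np.arange(n_out)))
        return np.mean(rates), np.std(rates)

    # Part a: learning curve
    plt.figure()
    plt.plot(errors)
    plt.xlabel('training step')
    plt.ylabel('mean squared error')

    # Part b: recognition rate versus noise level, with error bars
    levels = np.arange(0.0, 0.55, 0.05)
    means, stds = zip(*(recognition_rate(p) for p in levels))
    plt.figure()
    plt.errorbar(levels, means, yerr=stds)
    plt.xlabel('noise level (fraction of flipped pixels)')
    plt.ylabel('average recognition rate')
    plt.show()

The recognition rate is averaged over several noisy trials per noise level, and the standard deviation across those trials supplies the error bars requested in part b.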

2. Repeat the experiments with a multilayer perceptron trained with the generalized delta rule (error backpropagation); see the sketch below.
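
A minimal sketch of the generalized delta rule for a network with one hidden layer of sigmoid units follows; the hidden-layer size (30 units here), learning rate, number of training steps, and placeholder patterns are again assumptions that you should replace with your own choices.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Assumed sizes: 35 input pixels, 30 hidden units (a free choice), 26 outputs.
    n_in, n_hid, n_out = 35, 30, 26
    letters = rng.integers(0, 2, size=(n_out, n_in)).astype(float)  # placeholder patterns
    targets = np.eye(n_out)                                         # local coding

    W1 = rng.normal(0.0, 0.1, size=(n_hid, n_in))
    b1 = np.zeros(n_hid)
    W2 = rng.normal(0.0, 0.1, size=(n_out, n_hid))
    b2 = np.zeros(n_out)
    eta = 0.5                                                       # learning rate (assumed)

    for step in range(5000):
        # forward pass
        h = sigmoid(letters @ W1.T + b1)
        y = sigmoid(h @ W2.T + b2)
        # backward pass: generalized delta rule for sigmoid units
        d_out = (targets - y) * y * (1.0 - y)       # output-layer deltas
        d_hid = (d_out @ W2) * h * (1.0 - h)        # hidden-layer deltas (backpropagated)
        W2 += eta * d_out.T @ h / len(letters)
        b2 += eta * d_out.mean(axis=0)
        W1 += eta * d_hid.T @ letters / len(letters)
        b1 += eta * d_hid.mean(axis=0)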

3. Compare the results to those obtained with the methods you used in tutorial 1.