Workshop on

Continuous Attractor Neural Networks

at CNS 2006 in Edinburgh



Si Wu (University of Sussex, UK) and Thomas Trappenberg (Dalhousie University, Canada)


Continuous attractor neural networks (CANNs) are a special class of recurrent networks that have been studied in many areas of neuroscience, including models of hypercolumns, working memory, population coding, and attention. These neural field models of the Wilson-Cowan type, also known as 'bump' models, are a fundamental class of neural networks with many applications in neuroscientific modelling and engineering. The goal of this workshop is to bring together researchers from diverse areas to consolidate research on CANNs, including their theoretical underpinnings and practical applications.


July 20, Morning Session:

9:00-9:10: Welcome Reception
9:10-9:30: Thomas Trappenberg (Halifax, Canada) Background Review
9:30-9:50: Gregor Schoner (Bochum, Germany) Motor preparation and working memory
9:50-10:10: Zhaoping Li (London, UK) Oscillation attractor
10:10-10:30: Thomas Wennekers (Plymouth, UK) Spatio-temporal neural field approximation

10:30-10:50: Tea Break

10:50-11:10: Eric Sauser (Lausanne, Switzerland) Movement generation
11:10-11:30: Kukjin Kang (RIKEN, Japan) Orientation selectivity
11:30-11:50: Nicolas Rougier (Villers les Nancy, France) Attention

July 20, Afternoon Session:

1:30-1:50: Peter Latham (Gatsby Computational Neuroscience Unit, UK) Population coding
1:50-2:10: Kosuke Hamaguchi (RIKEN, Japan) Correlation structure in CANN
2:10-2:30: Hiroshi Okamoto (RIKEN, Japan) Information retrieval
2:30-2:50: Hecke Schrobsdorff (Goettingen, Germany) Symmetry breaking in a two-dimensional CANN

2:50-3:00: Tea Break

3:00-4:00: Discussions (Si Wu)