4G3 Computational Neuroscience

Note that for the 2016-17 academic year, this module is running in DAMTP and so the information below may not be accurate.
timing: Lent term, 16 lectures [see also lecture schedule]
assessment: 100% coursework
recommended: 3G2 or 3G3

The course demonstrates how mathematical analysis and ideas from engineering-related disciplines (dynamical systems, signal processing, machine learning, optimal control, and probabilistic inference) can be applied to gain insight into the workings of the nervous system. The course highlights a number of real-world computational problems that need to be tackled by any 'intelligent' system, as well as the solutions that biology offers to some of these problems. The treatment is fairly mathematical and the coursework involves writing and running programs to gain hands-on experience with the subject.


lecture materials marked with a # are from previous years, and are continuously updated with material from this year's course

Date   | Day | Lecturer  | Topic (tentative)          | Reading*                  | Lecture materials
Jan 15 | Fri | Lengyel   | introduction, neural coding |                          | introduction slides
Jan 20 | Wed | Barrett   | networks                   | Ch 7                      | networks slides, part 1
Jan 22 | Fri | Barrett   | networks                   | Ch 7                      | networks slides, part 2
Jan 27 | Wed | Hennequin | E-I balance                | Ch 7                      | excitation/inhibition balance slides
Jan 29 | Fri | Hennequin | E-I balance                | Ch 7                      |
Feb 3  | Wed | Lengyel   | neural encoding            | Ch 1                      | neural coding slides; neural encoding slides
Feb 5  | Fri | Lengyel   | neural encoding            | Ch 2                      |
Feb 10 | Wed | Lengyel   | neural decoding            | Ch 3                      | neural decoding slides
Feb 12 | Fri | Lengyel   | neural decoding            | Ch 3                      |
Feb 17 | Wed | Lengyel   | autoassociative memory     | Ch 7-8                    | associative memories slides
Feb 19 | Fri | Lengyel   | autoassociative memory     | Ch 7-8                    |
Feb 24 | Wed | Turner    | representational learning  | Ch 10                     | visual cortex and natural image statistics slides
Feb 26 | Fri | Turner    | representational learning  | Ch 10                     |
Mar 2  | Wed | Turner    | representational learning  | Ch 10                     |
Mar 4  | Fri | Hennequin | plasticity                 | Ch 8; Kempter et al, 1999 | plasticity slides
Mar 9  | Wed | Hennequin | plasticity                 | Ch 8                      |

* chapter numbers are from Dayan & Abbott

Wednesdays 12.00-13.00, CUED LR6
Fridays 11.00-12.00, CUED LR6
for directions to find the Department, see here
for directions within the Department, see the floor plan


Coursework will mostly involve programming exercises putting into practice the theoretical ideas taught in lectures. Some of the exercises may require dealing with preprocessed data sets or code, which will be provided in MATLAB. Nevertheless, students are free to choose their favourite programming language to solve the exercises. A useful source of examples of the kind of exercises to expect, and of good MATLAB coding practice, is the set of exercises from the Dayan & Abbott book, available online. Simulation results (figures plus brief verbal descriptions) will need to be submitted together with a summary of the main methodological steps involved in obtaining them.

Also, do not forget to:

  1. Get a coversheet for each piece of coursework that is set on your module. These can be downloaded here.
  2. Complete the section 'For completion by Part IIB Engineering students' -- you will need to know your coursework candidate number (CCN), which you can find on COMET. The coversheet needs to be attached to the coursework and submitted to Dave Gautrey, Group G Administrator (see also below).


1st assignment
network dynamics, neural coding
download the assignment
assigned: Feb 10 | hand in: Feb 25 | feedback: Mar 7

2nd assignment
memory, representational learning
download the assignment, the Hopfield 1982 paper, and the data file
assigned: Mar 9 (Lent ends: Mar 11) | hand in: Apr 22 (Easter starts: Apr 19) | feedback: May 11

Handing in
Dave Gautrey, Group G Administrator, EIETL post box, 1st floor, CUED Inglis Building (see external link: floor plan for location), before 4pm
penalty for lateness is 20% of the marks available per week, or part thereof, that the work is late (non-negotiable)

More information about coursework may be posted here as the course proceeds.


This list is meant as a 'menu' of major topics in computational neuroscience. In any given year, for obvious practical reasons, lectures will cover only a select subset of these topics (see the lecture schedule for details).

Principles of computational neuroscience (Dr. M. Lengyel)

  • how is neural activity generated? mechanistic neuron models
    integrate and fire models, spike response models, phase models, phase response curves, firing rate models
  • how to predict neural activity? descriptive neuron models
    neural coding, estimating firing rates, homogeneous and inhomogeneous Poisson processes, tuning curves, variability, spike triggered average and covariance, LNP models and their extensions, population coding, maximum entropy models
  • what should neurons do? normative neuron models
    information theory, entropy, (Shannon) mutual information, infomax for one cell, infomax for population
  • how to read neural activity? neural decoding
    single neuron decoding, signal detection theory, ROC curves, population decoding, dot product, maximum likelihood decoding, Cramer-Rao bound, Fisher information, spike train decoding, probabilistic population codes
  • how to tell a neural network what to do? supervised learning
    learning and memory, generalisation, taxonomy of learning tasks (supervised, unsupervised, semi-supervised, reinforcement), classification and regression, perceptron, learning rules as gradient ascent, multi-layer perceptron, error backpropagation, tempotron
  • how do neural networks remember? auto-associative memory
    attractor networks, binary Hopfield network, extension to graded neurons, probabilistic interpretation, spike timing-based memories
  • how can our brains achieve the goal of life? reinforcement learning
    Bellman equations, temporal difference learning, dopamine signals, neuroeconomics
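As a flavour of the mechanistic neuron models listed above, here is a minimal sketch of a leaky integrate-and-fire neuron (in Python rather than the MATLAB used for provided coursework code; all parameter values and the `simulate_lif` function are illustrative, not taken from the course):

```python
import numpy as np

def simulate_lif(I, dt=1e-4, tau=20e-3, R=1e7,
                 v_rest=-70e-3, v_thresh=-54e-3, v_reset=-80e-3):
    """Forward-Euler integration of tau*dV/dt = -(V - v_rest) + R*I(t),
    with a spike emitted and V reset whenever V crosses threshold."""
    v = v_rest
    vs, spike_times = [], []
    for step, i_ext in enumerate(I):
        v += (dt / tau) * (-(v - v_rest) + R * i_ext)
        if v >= v_thresh:
            spike_times.append(step * dt)
            v = v_reset
        vs.append(v)
    return np.array(vs), spike_times

# a constant suprathreshold current (2 nA for 1 s) gives regular firing
vs, spike_times = simulate_lif(np.full(10000, 2e-9))
```

With these parameters the steady-state voltage (v_rest + R*I = -50 mV) sits above threshold, so the neuron fires periodically; the inter-spike interval follows from the membrane time constant tau.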

Neural network dynamics (Dr. D. Barrett, Dr. G. Hennequin)

  • what happens when many neurons are connected? neural networks
    feedforward networks, coordinate transformations, recurrent networks, oscillations and synchrony, excitatory-inhibitory networks, selective amplification, input integration, nonlinear amplification, winner-takes-all dynamics, gain control, and sustained activity
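The selective-amplification idea listed above can be previewed with a linear recurrent rate network: a connectivity eigenvalue close to 1 amplifies input along the corresponding eigenvector. This Python sketch uses made-up parameters (the matrix W, input h, and time constant are purely illustrative):

```python
import numpy as np

def simulate_rate_network(W, h, T=2.0, dt=1e-3, tau=20e-3):
    """Forward-Euler integration of the linear rate dynamics
    tau*dr/dt = -r + W @ r + h, starting from r = 0."""
    r = np.zeros(len(h))
    for _ in range(int(T / dt)):
        r += (dt / tau) * (-r + W @ r + h)
    return r

# a symmetric W with one eigenvalue near 1 selectively amplifies
# the input component lying along the corresponding eigenvector
u = np.array([1.0, 1.0]) / np.sqrt(2)   # amplified direction
W = 0.9 * np.outer(u, u)                # eigenvalues: 0.9 (along u) and 0
h = np.array([1.0, 0.0])
r_ss = simulate_rate_network(W, h)
# at steady state, the component of h along u is scaled by 1/(1-0.9) = 10,
# while the orthogonal component passes through unchanged
```

Here the steady state is (I - W)^(-1) h = (5.5, 4.5): the shared-mode part of the input is amplified tenfold, the difference-mode part not at all.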

Synaptic plasticity and unsupervised learning (Dr. G. Hennequin)

  • how do neurons reconfigure their connections? plasticity
    Hebbian plasticity, stability, synaptic normalisation, Bienenstock-Cooper-Munro rule, spike timing-dependent plasticity
  • Hebbian plasticity
  • spike timing-dependent plasticity
  • learning receptive fields
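The interplay of Hebbian growth and synaptic normalisation mentioned above can be illustrated with Oja's rule, a classic stabilised Hebbian learning rule; in this Python sketch the input statistics and learning rate are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# correlated zero-mean 2-d inputs; the leading principal component
# of the covariance C lies along (1, 1)/sqrt(2)
C = np.array([[1.0, 0.8], [0.8, 1.0]])
X = rng.multivariate_normal([0.0, 0.0], C, size=10000)

w = 0.1 * rng.normal(size=2)    # small random initial weights
eta = 0.005                     # learning rate
for x in X:
    y = w @ x                   # linear neuron output
    w += eta * y * (x - y * w)  # Oja's rule: Hebbian term plus decay
# the decay term -eta*y**2*w keeps the weight norm bounded, so w
# converges to a unit vector along the leading eigenvector of C
```

The pure Hebbian term eta*y*x alone would make the weights grow without bound; the subtractive decay implements the normalisation that the course discusses more generally.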

Representational learning (Dr. R. Turner)

  • how can neuronal networks learn without being told what to do? unsupervised learning
    connectionism, statistical foundations, Bayes' rule, density estimation, representations of uncertainty, Boltzmann machine, Helmholtz machine, deep belief networks, mutual information, sparse coding
  • Bayesian inference and learning
  • generative models and receptive fields
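As a small taste of the 'representations of uncertainty' theme above: for two independent Gaussian cues about the same stimulus, Bayes' rule (under a flat prior) reduces to a precision-weighted average. A sketch in Python, with arbitrary numbers and a hypothetical `combine_cues` helper:

```python
def combine_cues(x1, sigma1, x2, sigma2):
    """Posterior over s given two independent Gaussian cues
    x1 ~ N(s, sigma1^2) and x2 ~ N(s, sigma2^2), under a flat prior:
    a Gaussian with precision-weighted mean and summed precisions."""
    w1, w2 = 1.0 / sigma1**2, 1.0 / sigma2**2
    mu = (w1 * x1 + w2 * x2) / (w1 + w2)
    sigma = (w1 + w2) ** -0.5
    return mu, sigma

# the more reliable cue (sigma = 1) pulls the estimate towards itself
mu, sigma = combine_cues(2.0, 1.0, 4.0, 2.0)  # mu = 2.4
```

The posterior mean (2.4) lies four times closer to the reliable cue than to the unreliable one, and the posterior standard deviation (about 0.89) is smaller than either cue's alone.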


Main text book
Dayan & Abbott. Theoretical Neuroscience. MIT Press, 2005.
useful exercises with sample data and MATLAB code: here

Additional reading
Rieke et al. Spikes: Exploring the Neural Code. MIT Press, 1999.
Gerstner & Kistler. Spiking Neuron Models: Single Neurons, Populations, Plasticity. Cambridge University Press, 2002.
Rao et al. Probabilistic Models of the Brain: Perception and Neural Function. MIT Press, 2002.

Further reading of potential interest
Izhikevich. Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting. MIT Press, 2006.
MacKay. Information Theory, Inference & Learning Algorithms. Cambridge University Press, 2003.
Sutton & Barto. Reinforcement Learning: An Introduction. MIT Press, 1998.
O'Reilly et al. Computational Explorations in Cognitive Neuroscience: Understanding the Mind by Simulating the Brain. MIT Press, 2000.