PSY-221F: Computational Neuroscience

PSY-221F is the new course number for PSY-265, formerly taught by Greg Ashby.

Course Description

This is a lecture course that surveys computational neuroscience, a branch of neuroscience that uses mathematical models, theoretical analysis, and abstractions of the brain to understand the principles governing the development, structure, physiology, and cognitive abilities of the nervous system. We will cover both classical methods (e.g., GLMs, the LIF model, the Hodgkin-Huxley model) and state-of-the-art methods (e.g., deep learning).

By the end of this course, you should be able to:

  • describe how the brain “computes”,
  • describe different methods that computational neuroscientists use to model neural coding,
  • computationally model the biophysics of single neurons and the dynamics of neural networks,
  • fit a computational model to experimental data.

You will gain both conceptual and practical experience through homework assignments that involve solving problems and implementing computational models. This is not primarily a programming course: the main goal is to learn the concepts, not a particular programming language or set of programming techniques. That said, implementing the concepts in code is the best way to demonstrate (and reinforce) your understanding of them. Lab sections will feature Python and math tutorials, hands-on examples, and guided programming sessions.
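As a taste of the kind of hands-on example covered in lab, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron simulated with Euler integration in Python (NumPy only). The parameter values and the constant injected current are illustrative assumptions, not values taken from the course materials.

```python
import numpy as np

# Illustrative LIF parameters (assumed for this sketch, not course values)
tau_m   = 20e-3    # membrane time constant (s)
E_L     = -70e-3   # leak / resting potential (V)
V_th    = -54e-3   # spike threshold (V)
V_reset = -80e-3   # reset potential after a spike (V)
R_m     = 10e6     # membrane resistance (ohm)
I_e     = 1.8e-9   # constant injected current (A)

dt    = 0.1e-3     # Euler time step (s)
T     = 0.5        # total simulated time (s)
steps = int(T / dt)

V = np.full(steps, E_L)    # membrane potential trace
spike_times = []

for t in range(1, steps):
    # Euler step of  tau_m * dV/dt = E_L - V + R_m * I_e
    V[t] = V[t - 1] + dt * (E_L - V[t - 1] + R_m * I_e) / tau_m
    if V[t] >= V_th:       # threshold crossing: record a spike and reset
        spike_times.append(t * dt)
        V[t] = V_reset

print(f"{len(spike_times)} spikes in {T} s (~{len(spike_times) / T:.0f} Hz)")
```

With these assumed numbers the steady-state voltage sits just above threshold, so the neuron fires regularly; lowering I_e below about 1.6 nA keeps the voltage below threshold and silences it.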

Prerequisites

The formal prerequisite is PSY-221B, but the only part of that course that is necessary is the introduction to matrix algebra.

The actual necessary background includes:

  • calculus,
  • some prior exposure to matrix algebra,
  • some prior exposure to Python.

Desirable, but not strictly necessary:

  • prior exposure to differential equations,
  • basic knowledge of neuroscience.

Content

Textbook: Dayan & Abbott, Theoretical Neuroscience (2001)

Topics to be covered:

  • Intro to CompNeuro: concepts, properties of neurons, cell types
  • Neural encoding: spike trains and firing rates, early visual system
  • Neuroelectronics: electrical properties of neurons, Nernst equation
  • Point neuron models: LIF, Izhikevich neurons, Hodgkin-Huxley neurons
  • Morphological neuron models: synaptic conductances, cable equation, multi-compartment models
  • Network models: firing rate models, feedforward/recurrent models, stochastic networks
  • Plasticity & learning: short & long-term plasticity, reinforcement learning
  • Machine and deep learning: model fitting, GLM, CNN, RNN (see the fitting sketch after this list)
  • Applications: sensory systems, language, decision-making, …
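
As a preview of the model-fitting topic above, the sketch below fits a simple cosine tuning-curve model, r(theta) = a + b*cos(theta) + c*sin(theta), to synthetic spike counts by ordinary least squares. Everything here (the model form, the simulated data, the parameter values) is an illustrative assumption, not material taken from the course.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "experiment": spike counts of one neuron at 16 stimulus directions.
# The true tuning curve is made up for illustration.
theta     = np.linspace(0, 2 * np.pi, 16, endpoint=False)
true_rate = 20 + 15 * np.cos(theta - np.pi / 3)   # Hz, preferred direction = 60 deg
counts    = rng.poisson(true_rate)                # counts in 1-s windows, so ~Hz

# The cosine model r = a + b*cos(theta) + c*sin(theta) is linear in (a, b, c),
# so it can be fit with ordinary least squares.
X = np.column_stack([np.ones_like(theta), np.cos(theta), np.sin(theta)])
a, b, c = np.linalg.lstsq(X, counts, rcond=None)[0]

theta_pref = np.arctan2(c, b)     # recovered preferred direction
amplitude  = np.hypot(b, c)       # recovered modulation depth

print(f"baseline ~ {a:.1f} Hz, amplitude ~ {amplitude:.1f} Hz, "
      f"preferred direction ~ {np.degrees(theta_pref):.0f} deg")
```

A Poisson GLM fit of the same data would maximize the likelihood instead of squared error, but the basic workflow (build a design matrix, solve for the weights) is the same.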

Your grade will be determined by biweekly quizzes, homework assignments (the lowest homework score will be dropped), and a take-home final exam.

More information at: https://gauchospace.ucsb.edu/courses/course/view.php?id=10610.