CSAIL Event Calendar: Previous Series
A Quantitative Theory of Neural Computation
Speaker: Leslie Valiant, Harvard University
Relevant URL: http://theory.csail.mit.edu/toc-seminars/
A central open question of neuroscience is to identify the data structures and algorithms that are used in neural systems to support successive acts of basic tasks such as memorization and association. We describe a theory of neural computation based on three physical parameters: the number n of neurons, the number d of synaptic connections per neuron, and the inverse synaptic strength k, expressed as the number of presynaptic action potentials needed to cause a postsynaptic action potential. Our fourth parameter r expresses the number of neurons that represent a real-world item. We describe a computational mechanism for realizing hierarchical memorization and other cognitive tasks that implies a relationship among these four parameters.

For the locust olfactory system, estimates for all four parameters are available, and we show that these numbers are in agreement with the theory’s predictions. In the human medial temporal lobe, neurons that represent invariant concepts have been identified, and we offer a quantitative mechanistic explanation of these otherwise paradoxical findings.

More generally, we identify two useful regimes for neural computation: one with r and k large, where each neuron may represent many items, and another in which r is small, k is 1, and every neuron represents at most one item.
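The abstract does not state the relationship among n, d, k, and r explicitly. As an illustrative sketch only (not the speaker's actual model), one can see how such a relationship arises under a simple random-connectivity assumption: suppose each directed synaptic connection exists independently with probability d/n, and a neuron fires when at least k of the r currently firing neurons connect to it. For the representation size to remain stable across successive memorization steps, the expected number of neurons driven to fire should again be about r, i.e. n · P[Bin(r, d/n) ≥ k] ≈ r. The connectivity model and the stability criterion here are assumptions made for illustration.

```python
from math import comb

def tail_prob(r: int, p: float, k: int) -> float:
    """P[Bin(r, p) >= k]: probability that a given neuron receives at
    least k synapses from the r firing neurons, assuming each directed
    connection exists independently with probability p = d / n.
    (Illustrative random-connectivity assumption, not the talk's model.)"""
    return sum(comb(r, j) * p**j * (1 - p) ** (r - j) for j in range(k, r + 1))

def expected_representation(n: int, d: int, k: int, r: int) -> float:
    """Expected number of neurons that fire when an item represented by
    r neurons is active; stability requires this to be roughly r."""
    return n * tail_prob(r, d / n, k)

# Hypothetical illustrative parameters (not measured values):
# with n = 100_000 neurons, d = 1_000 connections each, and r = 500
# neurons per item, one can scan k to find the threshold at which the
# representation size is approximately preserved.
if __name__ == "__main__":
    n, d, r = 100_000, 1_000, 500
    for k in range(8, 16):
        print(k, round(expected_representation(n, d, k, r), 1))
```

Raising k shrinks the expected representation sharply (the binomial tail falls fast), which is one way to see why only certain combinations of the four parameters are mutually consistent, and why the theory's predictions can be checked against measured values as the abstract describes for the locust.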