Dertouzos Lecturer Series: A Computational Theory of Cortex and Hippocampus
Speaker: Leslie Valiant, Harvard University
Date: May 3, 2012
Time: 4:00PM to 5:30PM
Host: Anant Agarwal, CSAIL
Contact: Colleen Russell, 3-0145, firstname.lastname@example.org
The brain performs many kinds of computation for which it is challenging to hypothesize any mechanism that does not contradict the evidence. In particular, over a lifetime the brain performs a large number of individual cognitive actions, most having some dependence on past experience as well as long-term effects. It is difficult to reconcile such large-scale capabilities, even in principle, with the known resource constraints on cortex, such as low connectivity and low average synaptic strength.
Here we shall describe model neural circuits and associated algorithms that respect the brain's most basic resource constraints and support the execution of large numbers of cognitive actions. These circuits simultaneously support a suite of four basic kinds of task, each of which requires some circuit modification: memory allocation, association, supervised memorization, and inductive learning of threshold functions. The capacity of these circuits is established via computer experiments in which sequences of thousands of such actions are simulated, and the circuits so created are then tested for the subsequent efficacy of these actions.
Hierarchical memory allocation to arbitrary depth has the added requirement that a stable number of neurons be assigned to memories at every level. We give a mechanism for this that can be realized in a shallow feedforward network. We suggest that in the brain it is the hippocampus that performs this stable memory allocation.
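To give a flavor of what memory allocation on a sparse random network can look like, here is a toy sketch, not the talk's actual model: a new memory for a pair of items is taken to be the set of neurons that happen to receive enough random connections from both parent items. All names and parameter values below are invented for illustration; the interesting question, as the abstract notes, is whether the size of the recruited set stays stable as allocations are composed.

```python
import random

random.seed(1)

# All parameters are illustrative, not taken from the talk.
N = 10_000   # total neurons in the toy network
R = 50       # nominal replication factor: neurons per stored item
P = 0.009    # random connection probability
K = 2        # connections required from EACH parent item to recruit a neuron

def allocate(size_a, size_b):
    """Toy allocation on a sparse random graph: a neuron is recruited
    for the new memory if it happens to receive at least K random
    connections from each of the two parent items' neuron sets."""
    recruited = 0
    for _ in range(N):
        from_a = sum(random.random() < P for _ in range(size_a))
        if from_a < K:
            continue
        from_b = sum(random.random() < P for _ in range(size_b))
        if from_b >= K:
            recruited += 1
    return recruited

# Allocate several memories from fresh item pairs of the nominal size.
# The recruited set sizes concentrate near R, but only approximately;
# when allocations are chained to greater depth, such fluctuations can
# compound, which is the instability a dedicated allocation mechanism
# would need to control.
sizes = [allocate(R, R) for _ in range(5)]
print(sizes)
```

With these (invented) parameters the expected recruited-set size works out to roughly R, but individual allocations scatter around it, illustrating why stability at every level of a hierarchy is a nontrivial added requirement.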
Leslie Valiant was educated at King's College, Cambridge; Imperial College, London; and at Warwick University where he received his Ph.D. in computer science in 1974. He is currently T. Jefferson Coolidge Professor of Computer Science and Applied Mathematics in the School of Engineering and Applied Sciences at Harvard University, where he has taught since 1982. Before coming to Harvard he had taught at Carnegie Mellon University, Leeds University, and the University of Edinburgh.
His work has ranged over several areas of theoretical computer science, particularly complexity theory, computational learning, and parallel computation. He also has interests in computational neuroscience, evolution, and artificial intelligence.
He received the Nevanlinna Prize at the International Congress of Mathematicians in 1986, the Knuth Prize in 1997, the European Association for Theoretical Computer Science (EATCS) Award in 2008, and the 2010 A. M. Turing Award. He is a Fellow of the Royal Society (London) and a member of the National Academy of Sciences (USA).