Communicating gradients for learning using activity dynamics

Speaker

Canadian Institute for Advanced Research (CIFAR)

Host

Boris Katz & Tomaso Poggio
Abstract

Theoretical and empirical results in the neural networks literature demonstrate that effective learning at a real-world scale requires changes to synaptic weights that approximate the gradient of a global loss function. For neuroscientists, this means that the brain must have mechanisms for communicating loss gradients between regions, either explicitly or implicitly. Here, I describe our research into potential means of communicating loss gradients using the dynamics of activity in a population of neurons. Using a combination of computational modelling and two-photon imaging data, I will present evidence suggesting that the neocortex may encode loss gradients related to motor learning and sensory prediction using the temporal derivative of activity in populations of pyramidal neurons.
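To give a rough sense of the general idea behind "activity dynamics as gradients" (not the specific model or data presented in the talk), the sketch below shows a two-layer network in which the change in hidden-layer activity between a feedforward phase and a feedback-nudged phase stands in for the loss gradient and drives a local weight update. The layer sizes, fixed random feedback weights, and nudging strength are illustrative assumptions, not details from the speaker's work.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 5, 8, 3

W1 = rng.normal(scale=0.1, size=(n_hid, n_in))   # input -> hidden weights
W2 = rng.normal(scale=0.1, size=(n_out, n_hid))  # hidden -> output weights
B = rng.normal(scale=0.1, size=(n_hid, n_out))   # fixed random feedback weights (assumption)

x = rng.normal(size=n_in)        # sensory input
target = rng.normal(size=n_out)  # desired output

# Feedforward ("free") phase of activity.
h_free = np.tanh(W1 @ x)
y_free = W2 @ h_free

# Feedback ("nudged") phase: top-down input pushes hidden activity
# in a direction that reduces the output error.
beta = 0.1  # small nudging strength (assumption)
h_nudged = np.tanh(W1 @ x + beta * B @ (target - y_free))

# The temporal change in hidden activity serves as a proxy for the
# gradient of the loss with respect to that layer's activity.
delta_h = (h_nudged - h_free) / beta

# Local, Hebbian-style weight updates driven by the activity difference.
lr = 0.05
W1 += lr * np.outer(delta_h, x)
W2 += lr * np.outer(target - y_free, h_free)
```

In this toy setting the update to W1 requires only quantities available locally in time and space (the input and the change in the unit's own activity), which is the sense in which activity dynamics could implicitly communicate a gradient signal between regions.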