Learning to integrate sensory signals: from models to artificial proprioception

CNBC Colloquium
Center for the Neural Basis of Cognition (CNBC)

Professor and Jack D. and Deloris Lange Endowed Chair in Cell Physiology
University of California, San Francisco
November 19, 2015 - 4:00pm
Mellon Institute 328

Abstract:

Learning plays a central role in the development and maintenance of multisensory spatial processing, but the mechanisms of multisensory learning are not known. We have shown how a simple network model can learn de novo to perform and maintain a variety of multisensory spatial computations in a statistically optimal fashion. In the model, learning is accomplished by an unsupervised, Hebbian-like learning rule driven only by the common statistics of the network inputs, e.g., by spatiotemporal correlations between sensory modalities. Motivated by this observation, we demonstrated experimentally that such correlated inputs do drive de novo multisensory learning. Macaque monkeys were first trained to perform a reaching task under the guidance of visual feedback. They were then exposed to a novel, artificial feedback signal delivered via multi-electrode intracortical microstimulation (ICMS). After training with correlated visual and ICMS feedback, the animals were able to perform precise movements with the artificial signal alone. Furthermore, they combined the ICMS signal with vision in a statistically optimal fashion, as they would for two natural stimuli. These results suggest a new route to studying multisensory processing in the brain. They also point the way to a novel learning-based approach to artificial feedback for brain-machine interfaces.
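To illustrate the kind of mechanism the abstract describes, here is a minimal sketch of unsupervised, Hebbian-like learning driven only by correlations between two noisy input channels. This is a hypothetical toy example using Oja's rule, not the speaker's actual network model; the channel names, noise levels, and learning rate are all assumptions made for illustration.

```python
import numpy as np

# Toy sketch (not the speaker's model): an Oja-style Hebbian rule learns
# readout weights for two noisy "modalities" that share a common underlying
# spatial signal. The correlation between the inputs is the only teaching
# signal; no supervision is used.

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=2)  # weights for the two input channels
eta = 0.01                         # learning rate (illustrative choice)

for _ in range(5000):
    s = rng.normal()                        # common spatial signal
    x = np.array([s + 0.3 * rng.normal(),   # channel 1 (e.g., vision)
                  s + 0.3 * rng.normal()])  # channel 2 (e.g., artificial feedback)
    y = w @ x                               # linear readout
    w += eta * y * (x - y * w)              # Oja's rule: Hebbian term + decay

# Because the two channels are correlated, the weights converge toward the
# principal component of the inputs: roughly equal, same-sign weights,
# i.e., the readout "fuses" the two modalities.
print(np.round(w, 2))
```

The point of the sketch is that nothing in the update rule refers to a target or an error signal; the shared statistics of the inputs alone pull the weights toward a fused representation, mirroring the claim that spatiotemporal correlations between modalities can drive multisensory learning de novo.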