Budapest Computational Neuroscience Forum

Workshop
BCNF
Tuesday, December 12, 2023, 5:00 pm – 6:30 pm

The Budapest Computational Neuroscience Forum is a series of informal monthly meetings of Budapest-based computational neuroscientists and computational cognitive scientists. Its aim is to facilitate discussion and cooperation among researchers working in different institutes, and to give students an opportunity to present their work and get to know the community. Originally started in 2007, and restarted in 2017 and again in 2023, the Forum is now regularly hosted by Central European University and followed by a social event, both open to anyone interested.

Events of the Forum are advertised on a mailing list. If you wish to be on this list or have any inquiries about the series, contact Mihály Bányai.

Upcoming meeting:

Time: 17:00, December 12, 2023

Location: CEU, 1051 Budapest, Nádor u. 15., Room 203.

Speaker: Máté Lengyel (CEU/University of Cambridge)

Optimal information loading into working memory explains dynamic coding in prefrontal cortex

Working memory involves the short-term maintenance of information and is critical in many tasks. The neural circuit dynamics underlying working memory remain poorly understood, with different aspects of prefrontal cortical (PFC) responses explained by different putative mechanisms. Using mathematical analysis, numerical simulations, and recordings from monkey PFC, we investigate a critical but hitherto ignored aspect of working memory dynamics: information loading. We find that, contrary to common assumptions, optimal loading of information into working memory involves inputs that are largely orthogonal, rather than similar, to the late delay activities observed during memory maintenance, naturally leading to the widely observed phenomenon of dynamic coding in PFC. Using a novel, theoretically principled metric, we show that PFC exhibits the hallmarks of optimal information loading. We also find that optimal information loading emerges as a general dynamical strategy in task-optimized recurrent neural networks. Our theory unifies previous, seemingly conflicting theories of memory maintenance based on attractor or purely sequential dynamics, and reveals a normative principle underlying dynamic coding.
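
To make the orthogonality claim concrete, here is a minimal sketch (not the speaker's actual model or metric, and with purely illustrative parameter values): a two-unit non-normal linear network in which input delivered along a direction orthogonal to the slow "persistent" mode is transiently amplified into that mode, loading a larger memory trace than input delivered along the persistent direction itself.

```python
import numpy as np

# Illustrative sketch of non-normal loading dynamics, dx/dt = A x.
# Unit 1 is a slow "persistent" mode that stores the memory over the delay;
# unit 0 is a fast "loading" direction, orthogonal to it, whose activity is
# funnelled into unit 1 through strong feedforward coupling.
A = np.array([[-1.0,  0.0],    # fast decay of the loading direction
              [ 8.0, -0.05]])  # feedforward drive into a slowly decaying mode

dt, T = 0.01, 10.0
steps = int(T / dt)

def simulate(x0):
    """Euler-integrate dx/dt = A x from initial condition x0."""
    x = np.array(x0, dtype=float)
    traj = np.empty((steps, 2))
    for t in range(steps):
        x = x + dt * (A @ x)
        traj[t] = x
    return traj

# Load along the direction orthogonal to the persistent mode ...
orth = simulate([1.0, 0.0])
# ... versus loading directly along the persistent (delay-activity) mode.
direct = simulate([0.0, 1.0])

print("memory amplitude after delay, orthogonal input:", orth[-1, 1])
print("memory amplitude after delay, direct input:    ", direct[-1, 1])
```

In this toy network the orthogonal input ends the delay with a memory amplitude several times larger than the direct input, because the fast direction transiently amplifies into the slow one while the slow mode merely decays; early activity therefore looks nothing like late delay activity, a simple caricature of dynamic coding.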