Workshop: Noise resilience of memory stored in low-dimensional neural manifolds through multiple…, Georg Chechelnizki

Date: 2023-11-30

Time: 15:15 - 16:15

Speaker

Georg Chechelnizki – ELSC, Hebrew University of Jerusalem

Abstract

Short-term memory in the brain is often theorized to be implemented by continuous attractor networks, which represent stored variables in persistent neural activity. Since neurons are noisy, the variability in their activity degrades the memory, which can manifest as random diffusion (Burak and Fiete, 2012). Derivative feedback is known from control theory to increase robustness against various common perturbations, and Lim and Goldman (2013) showed that it can be implemented via slow excitatory and fast inhibitory synaptic timescales, decreasing memory drift due to weight mistuning. In this work we show that the utility of derivative feedback in neural networks is not limited to compensating for synaptic mistuning: it can also greatly reduce the degradation of memories by ongoing neural noise. We start by examining the simple case of a linear attractor network, and then demonstrate that the principles that mitigate diffusion extend to a much broader class of nonlinear networks. To demonstrate this, we first derive a general expression for the diffusion coefficient of a stored variable in an attractor network of Poisson neurons with arbitrary connectivity and synaptic timescales, generalizing Burak and Fiete (2012). We successfully test our theory on ring networks, inspired by the representation of heading in the central complex of insects. Since the number of neurons in these networks is small, noise-driven diffusion is naively expected to be prominent. We find that our theory correctly predicts the increase of memory stability as a function of synaptic timescale differences in such models when they are endowed with negative derivative feedback. Furthermore, we identify how to engineer the network connectivity so that the stability of the bump position along the attractor is enhanced, yet perturbations to the bump's structure are not slowed down by the derivative feedback mechanism; neural activity thus remains tightly confined to a one-dimensional manifold. Insights from our theory allow us to conclude that neurons in head-direction cell networks that are commonly thought to be used for velocity integration can also act as stabilizers against noise-driven motion.
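
For intuition only (this is not the speaker's model), the sketch below illustrates the derivative-feedback idea in the simplest setting the abstract mentions: a linear attractor whose recurrent feedback is routed through a slow excitatory and a fast inhibitory synapse, in the spirit of Lim and Goldman (2013), with white noise standing in for Poisson spiking variability. The scalar reduction of the network, the parameter values, and all variable names are illustrative assumptions; the only point is that the stored value diffuses far less when the excitatory synaptic timescale exceeds the inhibitory one.

```python
# Minimal sketch (assumed parameters, not the speaker's model): a linear line-attractor
# unit whose recurrent feedback is split between a slow excitatory synapse (tau_e) and
# a fast inhibitory synapse (tau_i). With tau_e > tau_i the balanced +/-W terms
# approximate negative derivative feedback, which damps noise-driven drift of the
# stored value without destroying the attractor.
import numpy as np

def simulate(tau_e, tau_i, W=20.0, tau=0.02, sigma=1.0,
             T=20.0, dt=2e-4, n_trials=200, seed=0):
    """Return the stored value r at time T for n_trials independent noise realizations."""
    rng = np.random.default_rng(seed)
    n_steps = int(T / dt)
    r = np.zeros(n_trials)        # firing-rate (memory) variable
    s_e = np.zeros(n_trials)      # slow excitatory synaptic activation
    s_i = np.zeros(n_trials)      # fast inhibitory synaptic activation
    for _ in range(n_steps):
        noise = sigma * np.sqrt(dt) * rng.standard_normal(n_trials)
        # Tuned positive feedback (gain 1) keeps a continuum of fixed points;
        # the balanced +/-W feedback through the two synapses adds the
        # derivative-like correction when tau_e != tau_i.
        dr = (-r + (1.0 + W) * s_e - W * s_i) * dt / tau + noise / tau
        ds_e = (-s_e + r) * dt / tau_e
        ds_i = (-s_i + r) * dt / tau_i
        r, s_e, s_i = r + dr, s_e + ds_e, s_i + ds_i
    return r

T = 20.0
# Matched synaptic timescales: no derivative feedback.
r_plain = simulate(tau_e=0.05, tau_i=0.05, T=T)
# Slow excitation, fast inhibition: negative derivative feedback.
r_ndf = simulate(tau_e=0.10, tau_i=0.005, T=T)

# Crude diffusion-coefficient estimate D ~ Var[r(T)] / (2T): the stored value
# should wander far less in the derivative-feedback network.
print("D (matched timescales):  %.3e" % (r_plain.var() / (2 * T)))
print("D (derivative feedback): %.3e" % (r_ndf.var() / (2 * T)))
```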