Bulletin of the American Physical Society
APS March Meeting 2023
Las Vegas, Nevada (March 5-10)
Virtual (March 20-22); Time Zone: Pacific Time
Session D11: Physics of Neural Systems II (Focus Session)
Sponsoring Units: DBIO
Chair: Tiberiu Tesileanu, Flatiron Institute
Room: Room 203
Monday, March 6, 2023 3:00PM - 3:36PM
D11.00001: Detecting assemblies of coordinated neurons with a novel family of maximum entropy models
Invited Speaker: Clelia de Mulatier
Recent decades have seen significant developments in experimental techniques enabling the simultaneous recording of the activity of thousands of neurons. This opens up the possibility of studying emergent macro-structures in neuronal population activity. However, understanding how to extract robust patterns from such data remains challenging, both due to its high dimensionality and to the relatively small number of datapoints. In this context, recent work has focused on detecting groups of neurons with coordinated (or highly correlated) activity, often called cell assemblies.
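The abstract leaves the novel model family unspecified; as orientation, the sketch below shows the classical pairwise maximum entropy (Ising) fit that such families generalize, matching the model's first and second moments to the data by Boltzmann learning. It uses surrogate data and exhaustive enumeration, so it only works for small populations.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
N, T = 8, 5000
# Surrogate binarized raster (T time bins x N neurons); real input would be
# binned spikes from large-scale recordings.
data = (rng.random((T, N)) < 0.1).astype(float)

states = np.array(list(product([0.0, 1.0], repeat=N)))  # all 2^N patterns

def model_moments(h, J):
    """Mean and pairwise correlations under P(s) ~ exp(h.s + s.J.s / 2)."""
    E = states @ h + 0.5 * np.einsum('ti,ij,tj->t', states, J, states)
    p = np.exp(E - E.max())
    p /= p.sum()
    return p @ states, np.einsum('t,ti,tj->ij', p, states, states)

emp_mean = data.mean(axis=0)
emp_corr = data.T @ data / T

h, J = np.zeros(N), np.zeros((N, N))
for _ in range(3000):                  # Boltzmann learning: match moments
    m, c = model_moments(h, J)
    h += 0.1 * (emp_mean - m)
    dJ = 0.1 * (emp_corr - c)
    np.fill_diagonal(dJ, 0.0)          # diagonal is handled by the fields h
    J += dJ
```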
Monday, March 6, 2023 3:36PM - 3:48PM
D11.00002: Baseline control of optimal performance in recurrent neural networks
Luca Mazzucato, Francesco Fumarola, Shun Ogawa
Changes in an animal's behavioral state, such as arousal and movement, induce opposite effects on task performance depending on the sensory modality (visual vs. auditory). Experimental studies showed that these changes in behavioral state are mediated by modulation of the baseline input currents to populations of neurons in sensory areas. Here, we investigate the benefits of these modulations using a reservoir computing approach, modeling a sensory area as a recurrent neural network and changes in brain state as modulations of its baseline inputs. In this brain-inspired framework for reservoir computing, we found that the dynamical phase of a recurrent neural network is controlled by modulating the mean and quenched variance of its baseline inputs. Baseline modulation unlocks a range of new phenomena. First, we found that baseline modulation drives a novel noise-induced enhancement of chaos. Second, it gives access to a large repertoire of network phases: in addition to the known fixed-point and chaotic phases, we uncovered several new bistable phases in which the network activity breaks ergodicity, exhibiting the coexistence of a fixed point and chaos, of two different fixed points, or of weak and strong chaos. By driving the network with adiabatic changes in the baseline statistics, one can toggle between the different phases, charting a trajectory in phase space. These trajectories exhibit the new phenomenon of neural hysteresis, whereby adiabatic transitions across a phase boundary retain a memory of the path taken. Finally, we showed that baseline control can achieve optimal performance in a sequential memory task at a second-order phase boundary without any fine-tuning of the network's recurrent couplings. Our results show that baseline control of network dynamics opens new directions for brain-inspired artificial intelligence and provides a new interpretation for the ubiquitously observed behavioral modulations of cortical activity, enabling behavioral flexibility.
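A minimal sketch of the setup described here, assuming a standard rate network dx/dt = -x + J phi(x) + b (the authors' exact equations and parameters are not given in the abstract): the baseline b is drawn with mean mu and quenched, unit-to-unit standard deviation sigma, and sweeping (mu, sigma) is how one would probe the phase diagram.

```python
import numpy as np

rng = np.random.default_rng(1)
N, g, dt, T = 300, 2.0, 0.05, 4000
J = rng.normal(0.0, g / np.sqrt(N), (N, N))   # random recurrent couplings

def run(mu, sigma):
    """Integrate dx/dt = -x + J phi(x) + b with a quenched baseline b."""
    b = mu + sigma * rng.normal(size=N)       # frozen per-unit baseline
    x = rng.normal(size=N)
    traj = np.empty((T, N))
    for t in range(T):
        x += dt * (-x + J @ np.tanh(x) + b)
        traj[t] = x
    return traj

# Crude activity measure: temporal variance after discarding a transient.
# Near zero indicates a fixed point; large values indicate chaotic dynamics.
for mu, sigma in [(0.0, 0.0), (0.5, 0.0), (0.0, 1.0), (1.0, 1.0)]:
    traj = run(mu, sigma)
    print(mu, sigma, traj[T // 2:].var(axis=0).mean())
```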
Monday, March 6, 2023 3:48PM - 4:00PM
D11.00003: Pitch Acts as a Distractor in Optimal Estimation of Yaw in Dynamic Natural Scenery
Charles Edelson, Robert de Ruyter, William S Bialek, Shiva R Sinha
Estimating motion from visual signals is an important biological problem that is behaviorally relevant to many organisms. The need for efficient and accurate estimators of motion implies strong evolutionary pressure for biological systems to develop optimal estimators. Surprisingly, however, many biological motion estimators have strong systematic biases. One possible explanation is that these biases arise in response to features of the statistics of natural scenes. Here, we computationally investigate this problem in the context of the blowfly (Calliphora vicina) visual system. We first sampled the joint distribution of fly visual inputs and motion traces with a specially designed "FlEye" camera equipped with a gyroscope and accelerometer. Using this library of motion traces, we then computationally constructed and investigated a family of optimal local estimators of yaw. We show that conditioning yaw estimation on pitch intensity significantly changes the qualitative behavior of the optimal estimator: higher absolute pitch intensity results in a more diffuse yaw estimator, suggesting that pitch acts as a distractor motion in yaw estimation. This distractor behavior mimics previous results on the motion encoding of a wide-field motion-sensitive cell in the blowfly visual system, suggesting that this form of interference is relevant in a biological context.
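The generic construction behind such optimal local estimators is the Bayes least-squares (conditional-mean) estimator tabulated from the sampled joint distribution. The sketch below uses surrogate data and invented variable names; the actual estimators would be built from the FlEye library described above.

```python
import numpy as np

rng = np.random.default_rng(2)
M = 200_000
yaw   = rng.normal(0, 1, M)                  # surrogate motion trace samples
pitch = rng.normal(0, 1, M)
# Two local visual signals carrying yaw information (one contaminated by pitch):
v1 = yaw + 0.5 * rng.normal(size=M)
v2 = yaw + 0.5 * rng.normal(size=M) + 0.3 * pitch

def bls_estimator(v1, v2, target, bins=25):
    """Tabulate E[target | bin(v1), bin(v2)] by histogram averaging."""
    e1 = np.quantile(v1, np.linspace(0, 1, bins + 1))
    e2 = np.quantile(v2, np.linspace(0, 1, bins + 1))
    i = np.clip(np.searchsorted(e1, v1) - 1, 0, bins - 1)
    j = np.clip(np.searchsorted(e2, v2) - 1, 0, bins - 1)
    num = np.zeros((bins, bins)); den = np.zeros((bins, bins))
    np.add.at(num, (i, j), target)           # sum of targets per bin
    np.add.at(den, (i, j), 1.0)              # sample count per bin
    return num / np.maximum(den, 1)

# Condition on pitch intensity: compare low- vs high-|pitch| estimators.
lo, hi = np.abs(pitch) < 0.5, np.abs(pitch) > 1.5
est_lo = bls_estimator(v1[lo], v2[lo], yaw[lo])
est_hi = bls_estimator(v1[hi], v2[hi], yaw[hi])  # expected to be more diffuse
```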
Monday, March 6, 2023 4:00PM - 4:12PM
D11.00004: On the context-dependent efficient coding of olfactory spaces
Gaia Tavoni, ShiNung Ching, Baranidharan Raman
Sensory neural representations are modulated by a variety of contextual factors, such as multimodal cues, stimulus history, novelty, behavioral utility, and internal states. Despite decades of attention in systems neuroscience, many questions persist about how sensory codes adapt to these variables. Here, we study this problem in the olfactory system. We present an integrative approach combining normative theories of context-enhanced efficient coding with mechanistic models of neural circuits to generate predictions that will be tested in electrophysiology and behavioral experiments. Our theory is based on the information-theoretic premise that optimal codes maximize the overall entropy (decodability) of neural representations while minimizing neural costs. A novel feature of our approach is the incorporation of feedback into this framework, which allows us to predict how optimal odor representations depend on top-down contextual signals and on their covariance with odor spaces. We also show how the normative solutions can be implemented at the level of neural circuits through various forms of plasticity. Our theory generalizes to other sensory circuits and establishes a conceptual foundation for studying sensory coding in the context of behavior.
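As a toy instance of the entropy-minus-cost premise (not the authors' model, which additionally includes feedback and contextual covariances), one can optimize a linear-Gaussian code r = W s + noise, for which the response entropy has the closed form 0.5 * logdet(W C W^T + sn^2 I) up to a constant:

```python
import numpy as np

rng = np.random.default_rng(3)
S, R, sn, lam = 10, 6, 0.1, 0.5              # stimulus dim, response dim,
                                             # noise std, cost weight (all toy)
A = rng.normal(size=(S, S))
C = A @ A.T / S                              # surrogate odor-space covariance
W = 0.1 * rng.normal(size=(R, S))            # encoding weights to optimize

for _ in range(500):                         # gradient ascent on H - lam * cost
    Sigma = W @ C @ W.T + sn**2 * np.eye(R)  # response covariance
    grad_H = np.linalg.solve(Sigma, W @ C)   # d/dW of 0.5 * logdet(Sigma)
    grad_cost = 2 * W @ C                    # d/dW of tr(W C W^T), a power cost
    W += 0.01 * (grad_H - lam * grad_cost)
```

At the optimum the rows of W align with the leading directions of the stimulus covariance, the standard efficient-coding result that context terms would then reshape.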
Monday, March 6, 2023 4:12PM - 4:24PM
D11.00005: Emergent neural network behavior in dynamically driven superconducting loop disordered systems
Uday S Goteti, Shane A Cybart, Robert C Dynes
Disordered systems with an irregular energy landscape comprising finitely many local minima are well suited to modeling computational properties observed in the biological brain, consistent with the neural network model developed by Hopfield [1]. Computational properties such as categorization, associative memory, and time-sequence retention can be understood as the phase-space flow of the state of the system in response to external excitations. We present a system of superconducting loops coupled through Josephson junctions, with disorder introduced into the geometry of the loops and junctions. The loops trap multiples of the quantized magnetic flux, allowing a multi-dimensional state space of flux configurations, while the Josephson junctions allow flux to traverse between the loops, providing a mechanism to update the state of the system. External excitations drive the system into different trapped-flux states, observed as circulating supercurrents around the loops. A three-loop disordered network is shown in simulations and experiments to exhibit emergent behavior such as categorization and associative memory. Additionally, the dynamics of the internal state of the system is statistically correlated with the outgoing flux from the network, which can be observed as spiking voltages, with each spike corresponding to a magnetic flux quantum Φ0 = 2.067 × 10⁻¹⁵ T·m².
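A heavily simplified toy of this flux-hopping picture, with invented inductances, thresholds, and update rule (the real system is a continuous Josephson circuit): the state is the integer number of flux quanta per loop, and a quantum hops across a junction when the current imbalance it carries exceeds a critical value.

```python
import numpy as np

PHI0 = 2.067e-15                     # magnetic flux quantum, Wb (T·m^2)

rng = np.random.default_rng(4)
n_loops = 3
L = np.array([1.0, 1.3, 0.8])        # disordered loop inductances (arb. units)
Ic = np.array([0.9, 1.1])            # junction thresholds between loops (toy)
state = np.zeros(n_loops, dtype=int) # trapped flux quanta per loop

def step(state, drive):
    """External drive adds flux quanta to loop 0, then junctions relax."""
    state = state.copy()
    state[0] += drive
    for j in range(n_loops - 1):     # junction j couples loops j and j+1
        # circulating-current imbalance across junction j (toy expression)
        di = state[j] / L[j] - state[j + 1] / L[j + 1]
        if abs(di) > Ic[j]:
            hop = int(np.sign(di))
            state[j] -= hop          # a quantum traverses the junction,
            state[j + 1] += hop      # observable as a ~PHI0 voltage spike
    return state

for t in range(20):                  # random excitations drive state updates
    state = step(state, drive=int(rng.integers(0, 3)))
    print(t, state)
```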
Monday, March 6, 2023 4:24PM - 4:36PM
D11.00006: Grid Cell Percolation
Yuri A Dabaghian
Grid cells are famous for their ability to fire near the vertices of a planar triangular lattice tiling the navigated environment, which is commonly viewed as an instrument for encoding the animal's ongoing location and constructing neuronal spatial metrics. However, direct simulations demonstrate that most grid cells' spiking is highly intermittent and hence does not convey the orderliness of the underlying grid-field layouts to downstream networks. Yet we argue that regular grid cell activity exists and is based on percolation phenomena. Indeed, viewing a grid cell's firing as the "opening" of the corresponding grid-field vertex, and consecutive spiking over two neighboring fields as the "opening" of the lattice edge between them, casts grid cell activity into the framework of percolation theory. Unlike many standard models of lattice percolation, grid cell percolation depends on multiple parameters, e.g., the animal's speed, neuronal firing rates, and receptive field sizes. Surprisingly, all these parameters appear to be tuned to permit percolation, which points to the biological viability of the approach and suggests that the grid cell network may in fact operate in a percolative phase. The percolation perspective casts new light on the role of grid cells in organizing spatial cognition and helps explain several neurophysiological mechanisms of spatial information processing, such as path integration, spatial planning, and the establishment of global spatial scales.
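A minimal sketch of the percolation framing, collapsing the many biological parameters into a single vertex-opening probability p: open each grid-field vertex of a triangular lattice (represented as a square lattice with one diagonal neighbor) with probability p and test for a spanning cluster. The site-percolation threshold on the triangular lattice is exactly 1/2.

```python
import numpy as np
from collections import deque

def spans(p, n=60, rng=np.random.default_rng(5)):
    """Does a cluster of open vertices connect opposite sides of the arena?"""
    open_site = rng.random((n, n)) < p
    # Triangular lattice = square lattice plus one diagonal neighbor pair.
    nbrs = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, 1), (-1, -1)]
    seen = np.zeros((n, n), dtype=bool)
    for start in range(n):              # BFS from every open site in row 0
        if not open_site[0, start] or seen[0, start]:
            continue
        q = deque([(0, start)]); seen[0, start] = True
        while q:
            i, j = q.popleft()
            if i == n - 1:
                return True             # cluster reaches the opposite side
            for di, dj in nbrs:
                a, b = i + di, j + dj
                if 0 <= a < n and 0 <= b < n and open_site[a, b] and not seen[a, b]:
                    seen[a, b] = True; q.append((a, b))
    return False

for p in (0.3, 0.5, 0.7):               # below, at, and above threshold
    print(p, np.mean([spans(p) for _ in range(20)]))
```

In the abstract's terms, p is not a free parameter but a function of speed, firing rate, and field size; the claim is that their biological values place the system on the percolating side of the transition.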
Monday, March 6, 2023 4:36PM - 5:12PM
D11.00007: Flexible multitask computation in recurrent networks utilizes shared dynamical motifs
Invited Speaker: Laura Driscoll
Flexible computation is a hallmark of intelligent behavior. Yet, little is known about how neural networks contextually reconfigure for different computations. Humans are able to perform a new task without extensive training, presumably through the composition of elementary processes that were previously learned. Cognitive scientists have long hypothesized the possibility of a compositional neural code, where complex neural computations are made up of constituent components; however, the neural substrate underlying this structure remains elusive in biological and artificial neural networks. Here we identified an algorithmic neural substrate for compositional computation through the study of multitasking artificial recurrent neural networks. Dynamical systems analyses of networks revealed learned computational strategies that mirrored the modular subtask structure of the task-set used for training. Dynamical motifs such as attractors, decision boundaries and rotations were reused across different task computations. For example, tasks that required memory of a continuous circular variable repurposed the same ring attractor. We show that dynamical motifs are implemented by clusters of units and are reused across different contexts, allowing for flexibility and generalization of previously learned computation. Lesioning these clusters resulted in modular effects on network performance: a lesion that destroyed one dynamical motif only minimally perturbed the structure of other dynamical motifs. Finally, modular dynamical motifs could be reconfigured for fast transfer learning. After slow initial learning of dynamical motifs, a subsequent faster stage of learning reconfigured motifs to perform novel tasks. This work contributes to a more fundamental understanding of compositional computation underlying flexible general intelligence in neural systems. We present a conceptual framework that establishes dynamical motifs as a fundamental unit of computation, intermediate between the neuron and the network. As more whole brain imaging studies record neural activity from multiple specialized systems simultaneously, the framework of dynamical motifs will guide questions about specialization and generalization across brain regions.
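The fixed-point analysis underlying such dynamical-systems dissections is commonly done by minimizing the squared speed q(x) = |F(x)|^2 / 2 from many initial states (cf. Sussillo & Barak, 2013); whether the authors use exactly this method is an assumption. A generic sketch:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
N = 50
W = rng.normal(0, 1.2 / np.sqrt(N), (N, N))  # toy trained weights
b = 0.1 * rng.normal(size=N)                 # toy context/input bias

def F(x):
    """Flow of a rate RNN: dx/dt = -x + W tanh(x) + b."""
    return -x + W @ np.tanh(x) + b

def q(x):
    f = F(x)
    return 0.5 * f @ f                       # squared speed; zero at fixed points

fixed_points = []
for _ in range(30):                          # many random initial conditions
    res = minimize(q, rng.normal(0, 2, N), method='L-BFGS-B')
    if res.fun < 1e-8:
        fixed_points.append(res.x)

for x in fixed_points[:3]:
    # Linear stability: Jacobian dF/dx = -I + W diag(1 - tanh(x)^2)
    Jac = -np.eye(N) + W * (1 - np.tanh(x) ** 2)
    print("max Re(eigenvalue):", np.linalg.eigvals(Jac).real.max())
```

Clustering the recovered fixed points and linearizations across tasks is how attractors, decision boundaries, and rotations would then be identified as shared motifs.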
Monday, March 6, 2023 5:12PM - 5:24PM
D11.00008: Theory of Coupled Neuronal-Synaptic Dynamics
David G Clark, Larry F Abbott
In neural circuits, synapses influence neurons by shaping network dynamics, and neurons influence synapses through activity-dependent plasticity. Motivated by this fact, we study a network model in which neurons and synapses are mutually coupled dynamic variables. Model neurons follow recurrent dynamics shaped by synaptic couplings that fluctuate, in turn, about quenched random strengths in response to pre- and postsynaptic neuronal activity. Using dynamical mean-field theory, we compute the phase diagram of the combined neuronal-synaptic system in the thermodynamic limit, revealing several novel phases suggestive of computational function. In the regime in which the plasticity-free system is chaotic, Hebbian plasticity slows chaos, while anti-Hebbian plasticity quickens chaos and generates an oscillatory component in neuronal activity. Deriving the spectrum of the joint neuronal-synaptic Jacobian reveals that these behaviors manifest as differential effects of eigenvalue repulsion. In the regime in which the plasticity-free system is quiescent, Hebbian plasticity can induce chaos. In both regimes, sufficiently strong Hebbian plasticity creates exponentially many stable neuronal-synaptic fixed points that coexist with chaotic states. Finally, in chaotic states with sufficiently strong Hebbian plasticity, halting synaptic dynamics leaves a stable fixed point of neuronal dynamics, freezing the neuronal state. This phase of freezable chaos provides a novel mechanism of synaptic working memory in which a stable fixed point of neuronal dynamics is continuously destabilized through synaptic dynamics, allowing any neuronal state to be stored as a stable fixed point by halting synaptic plasticity.
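A minimal sketch of coupled neuronal-synaptic dynamics of this kind, with an illustrative parameterization that may differ from the authors' equations: neurons follow dx/dt = -x + J phi(x), while J relaxes toward quenched random couplings J0 plus a Hebbian term (A > 0; A < 0 would be anti-Hebbian). Halting the J update at the end probes the freezable-chaos idea.

```python
import numpy as np

rng = np.random.default_rng(7)
N, g, tau_J, A, dt = 300, 2.0, 10.0, 1.0, 0.05
J0 = rng.normal(0, g / np.sqrt(N), (N, N))   # quenched random strengths
J = J0.copy()
x = rng.normal(size=N)

def step(x, J, plastic=True):
    r = np.tanh(x)
    x = x + dt * (-x + J @ r)                # neuronal dynamics
    if plastic:                              # synaptic dynamics (Hebbian)
        J = J + (dt / tau_J) * (-(J - J0) + (A / np.sqrt(N)) * np.outer(r, r))
    return x, J

for _ in range(4000):                        # run with plasticity on
    x, J = step(x, J, plastic=True)
for _ in range(4000):                        # halt synaptic dynamics
    x, J = step(x, J, plastic=False)

# If chaos is "freezable", the neuronal flow vanishes once J is frozen.
print("residual flow after freezing:", np.linalg.norm(-x + J @ np.tanh(x)))
```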
Monday, March 6, 2023 5:24PM - 5:36PM
D11.00009: Potential landscapes and the visual cortex: A study of brain wave entrainment using Neural Mass Models
Richa Phogat, P. Parmananda, Ashok Prasad
Electroencephalography (EEG) signals emanate from ionic currents in macroscopic brain regions directly below the scalp. Spectral analysis shows that they are composed of many oscillations, often called brain waves. Of these, the alpha rhythm in the 8-12 Hz band is one of the best studied. Intriguingly, the alpha rhythm is capable of entrainment to a periodic visual signal of a similar frequency. The work presented here began with the observation that, in human volunteers, the entrained oscillations displayed not only higher harmonics but also subharmonic oscillations. However, the subharmonics could only be detected when the visual stimulus was at a 10 Hz frequency and delivered at higher intensity; they were not observed when the stimulating frequency was 6 Hz. We analyze a physiologically inspired Neural Mass Model (NMM) and show that it is able to reproduce these observations. The bifurcation structure of the model implies a quasi-potential landscape that helps explain the entrainment and subharmonics in the NMM, in agreement with observations. Our work [1] suggests that NMMs may be capturing features of underlying effective landscapes that help shape brain waves and may affect sensory information processing.
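A sketch of one standard physiologically inspired NMM, the Jansen-Rit model, under pulsed periodic drive (whether this is the authors' exact variant is an assumption). Fourier-analyzing the output proxy y1 - y2 is how one would look for harmonics and subharmonics of the 10 Hz stimulus.

```python
import numpy as np

# Standard Jansen-Rit parameters (mV, 1/s) and population sigmoid.
A, B, a, b = 3.25, 22.0, 100.0, 50.0
C = 135.0; C1, C2, C3, C4 = C, 0.8 * C, 0.25 * C, 0.25 * C
e0, v0, r = 2.5, 6.0, 0.56

def S(v):
    return 2 * e0 / (1 + np.exp(r * (v0 - v)))

def rhs(y, t, f_drive, amp):
    y0, y1, y2, y3, y4, y5 = y
    # Pulsed visual drive on top of a baseline input rate (illustrative values).
    p = 120.0 + amp * (np.sin(2 * np.pi * f_drive * t) > 0)
    return np.array([
        y3, y4, y5,
        A * a * S(y1 - y2) - 2 * a * y3 - a**2 * y0,
        A * a * (p + C2 * S(C1 * y0)) - 2 * a * y4 - a**2 * y1,
        B * b * C4 * S(C3 * y0) - 2 * b * y5 - b**2 * y2,
    ])

f_drive, amp = 10.0, 200.0           # 10 Hz stimulus; higher amp is the
dt, T = 1e-4, 8.0                    # regime where subharmonics were seen
ts = np.arange(0, T, dt)
y = np.zeros(6)
eeg = np.empty(ts.size)
for i, t in enumerate(ts):           # fourth-order Runge-Kutta integration
    k1 = rhs(y, t, f_drive, amp)
    k2 = rhs(y + 0.5 * dt * k1, t + 0.5 * dt, f_drive, amp)
    k3 = rhs(y + 0.5 * dt * k2, t + 0.5 * dt, f_drive, amp)
    k4 = rhs(y + dt * k3, t + dt, f_drive, amp)
    y = y + (dt / 6) * (k1 + 2 * k2 + 2 * k3 + k4)
    eeg[i] = y[1] - y[2]             # EEG proxy: pyramidal membrane potential

keep = ts > 3.0                      # discard transient
spec = np.abs(np.fft.rfft(eeg[keep]))
freqs = np.fft.rfftfreq(keep.sum(), dt)  # inspect peaks at f, 2f, and f/2
```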
Monday, March 6, 2023 5:36PM - 5:48PM
D11.00010: MiV-Simulator: A computational framework to simulate exact-scale in-vitro neuronal networks
Gaurav Upadhyay, Seung Hyun Kim, Frithjof Gressmann, Ivan Raikov, Zhi Dou, Xiaotian Zhang, Ivan Soltesz, Lawrence Rauchwerger, Mattia Gazzola
In vitro studies of neuronal networks have played a pivotal role in expanding our understanding of neural dynamics, leading to considerable developments in the field of neuroscience. However, in vitro experiments can be demanding in terms of resources and technical skill. To complement experiments, aid their design, and accelerate investigation overall, we present an open-source computational framework for the simulation and analysis of these systems. Our high-performance computing solver, built on the NEURON and CoreNEURON simulators, incorporates detailed biophysical models of neurons and neuronal networks and can resolve the dynamics of millions of neurons.
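Since the solver builds on NEURON, here is a generic single-compartment example using NEURON's standard Python API (not MiV-Simulator's own interface) of the kind of biophysical cell model such frameworks scale up to networks of millions of neurons:

```python
from neuron import h
h.load_file("stdrun.hoc")            # load NEURON's standard run system

# A single Hodgkin-Huxley compartment.
soma = h.Section(name="soma")
soma.L = soma.diam = 20              # length and diameter, um
soma.insert("hh")                    # Hodgkin-Huxley Na/K/leak channels

# Current-clamp stimulus at the middle of the section.
stim = h.IClamp(soma(0.5))
stim.delay, stim.dur, stim.amp = 5, 40, 0.5   # ms, ms, nA

# Record membrane potential and time.
v = h.Vector().record(soma(0.5)._ref_v)
t = h.Vector().record(h._ref_t)

h.finitialize(-65)                   # initialize to resting potential, mV
h.continuerun(50)                    # simulate 50 ms
print("peak Vm (mV):", max(v))       # spikes overshoot toward ~+40 mV
```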
Monday, March 6, 2023 5:48PM - 6:00PM (Author not Attending)
D11.00011: High Frequency Deep Brain Stimulation Reduces Exaggerated Synchronization in Biophysically Realistic Spiking Neural Networks
AmirAli Farokhniaee
Synchronization in neuronal ensembles plays an important role in information transfer in the brain, but its exaggeration is a neurophysiological marker of common brain disorders such as Parkinson’s disease (PD). Deep brain stimulation (DBS), a clinically established therapy for PD, delivers electrical pulses at high frequencies to deep brain regions and is known to reduce this excessive synchrony.
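As a toy analogue only (phase oscillators rather than the biophysically realistic spiking networks of the abstract): a Kuramoto population with pathological synchrony, perturbed by a 130 Hz pulse train standing in for DBS through a phase-dependent response term. The order parameter R = |<exp(i*theta)>| quantifies synchrony, and comparing R with and without stimulation mirrors the kind of measurement such studies perform.

```python
import numpy as np

N, K, dt, T = 200, 2.0, 1e-3, 20.0
f_dbs = 130.0                                  # ~130 Hz clinical DBS rate
base_rng = np.random.default_rng(8)
omega = 2 * np.pi * (12 + base_rng.normal(0, 1, N))  # beta-band-like frequencies

def run(stim_amp):
    rng = np.random.default_rng(9)             # same initial phases per run
    theta = rng.uniform(0, 2 * np.pi, N)
    Rs = []
    for i in range(int(T / dt)):
        t = i * dt
        z = np.exp(1j * theta).mean()          # mean field
        coupling = K * np.abs(z) * np.sin(np.angle(z) - theta)
        # Brief periodic pulse; cos(theta) is a toy phase-response profile.
        pulse = stim_amp if (t % (1.0 / f_dbs)) < dt else 0.0
        theta = theta + dt * (omega + coupling + pulse * np.cos(theta))
        Rs.append(np.abs(np.exp(1j * theta).mean()))
    return float(np.mean(Rs[len(Rs) // 2:]))   # order parameter, late half

print("R without DBS:   ", run(0.0))
print("R with 130 Hz DBS:", run(400.0))
```

Whether R falls under stimulation depends on amplitude and frequency; mapping that dependence is precisely the parameter study the biophysical model addresses.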
© 2023 American Physical Society. All rights reserved.