Bulletin of the American Physical Society
APS March Meeting 2022
Volume 67, Number 3
Monday–Friday, March 14–18, 2022; Chicago
Session B03: Neural Systems I (Focus Session; Recordings Available)
Sponsoring Units: DBIO | Chair: Tiberiu Tesileanu, Flatiron Institute | Room: McCormick Place W-176A
Monday, March 14, 2022 11:30AM - 12:06PM
B03.00001: Neural circuits for dynamics-based segmentation of time series Invited Speaker: Tiberiu Tesileanu The brain must extract behaviorally relevant latent variables from the signals streamed by the sensory organs. Such latent variables are often encoded in the dynamics that generated the signal rather than in the specific realization of the waveform. One problem faced by the brain is therefore to segment time series based on their underlying dynamics. We present two biologically plausible algorithms for this segmentation task, where biological plausibility means operating in a streaming setting with all learning rules local. One algorithm is model-based and can be derived from an optimization problem involving a mixture of autoregressive processes. This algorithm relies on feedback in the form of a prediction error and can also be used for forecasting future samples. In some brain regions, such as the retina, the feedback connections necessary to use the prediction error for learning are absent. For this case, we propose a second, model-free algorithm that uses a running estimate of the autocorrelation structure of the signal to perform the segmentation. Both algorithms perform well when tasked with segmenting signals drawn from autoregressive models with piecewise-constant parameters; in particular, their segmentation accuracy is similar to that of oracle-like methods in which the ground-truth parameters of the autoregressive models are known. We also test our methods on datasets generated by alternating snippets of voice recordings.
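The model-free idea — a streaming autocorrelation estimate driving the segmentation — can be sketched on synthetic data. The AR(1) parameters, exponential forgetting rate, and sign-thresholding rule below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1(phi, n):
    """Generate an AR(1) series x_t = phi * x_{t-1} + noise."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.standard_normal()
    return x

# Signal alternating between two AR(1) regimes (phi = 0.9 vs. -0.9).
signal = np.concatenate([ar1(0.9, 500), ar1(-0.9, 500), ar1(0.9, 500)])
truth = np.concatenate([np.zeros(500), np.ones(500), np.zeros(500)])

# Streaming (exponentially weighted) estimate of the lag-1 autocorrelation.
alpha = 0.02                  # forgetting rate (illustrative choice)
m_xx = m_x1 = 1e-6
labels = np.zeros_like(signal)
for t in range(1, len(signal)):
    m_xx = (1 - alpha) * m_xx + alpha * signal[t] ** 2
    m_x1 = (1 - alpha) * m_x1 + alpha * signal[t] * signal[t - 1]
    rho = m_x1 / m_xx         # running lag-1 autocorrelation
    labels[t] = 1.0 if rho < 0 else 0.0

accuracy = np.mean(labels == truth)
```

Errors concentrate near the switch points, where the running estimate needs roughly 1/alpha samples to adapt; away from the boundaries the two regimes are cleanly separated by the sign of the running autocorrelation.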
Monday, March 14, 2022 12:06PM - 12:18PM
B03.00002: Recording neural activity in freely behaving animals with random-access two-photon microscopy Akihiro Yamaguchi, Paul McNulty, Rui Wu, Jason P Wolk, Marc H Gershow Recording neural activity in freely behaving animals is essential for studying neural correlates of behavior, but the motion of the brain makes it challenging to reliably record calcium activity from multiple neurons. By implementing acousto-optic deflectors and a custom-designed compact dispersion compensator [1], we developed a random-access two-photon tracking microscope capable of recording activity from neurons in unrestrained, freely behaving Drosophila larvae without motion artifacts. With extremely low latency (360 μs), this microscope can relocate the laser beam in constant time regardless of the distance within the field of view, allowing us to overcome the limits imposed by the inertia of the scanning elements, which previously restricted both the number of neurons that could be recorded and the distance between them [2].
Monday, March 14, 2022 12:18PM - 12:30PM
B03.00003: Volumetric, multi-neuronal imaging in freely behaving Drosophila larvae Paul McNulty The Gershow Lab studies how living brains make decisions, using Drosophila larvae as a model. By combining our previously published two-photon tracking scheme with the commercially available dual-beam hyperscope from Scientifica, we have achieved fast, volumetric imaging of multiple neurons and extended cell bodies in freely behaving organisms. We demonstrate imaging from motor neurons, pre-motor interneurons, and sensory neurons in freely moving larvae. This brings us closer to understanding decision making, with the potential to identify discrete decision-making neurons in a living brain.
Monday, March 14, 2022 12:30PM - 12:42PM
B03.00004: Building emergent representation of neural states using dynamical models Josuan Calderon, Gordon J Berman The brain is an intricate system, composed of myriad interacting components across a hierarchy of length and time scales. Understanding the dynamic collective state of these components is crucial for gaining insight into the emergent cognitive functions that ultimately control our actions and our perception of the world. However, fundamental principles that can provide a clear description of these dynamics remain elusive. Here, we develop a deep-learning framework that reduces the complex dynamics across different spatial and temporal scales to stereotyped brain states by fitting the data and assigning points in time to distinct basins of attraction. We demonstrate the potential of the approach using human electrocorticography (ECoG) data recorded from multiple patients over a period of several days. In addition to states that the brain revisits over time, our pipeline shows that a portion of the brain's activity can be captured by repeated spatiotemporal signatures, which correspond to repeated sequences of brain states but also include information about the spatial patterns of activity that occur as one state transitions to another. These findings provide future avenues not only to decode large-scale interactions of complex brain dynamics, but also to associate these brain states with subtle alterations in behavior.
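As a toy illustration of the state-assignment step — not the authors' deep-learning pipeline — one can cluster synthetic multichannel activity into recurring states. The two-state data, channel count, and plain k-means below are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "population activity": 8 channels alternating between two
# latent states, each with its own mean activity pattern.
centers = rng.standard_normal((2, 8))
state_seq = np.repeat([0, 1, 0, 1], 250)            # ground-truth labels
data = centers[state_seq] + 0.3 * rng.standard_normal((1000, 8))

# Minimal k-means (Lloyd's algorithm) recovering the stereotyped states.
cent = data[[0, 300]].copy()                        # one seed per regime
for _ in range(20):
    dist = np.linalg.norm(data[:, None, :] - cent[None, :, :], axis=2)
    lab = dist.argmin(axis=1)                       # assign each time point
    cent = np.array([data[lab == j].mean(axis=0) for j in range(2)])

# Accuracy up to label permutation.
acc = max(np.mean(lab == state_seq), np.mean(lab == 1 - state_seq))
```

The recovered cluster sequence is the analogue of a "brain state" trajectory: repeated visits to the same centroid correspond to states the system revisits over time.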
Monday, March 14, 2022 12:42PM - 12:54PM
B03.00005: Unveiling the dynamics and structure of drifting neural representations Shanshan Qin Long-term memories and learned behavior are conventionally associated with stable neuronal representations. However, recent experiments showed that neural population codes in many brain areas continuously change even when animals have fully learned and stably perform their tasks. This representational "drift" naturally leads to questions about its causes, dynamics, and functions. Here, we explore the hypothesis that neural representations optimize a representational objective with a degenerate solution space, and noisy synaptic updates drive the network to explore this (near-)optimal space causing representational drift. We illustrate this idea in simple, biologically plausible Hebbian/anti-Hebbian network models of representation learning, which optimize similarity matching objectives, and, when neural outputs are constrained to be nonnegative, learn localized receptive fields (RFs) that tile the stimulus manifold. We find that the drifting RFs of individual neurons can be characterized by a coordinated random walk, with the effective diffusion constants depending on various parameters such as learning rate, noise amplitude, and input statistics. Despite such drift, the representational similarity of population codes is stable over time. Our model recapitulates recent experimental observations in hippocampus and posterior parietal cortex, and makes testable predictions that can be probed in future experiments.
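The core mechanism — noisy updates diffusing through a degenerate optimum while the objective itself stays put — can be caricatured with a toy loss whose minima form a circle. This is a cartoon of the hypothesis, not the Hebbian/anti-Hebbian network; the learning rate and noise amplitude are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)

# Degenerate objective f(w) = (|w|^2 - 1)^2: any w on the unit circle
# is optimal, so the solution space has a flat (angular) direction.
def grad(w):
    return 4 * (w @ w - 1) * w

w = np.array([1.0, 0.0])
eta, sigma = 0.01, 0.05          # learning rate, synaptic noise (assumed)
angles, radii = [], []
for _ in range(20000):
    w = w - eta * grad(w) + sigma * np.sqrt(eta) * rng.standard_normal(2)
    angles.append(np.arctan2(w[1], w[0]))
    radii.append(np.linalg.norm(w))

angles = np.unwrap(np.array(angles))
radii = np.array(radii)

# The optimized quantity (the radius) stays pinned near 1 ...
radius_spread = radii[1000:].std()
# ... while the solution performs a random walk along the flat direction.
angle_spread = np.std(angles)
```

The radial direction is restored by the gradient, so its fluctuations stay small, while the angle diffuses freely — the analogue of individual receptive fields drifting while the population-level representation remains stable.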
Monday, March 14, 2022 12:54PM - 1:06PM
B03.00006: A simple theory of stochastic dynamics near stable fixed points Shivang Rawat, Stefano Martiniani The development of a rigorous understanding of the effect of stochastic fluctuations on the dynamics of diverse physical systems is of broad practical significance. We consider systems of many variables that can be described by nonlinear SDEs driven by additive filtered noise, and revisit the analytical theory of stochastic oscillations near the stable fixed points of their dynamics. Specifically, we introduce a simple approach to deriving analytical expressions for the noise power spectrum and coherency spectrum (and their corresponding correlation functions) in terms of rational polynomial functions and identify compact formulas for certain polynomial coefficients. This is achieved through linearization about the fixed point and systematic application of Itô calculus, without resorting to the computation of the formal Fourier transform of the white noise process. Then, we demonstrate that we can identify model parameters by fitting our solution to an experimental (synthetic) noise spectrum and/or coherence data. We illustrate this approach for two distinct types of dynamical systems: a 5-D nonlinear toy model with additive Gaussian white noise, and Wilson-Cowan type cortical population excitatory-inhibitory neural models with exponentially low-pass filtered Poisson noise.
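In the simplest one-dimensional case, the program described here — linearize about the fixed point, then obtain a rational-polynomial power spectrum — reduces to the classical Ornstein-Uhlenbeck result S(ω) = σ²/(θ² + ω²), which a direct simulation reproduces. The parameters below are arbitrary illustrations, not taken from the abstract's models:

```python
import numpy as np

rng = np.random.default_rng(3)

theta, sigma, dt = 1.0, 0.5, 0.01      # illustrative parameters
n_seg, seg_len = 200, 4096

# Euler-Maruyama simulation of dx = -theta * x dt + sigma dW,
# averaging periodograms over segments (Bartlett's method).
x = 0.0
psd = np.zeros(seg_len // 2 + 1)
for _ in range(n_seg):
    noise = sigma * np.sqrt(dt) * rng.standard_normal(seg_len)
    seg = np.empty(seg_len)
    for t in range(seg_len):
        x += -theta * x * dt + noise[t]
        seg[t] = x
    psd += np.abs(np.fft.rfft(seg)) ** 2 * dt / seg_len
psd /= n_seg

# Analytical two-sided spectrum of the linearized dynamics,
# S(omega) = sigma^2 / (theta^2 + omega^2), with omega = 2*pi*f.
freqs = np.fft.rfftfreq(seg_len, d=dt)
analytic = sigma ** 2 / (theta ** 2 + (2 * np.pi * freqs) ** 2)

# Mean relative error over low-frequency bins (DC bin excluded).
rel_err = np.mean(np.abs(psd[1:50] - analytic[1:50]) / analytic[1:50])
```

The residual mismatch is dominated by periodogram variance (reduced as 1/√n_seg by segment averaging) rather than by discretization, since the correlation time 1/θ is much longer than dt.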
Monday, March 14, 2022 1:06PM - 1:18PM
B03.00007: Stimulation-induced long-lasting desynchronization of plastic neuronal networks Justus A Kromer, Ali Khaledi-Nasab, Peter A Tass Abnormal neuronal synchrony is a hallmark of Parkinson’s disease (PD). Deep brain stimulation is an established treatment; however, symptoms return shortly after stimulation ceases. Utilizing synaptic plasticity, theory-based approaches, such as coordinated reset stimulation (CRS), induce long-lasting effects by reshaping synaptic connectivity to stabilize desynchronized activity after stimulation ceases. Animal and clinical studies demonstrated corresponding long-lasting therapeutic effects.
Monday, March 14, 2022 1:18PM - 1:30PM
B03.00008: Closed-loop targeted optogenetic stimulation of C. elegans populations Mochi Liu, Sandeep Kumar, Anuj K Sharma, Andrew M Leifer We present a worm tracking setup for closed-loop tracking and optogenetic stimulation of multiple C. elegans worms. The system addresses three specific challenges: it delivers illumination targeted to specific portions of the worm's body, it increases throughput, and it delivers closed-loop stimuli triggered on the worm's behavior state. We demonstrate this new method by investigating how competing mechanosensory stimuli are processed. We use the instrument to optogenetically activate the anterior and posterior soft touch neurons. We find that the probability of reversal is primarily dependent on anterior stimulus intensity, while the probability of sprinting forward depends on both the anterior and posterior stimulus intensities. To understand how current behavior state affects sensory processing, we delivered almost 10,000 stimulus events to the worms during turn onset. With higher statistical power we confirmed our previous finding that the probability of reversal in response to soft touch stimulus is reduced during turns in comparison to forward locomotion. By providing orders of magnitude greater throughput, we expect this device to make accessible new types of investigations.
Monday, March 14, 2022 1:30PM - 1:42PM
B03.00009: Temperature Mediated Transitions in Neuronal Activity Epaminondas Rosa, Rosangela Follmann, Manuela Burek Temperature fluctuations are known to alter biochemical reaction rates, thereby affecting the functional structure of complex cells, including neurons. Considering that living organisms are subject not only to daily fluctuations in temperature but also to overall temperature increases due to climate change, the way neuronal systems respond to temperature may be decisive for the survival of many species. Here we present a mathematical model developed for studying the influence of temperature on neuronal behavior. Our numerical simulations show that increases in temperature within certain ranges also increase neuronal activity, manifested here as higher firing rates with shortened action potential duration. We also find that increased temperatures reduce chaotic transitions between tonic and bursting neuronal regimes. Transitions of this nature have been shown to be relevant in a number of healthy and pathological neuronal systems.
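The qualitative effect — faster firing at higher temperature — can be sketched with a leaky integrate-and-fire neuron whose membrane time constant shortens with temperature via a standard Q10 factor. The Q10 value, the LIF model, and all parameters are illustrative assumptions, not the authors' model:

```python
import numpy as np

# Temperature dependence via a Q10 factor: rate processes speed up
# roughly Q10-fold per 10 degrees C (a common physiological assumption).
Q10, T0 = 3.0, 20.0

def lif_rate(T, tau0=20e-3, V_th=1.0, I=1.5):
    """Firing rate of a leaky integrate-and-fire neuron whose membrane
    time constant shortens with temperature."""
    tau = tau0 / Q10 ** ((T - T0) / 10.0)
    # Time for V(t) = I * (1 - exp(-t / tau)) to reach threshold V_th:
    t_spike = tau * np.log(I / (I - V_th))
    return 1.0 / t_spike

# Firing rate rises monotonically with temperature; since the spike
# time scales with tau, action-potential timescales shorten in step.
rates = [lif_rate(T) for T in (15.0, 20.0, 25.0, 30.0)]
```

Because every rate in this cartoon scales as a single Arrhenius-like factor, it captures only the monotone trend; the tonic-to-bursting transitions in the abstract require the full conductance-based model.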
Monday, March 14, 2022 1:42PM - 1:54PM
B03.00010: Autocorrelations in homeostatic spiking neural networks as a result of emergent bistable activity Johannes Zierenberg, Benjamin Cramer, Markus Kreft, Sebastian Billaudelle, Vitali Karasenko, Aron Leibfried, Eric Müller, Philipp Spilger, Johannes Weis, Johannes Schemmel, Viola Priesemann Using a neuromorphic processor, we emulate networks of excitatory and inhibitory leaky integrate and fire neurons with spiking rates regulated by homeostatic plasticity. The latter incorporates stochastic updates that give rise to heterogeneous weight distributions. As predicted by theory, the network becomes more recurrent for decreasing input strength, which manifests in an increase of the autocorrelation time. Surprisingly, this rise can be attributed to emergent bistable population activity that (i) can be well approximated by a hidden Markov model, (ii) does not appear to vanish for increasing system sizes, and (iii) is likely stabilized by the heterogeneous weight distribution. In addition, we show that networks with bistable population activity allow for a more precise, yet slower representation of additional input that may still be read out once the input is removed.
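The link between bistability and long autocorrelations invites a minimal sanity check: a symmetric two-state ("up"/"down") Markov chain with small switching probability p has autocorrelation (1 - 2p)^k at lag k, i.e. an autocorrelation time of roughly 1/(2p) steps. The switching probability below is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(4)

p = 0.01                          # per-step switching probability (assumed)
n = 200_000
# Two-state population activity as a symmetric Markov chain: each step
# flips the state independently with probability p.
flips = rng.random(n) < p
states = np.cumsum(flips) % 2     # 0/1 state sequence

x = states - states.mean()

def autocorr(x, lag):
    """Empirical autocorrelation at a given lag."""
    return np.mean(x[:-lag] * x[lag:]) / np.mean(x * x)

# Theory: C(lag) = (1 - 2p)^lag, so the autocorrelation time is
# -1 / log(1 - 2p), about 1 / (2p) = 50 steps here.
c50 = autocorr(x, 50)
```

Slow switching between the two activity levels thus produces slowly decaying autocorrelations even when, within each state, the dynamics are fast — the signature the abstract attributes to emergent bistability.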
Monday, March 14, 2022 1:54PM - 2:06PM
B03.00011: Effects of suppressing inhibitory synaptic strength on the dynamics of a network of spiking neurons: A computational study Emily S.C. Ching, H.Y. Li, G.M. Cheng
Monday, March 14, 2022 2:06PM - 2:18PM
B03.00012: Comparison of the synchronization transition of the Kuramoto model on fruit-fly versus a large human connectome Geza Odor, Gustavo Deco, Jeffrey Kelling The Kuramoto equation has been solved numerically on the 21,662-node fruit-fly [1] and the 804,113-node human connectomes. While the fly neural connectome resembles a structureless random graph, the KKI-18 grey matter human connectome exhibits a hierarchical modular organization [2]. The synchronization transition of the fly is mean-field like, with a weak hysteresis, but a narrow Griffiths phase cannot be excluded [3].
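For orientation, the mean-field synchronization transition against which the fly result is compared can be reproduced in miniature with all-to-all coupling; the connectome-specific structure is what makes the full problem hard. System size, couplings, and integration settings below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)

N = 200
omega = rng.standard_normal(N)           # natural frequencies (std = 1)
theta0 = rng.uniform(0, 2 * np.pi, N)    # random initial phases

def simulate(K, steps=2000, dt=0.05):
    """Euler-integrate the all-to-all Kuramoto model and return the
    final order parameter r = |mean(exp(i*theta))|."""
    th = theta0.copy()
    for _ in range(steps):
        z = np.mean(np.exp(1j * th))     # mean field r * exp(i*psi)
        # (K/N) * sum_j sin(th_j - th_i) == K * r * sin(psi - th_i)
        th += dt * (omega + K * np.abs(z) * np.sin(np.angle(z) - th))
    return np.abs(np.mean(np.exp(1j * th)))

# For unit-variance Gaussian frequencies, K_c = 2*sqrt(2/pi) ~ 1.6.
r_weak = simulate(0.5)      # below K_c: incoherent, r ~ O(1/sqrt(N))
r_strong = simulate(4.0)    # above K_c: strongly synchronized
```

On a connectome the mean-field reduction inside the loop no longer applies — each node couples only to its graph neighbors — which is precisely where the structural differences between the fly and KKI-18 networks enter.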