Bulletin of the American Physical Society
APS March Meeting 2023
Volume 68, Number 3
Las Vegas, Nevada (March 5-10)
Virtual (March 20-22); Time Zone: Pacific Time
Session M01: Neurodynamical Models of Cognition (Focus Session)
Sponsoring Units: GSNP DSOFT DBIO | Chair: Jason Kim, Cornell University | Room: Room 124
Wednesday, March 8, 2023 8:00AM - 8:36AM |
M01.00001: How recurrent neural networks infer dynamical models from data for prediction, inference, and source separation Invited Speaker: Zhixin Lu One of the fantastic properties of neural systems is their ability to flexibly understand and interact with the dynamical world without hard-coding prior models. Given the immense dimensionality and complexity associated with studying any specific neural network, is there a theoretical framework that can explain the cognitive capacity of general dynamical neurons? In this talk, I introduce a framework that allows randomly constructed neural networks to learn and manipulate dynamical models from data via generalized synchronization. By extending the concept of synchronization from frequencies and phases to attractor manifolds, we explain the underlying mechanism with which recurrent neural networks learn and manipulate dynamical models from data. We apply this framework to understand how recurrent neural networks 1) simultaneously learn multiple models, 2) infer unseen variables from partially measured dynamics, 3) separate mixed signals from multiple sources, 4) construct continuous representations from discrete examples, and 5) infer global dynamics from local examples. Together, our results provide a simple but powerful mechanism by which dynamical neural networks can learn internal dynamical representations of the complex dynamical world, enabling the principled study and better designs of artificial intelligence. |
Wednesday, March 8, 2023 8:36AM - 8:48AM |
M01.00002: The dynamics of recurrent neural networks throughout learning at the edge of chaos Tala Fakhoury Recurrent neural networks (RNNs) adaptively learn representations of the natural world by strengthening and weakening the interactions between neurons. Such adaptation depends heavily on the network architecture, whereby more excitable networks (termed critical) exhibit greater performance. However, the precise role of intrinsic excitability in successful learning remains largely unknown. Here we demonstrate that intrinsic excitability enables RNNs to form more stable and robust internal representations whose geometry converges throughout learning. Specifically, we adaptively train RNNs with different levels of intrinsic excitability to learn a chaotic attractor, and quantify the learning process at each training step. We find that the unstable Lyapunov exponents of RNNs near criticality converge more quickly and robustly to those of the true attractor. Further, we find that the geometry of the embedded manifold, as measured by the Hausdorff distance, converges to the true attractor manifold for RNNs near criticality, but does not converge for RNNs far from criticality. Taken together, our results demonstrate that intrinsic excitability enables the formation of stable, convergent representations, thereby providing insight into how complex representations are learned adaptively. |
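The abstract's key diagnostic, comparing a learned system's unstable Lyapunov exponents against those of the true attractor, can be illustrated with a standard calculation: Benettin's two-trajectory method applied to the Lorenz system, a common choice of target chaotic attractor. This is an illustrative sketch, not the authors' code; the abstract does not name the attractor used, and all parameter values below are the textbook defaults.

```python
import numpy as np

def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # Classic Lorenz vector field with standard chaotic parameters.
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(f, state, dt):
    # One fourth-order Runge-Kutta step.
    k1 = f(state)
    k2 = f(state + 0.5 * dt * k1)
    k3 = f(state + 0.5 * dt * k2)
    k4 = f(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def largest_lyapunov(f, x0, dt=0.01, n_steps=20000, d0=1e-8):
    # Benettin's method: evolve a reference and a perturbed trajectory,
    # accumulate the log growth of their separation, and renormalize
    # the separation back to d0 after every step.
    x = x0.copy()
    y = x0 + d0 * np.array([1.0, 0.0, 0.0])
    log_sum = 0.0
    for _ in range(n_steps):
        x = rk4_step(f, x, dt)
        y = rk4_step(f, y, dt)
        d = np.linalg.norm(y - x)
        log_sum += np.log(d / d0)
        y = x + (d0 / d) * (y - x)  # rescale separation to d0
    return log_sum / (n_steps * dt)

lam = largest_lyapunov(lorenz, np.array([1.0, 1.0, 1.0]))
```

For the Lorenz attractor the estimate converges near the known value of about 0.9; running the same estimator on a trained RNN's dynamics and on the target system gives the comparison the abstract describes.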
Wednesday, March 8, 2023 8:48AM - 9:00AM |
M01.00003: Robust Memory Manifolds in Neural Networks Tankut U Can, Kamesh Krishnamurthy The ability to store continuous variables in the state of a biological system (e.g. a neural network) is critical for many behaviors. Most models for implementing such a memory manifold require hand-crafted symmetries in the interactions or precise fine-tuning of parameters. We present a general principle that we refer to as frozen stabilization (FS), which allows a family of neural networks to self-organize to a dynamically critical state exhibiting multiple memory manifolds without parameter fine-tuning or symmetries. We find that FS gives rise to networks with a true continuum of fixed points that can function as precise general purpose neural integrators. The network attractor has a complex global geometry, consisting of a union of multiple uncorrelated continuous attractor "maps". Even on a single map, there is a broad range of relaxation timescales which vary along the attractor. Moreover, FS easily produces robust, low-dimensional memory manifolds in small systems with as few as two neurons. This bears directly upon recent experiments uncovering continuous attractor dynamics in small networks like the fly brain. In summary, frozen stabilization leads to robust continuous attractors and a wide range of timescales in recurrent neural networks, without parameter fine-tuning or special symmetries, and without the need for learning. Such memory manifolds could be useful to model biological implementations of integrators or cognitive maps. |
Wednesday, March 8, 2023 9:00AM - 9:12AM |
M01.00004: Mean Field trajectories in a spin model for decision making on the move Dan Gorbonos, Nir S Gov, Iain Couzin How animals navigate and make directional decisions while migrating and foraging is an open puzzle. We have recently proposed a spin-based model for this process, where each optional target presented to the animal is represented by a group of Ising spins with all-to-all connectivity and ferromagnetic intra-group interactions. The inter-group interactions take the form of a vector dot product, depending on the instantaneous relative, and deformed, angle between the targets. The deformation of the angle in these interactions enhances the effective angular differences for small angles, as was found by fitting data from several animal species. We expose here the rich variety of trajectories predicted by the mean-field solutions of the model for systems of three and four targets. We find that, depending on the arrangement of the targets, the trajectories may undergo an infinite series of self-similar bifurcations or have a space-filling property. The bifurcations along the trajectories occur on "bifurcation curves" that determine the overall nature of the trajectories. The angular deformation that was found to fit experimental data is shown to greatly simplify the trajectories. This work demonstrates the rich space of trajectories that emerge from the model. |
Wednesday, March 8, 2023 9:12AM - 9:24AM |
M01.00005: Transitions in Classical Dynamical Systems Negin Moharrami Allafi, Enrique Pujals, Vadim Oganesyan We have been working to elucidate critical properties of transitions in classical dynamical systems by studying 1-D and 2-D recurrence maps, namely the circle map and the Chirikov standard map. |
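The two maps named here have standard textbook forms, sketched below; the winding number of the circle map is the usual diagnostic for its mode-locking transitions (Arnold tongues), and the standard map's kicked-rotor form underlies the transition to widespread chaos. Parameter values are illustrative, not the authors'.

```python
import numpy as np

def circle_map(theta, omega=0.3, K=0.9):
    # Sine circle map: theta' = theta + Omega - (K / 2*pi) * sin(2*pi*theta), mod 1.
    return (theta + omega - K / (2 * np.pi) * np.sin(2 * np.pi * theta)) % 1.0

def standard_map(theta, p, K=1.2):
    # Chirikov standard map: p' = p + K*sin(theta), theta' = theta + p', both mod 2*pi.
    p_new = (p + K * np.sin(theta)) % (2 * np.pi)
    theta_new = (theta + p_new) % (2 * np.pi)
    return theta_new, p_new

def winding_number(omega, K, n=5000):
    # Average rotation per iterate of the unwrapped circle map; plateaus at
    # rational values as omega varies signal mode locking.
    theta = 0.0
    for _ in range(n):
        theta = theta + omega - K / (2 * np.pi) * np.sin(2 * np.pi * theta)
    return theta / n
```

At K = 0 the winding number equals the bare frequency; as K grows toward 1 the locked intervals widen until, at the critical line K = 1, they fill the frequency axis.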
Wednesday, March 8, 2023 9:24AM - 9:36AM |
M01.00006: Frequency Shifting and Induced Stability in Systems of Asymmetrically Coupled Oscillators Joseph C McKinley, Mengsen Zhang, Alice Wead, Christine Williams, Emmanuelle Tognoli, Christopher Beetle The Haken-Kelso-Bunz (HKB) equations describe bistable rhythmic coordination phenomena, which are ubiquitous in biophysical motor, neural, and social systems. Although originally formulated for pairs of coupled oscillators, the HKB model has recently been generalized to larger systems of oscillators with diverse natural frequencies. Existing work on the generalized HKB model has been mostly restricted to the case of symmetric coupling, where any two coupled oscillators equally influence each other's dynamics. However, in natural systems such as social systems and the brain, influences between components are often unequal. In the present work, we generalize our previous work by allowing each oscillator's coupling strength to be asymmetrical, where a given oscillator may be more sensitive to the influence of its partners than vice versa. We find that asymmetric coupling changes the collective dynamics of the oscillator system, shifting the frequency at which the system coordinates. We also show that the phenomenon of induced stability, whereby subsystems of oscillators can be sustained in coordination patterns that would not be stable in isolation, is robust in the presence of asymmetric coupling. Finally, we discuss some applications of this theoretical work in gerontology and neurostimulation. |
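For a single pair of oscillators, the HKB model reduces to the relative-phase equation dphi/dt = delta_omega - a*sin(phi) - 2b*sin(2*phi), whose hallmark bistability (in-phase and anti-phase coordination) is easy to verify numerically. A minimal sketch of that symmetric two-oscillator case, as a baseline for the asymmetric generalization discussed in the talk; the parameter values a = 1, b = 0.5, delta_omega = 0 are illustrative.

```python
import numpy as np

def hkb_rhs(phi, delta_omega=0.0, a=1.0, b=0.5):
    # HKB relative-phase dynamics: dphi/dt = delta_omega - a*sin(phi) - 2b*sin(2*phi).
    # For b/a > 0.25 both phi = 0 (in-phase) and phi = pi (anti-phase) are stable.
    return delta_omega - a * np.sin(phi) - 2 * b * np.sin(2 * phi)

def settle(phi0, dt=0.01, n=20000, **kw):
    # Integrate with forward Euler until the relative phase relaxes.
    phi = phi0
    for _ in range(n):
        phi += dt * hkb_rhs(phi, **kw)
    return (phi + np.pi) % (2 * np.pi) - np.pi  # wrap to (-pi, pi]

in_phase = settle(0.3)              # relaxes toward phi = 0
anti_phase = settle(np.pi - 0.3)    # relaxes toward phi = pi (bistable regime)
```

Lowering b/a below 1/4 destabilizes the anti-phase branch, the classic HKB phase transition; the generalizations in the talk add many oscillators, frequency diversity, and asymmetric coupling on top of this backbone.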
Wednesday, March 8, 2023 9:36AM - 9:48AM |
M01.00007: Dynamics of model oscillatory neuronal networks with adaptive synaptic weights and structure Kanishk Chauhan, Ali Khaledi-Nasab, Alexander B Neiman, Peter A Tass We study the dynamics of phase oscillator networks with variable coupling strength and structure that can represent oscillatory neuronal networks where the spiking dynamics, synaptic weights, and network structure influence each other. We model synaptic weight adaptation by spike-timing-dependent plasticity (STDP) with a longer time scale than neuronal spiking and structural plasticity (SP) that alters the network architecture by adding and eliminating synaptic contacts at a longer time scale than STDP. We study the steady-state dynamics of networks that can settle either in synchronized or desynchronized states. We show that a combination of SP and STDP (STDP+SP) allows for a synchronized state with fewer links than a network with STDP only. With non-identical units, STDP+SP leads to correlations between the oscillators’ natural frequencies and node degrees. In a desynchronized state, STDP+SP leads to a sparser network. In this way, adding SP strengthens both synchronized and desynchronized states. Using a desynchronizing coordinated reset stimulus and a periodic synchronizing stimulus, we show that a network of identical oscillators with STDP+SP may require stronger and longer stimulation to switch between the states compared to a network with STDP only. Furthermore, we confirmed the emergence of a correlation between a neuron’s firing rate and degree using a leaky integrate & fire model of neurons. |
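A stripped-down version of the co-evolving dynamics described here: identical Kuramoto phase oscillators with a slow Hebbian-style weight rule standing in for STDP. The actual STDP and structural-plasticity rules in the talk are more detailed, and all parameters below are illustrative. In this sketch, in-phase pairs strengthen their links while the slow weights chase cos(theta_j - theta_i), and the population settles into a synchronized state.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20
omega = np.zeros(N)                 # identical oscillators for simplicity
theta = rng.uniform(0, 2 * np.pi, N)
K = rng.uniform(0.5, 1.0, (N, N))   # initial coupling weights
np.fill_diagonal(K, 0.0)
eps, dt = 0.05, 0.01                # slow weight adaptation, integration step

for _ in range(40000):
    dtheta = theta[None, :] - theta[:, None]   # dtheta[i, j] = theta_j - theta_i
    # phase dynamics: dtheta_i/dt = omega_i + (1/N) * sum_j K_ij * sin(theta_j - theta_i)
    theta = theta + dt * (omega + (K * np.sin(dtheta)).sum(axis=1) / N)
    # Hebbian-style rule: strengthen in-phase links, weaken anti-phase ones
    K = K + dt * eps * (np.cos(dtheta) - K)
    np.fill_diagonal(K, 0.0)

r = abs(np.exp(1j * theta).mean())  # Kuramoto order parameter (1 = full sync)
```

Once the phases lock, the weights converge toward cos(0) = 1, illustrating the mutual reinforcement of synchrony and coupling that the abstract's STDP+SP combination builds on.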
Wednesday, March 8, 2023 9:48AM - 10:00AM |
M01.00008: High-Order Accuracy Computation of Coupling Functions for Strongly Coupled Oscillators Youngmin Park We develop a general framework for identifying phase-reduced equations for finite populations of coupled oscillators that is valid far beyond the weak coupling approximation. This strategy represents a general extension of the theory from [Wilson and Ermentrout, Phys. Rev. Lett., 123 (2019), 164101] and yields coupling functions that are valid to higher-order accuracy in the coupling strength for arbitrary types of coupling (e.g., diffusive, gap-junction, and chemical synaptic). These coupling functions can be used to understand the behavior of potentially high-dimensional, nonlinear oscillators in terms of their phase differences. The proposed formulation accurately replicates nonlinear bifurcations that emerge as the coupling strength increases and is valid in regimes well beyond those that can be considered using classic weak coupling assumptions. We demonstrate the performance of our approach through two examples. First, we use the diffusively coupled complex Ginzburg--Landau (CGL) model and demonstrate that our theory accurately predicts bifurcations far beyond the range of existing coupling theory. Second, we use a realistic conductance-based model of a thalamic neuron and show that our theory correctly predicts asymptotic phase differences for nonweak synaptic coupling. In both examples, our theory accurately captures model behaviors that weak coupling theories cannot. |
Wednesday, March 8, 2023 10:00AM - 10:12AM |
M01.00009: Interpreting polychronization through the lens of tropical geometry Matthew W Daniels, Advait Madhavan, Mark D Stiles Timing information of neural spikes is thought to be one of many ways in which the brain encodes information. In the temporal coding regime, synaptic delays become crucial for understanding cognitive dynamics, and spiking neural networks with nontrivial synaptic delays are known to express certain complex behaviors such as polychronization. In this talk, we explore the link between such delay networks and a field of mathematics called tropical algebra, which has found applications in fields from optimization and control to field theory. We demonstrate that certain classes of polychronous patterns in spiking integrate-and-fire neural networks are in fact vertices of tropical eigenspaces, and consider what other lessons tropical geometry may have to tell us about temporal dynamics in the brain and in temporally-coded neuromorphic systems. |
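In a (max, +) reading of a delay network, a vector of spike times updates as x'_i = max_j (A_ij + x_j), where A_ij is the propagation delay from unit j to unit i; the tropical eigenvalue of A, which equals the maximum mean cycle weight of the delay graph, then gives the asymptotic period of the timing map, and its eigenvectors are the self-consistent timing patterns. An illustrative sketch (not the authors' construction) using Karp's algorithm for the eigenvalue:

```python
import numpy as np

NEG = -np.inf

def trop_matvec(A, x):
    # (max, +) matrix-vector product: y_i = max_j (A[i, j] + x[j]).
    return np.max(A + x[None, :], axis=1)

def max_cycle_mean(A):
    """Karp's algorithm: the tropical eigenvalue of an irreducible matrix A
    equals the maximum mean cycle weight of its delay graph.
    Assumes the graph is strongly connected (so node 0 reaches everything)."""
    n = A.shape[0]
    D = np.full((n + 1, n), NEG)
    D[0, 0] = 0.0  # D[k, v] = max weight of a length-k walk from node 0 to v
    for k in range(1, n + 1):
        D[k] = np.max(A + D[k - 1][None, :], axis=1)
    best = NEG
    for v in range(n):
        if D[n, v] == NEG:
            continue
        ratios = [(D[n, v] - D[k, v]) / (n - k) for k in range(n) if D[k, v] > NEG]
        best = max(best, min(ratios))
    return best

# Two-neuron delay graph: the 2-cycle has total delay 3 + 2 = 5, mean 2.5.
A = np.array([[0.0, 3.0], [2.0, 0.0]])
lam = max_cycle_mean(A)  # tropical eigenvalue = 2.5
```

Iterating trop_matvec from any initial timing vector eventually advances spike times by lam per cycle, which is the sense in which periodic polychronous patterns sit in tropical eigenspaces.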
Wednesday, March 8, 2023 10:12AM - 10:24AM |
M01.00010: High-order phase reduction applied to remote synchronization Michael Rosenblum We discuss analytical and numerical approaches to phase reduction in networks of coupled oscillators at higher orders of the coupling parameter. In particular, for three coupled Stuart-Landau (SL) oscillators, where the phase can be introduced explicitly, an analytic perturbation procedure yields an explicit second-order approximation [1]. We exploit the analytical result from [1] to analyze the mechanism of remote synchronization (RS). RS, briefly reported by Okuda and Kuramoto as early as 1991, implies that oscillators interacting not directly but via an additional unit (hub) adjust their frequencies and exhibit frequency locking while the hub remains asynchronous. Previous studies uncovered the role of amplitude dynamics and of nonisochronicity: RS appeared in a network of isochronous SL units but not in its first-order phase approximation, the Kuramoto network. Furthermore, RS emerged in networks of phase oscillators with the Kuramoto-Sakaguchi interaction, but not in the case of zero phase shift in the sine-coupling term; this result indicates the role of nonisochronicity. In this work, we analytically demonstrate the role of two factors promoting remote synchrony: the nonisochronicity of the oscillators and the coupling terms appearing in the second-order phase approximation. We explain the contribution of both factors and quantitatively describe the transition to RS. We demonstrate that the RS transition is determined by the interplay of the nonisochronicity and the amplitude dynamics. The impact of the latter factor renders the standard first-order phase-dynamics description of the RS phenomenon invalid. Our result emphasizes the importance of higher-order phase reduction and highlights the crucial role amplitude dynamics may play in governing the behavior of networks of nonlinear oscillators. We show good correspondence between our theory and numerical results for small and moderate coupling strengths and argue that the effect of the amplitude dynamics, neglected in the first-order phase approximation and revealed by the higher-order one, holds for general limit-cycle oscillators. |
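The star-network setting behind remote synchronization can be reproduced in a few lines: two peripheral Stuart-Landau oscillators coupled only through a frequency-mismatched hub. This is an illustrative sketch with made-up parameters (the detuning, coupling strength, and hub frequency are not from the talk); with these values the peripheral frequencies pull together through the hub while the hub itself keeps running at a distant frequency, the signature of RS.

```python
import numpy as np

def sl_network(z, omega, alpha, K, A):
    # Stuart-Landau units on a graph A (A[i, j] = 1 if j drives i):
    # dz_i/dt = (1 + i*omega_i - (1 + i*alpha)|z_i|^2) z_i + K * sum_j A_ij (z_j - z_i)
    coupling = K * (A @ z - A.sum(axis=1) * z)
    return (1 + 1j * omega - (1 + 1j * alpha) * np.abs(z) ** 2) * z + coupling

def simulate(omega, alpha=0.0, K=0.3, T=400.0, dt=0.01, seed=1):
    # Star graph: nodes 0 and 1 are peripherals, node 2 is the hub.
    A = np.array([[0, 0, 1], [0, 0, 1], [1, 1, 0]], dtype=float)
    rng = np.random.default_rng(seed)
    z = np.exp(1j * rng.uniform(0, 2 * np.pi, 3))
    n = int(T / dt)
    phases = np.empty((n, 3))
    for i in range(n):  # RK4 integration of the complex ODEs
        k1 = sl_network(z, omega, alpha, K, A)
        k2 = sl_network(z + 0.5 * dt * k1, omega, alpha, K, A)
        k3 = sl_network(z + 0.5 * dt * k2, omega, alpha, K, A)
        k4 = sl_network(z + dt * k3, omega, alpha, K, A)
        z = z + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        phases[i] = np.angle(z)
    # observed frequencies from the unwrapped phase over the second half
    ph = np.unwrap(phases[n // 2:], axis=0)
    return (ph[-1] - ph[0]) / ((len(ph) - 1) * dt)

# peripherals detuned by 0.01; hub detuned far away at omega = 3
freqs = simulate(omega=np.array([1.0, 1.01, 3.0]))
```

Setting K = 0 recovers the bare detuning between the peripherals; the first-order (Kuramoto) reduction of this isochronous case misses the effect entirely, which is the point of the higher-order reduction in the talk.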
Wednesday, March 8, 2023 10:24AM - 10:36AM |
M01.00011: Theory of Communication Subspaces for Stochastic Recurrent Neural Networks Shivang Rawat, David J Heeger, Stefano Martiniani The brain relies on communication between specialized cortical areas to accomplish complex cognitive tasks. To fully understand and replicate this ability of the brain in artificial systems, we need a deeper understanding of information transfer across cortical areas. We reduce this gap by developing analytical tools to analyze interareal communication in terms of coherence and noise correlations in subpopulations of neurons within and across areas. We take a dynamical systems approach in designing stable stochastic network architectures yielding systems that display features characteristic of neuronal networks. We then show that for a rather broad class of systems we can derive analytical expressions for correlations, power spectra, and coherence. Based on this analysis, we derive a theory that predicts the emergence of communication subspaces as a mechanism for interareal communication, as observed in recent experiments. We illustrate these approaches for two distinct types of circuit models: 1) A dynamically stable stochastic recurrent convolutional neural network trained on image datasets; and 2) A stochastic recurrent circuit implementing divisive normalization, where the responses of neurons are divided by a weighted sum of the activity of a population of neurons. |
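For linear (or linearized) stochastic circuits of the broad class mentioned here, correlations are analytically accessible: the stationary covariance of dx = W x dt + sigma dB, with W stable, solves the Lyapunov equation W S + S W^T + sigma^2 I = 0. A minimal sketch of that generic linear-network calculation (not the authors' circuit models), solving the equation by vectorization:

```python
import numpy as np

def stationary_covariance(W, sigma=1.0):
    """Stationary covariance S of dx = W x dt + sigma dB (W must be stable):
    solves W S + S W^T = -sigma^2 I via the Kronecker-product vectorization
    (kron(W, I) + kron(I, W)) vec(S) = -sigma^2 vec(I)."""
    n = W.shape[0]
    I = np.eye(n)
    lhs = np.kron(W, I) + np.kron(I, W)
    S = np.linalg.solve(lhs, -sigma**2 * I.ravel())
    return S.reshape(n, n)

# Two-population toy circuit: within-population leak plus a feedforward
# link from population 0 to population 1 (weights are illustrative).
W = np.array([[-1.0, 0.0], [0.5, -1.0]])
S = stationary_covariance(W)
rho = S[0, 1] / np.sqrt(S[0, 0] * S[1, 1])  # noise correlation across populations
```

Eigendecomposing such covariance (or cross-covariance) matrices across subpopulations is one way low-dimensional communication subspaces show up in linear theory; the abstract's analysis extends this style of calculation to power spectra and coherence.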
Wednesday, March 8, 2023 10:36AM - 10:48AM |
M01.00012: Modulation of synaptic signaling by calcium feedback loops in dendritic spines Harper Cho Cortical circuits rely on strong recurrent excitation to sustain mental representations. Most of these recurrent excitatory connections are received by small membranous protrusions called dendritic spines that cover the neuron's dendritic shaft. The specific function of spines has been debated for decades as either being purely biochemical, where calcium gradients implement input-specific synaptic plasticity, or electrical, where spines function as electrical compartments that modify synaptic potentials. Recent experimental evidence has shown, however, that spines are indeed electrically isolated from their parent neurons and are capable of activating independently at subthreshold potentials. This electrical isolation can serve as a mechanism by which neuromodulation controls the regime of a cortical circuit's effective recurrence. In this project, we simulate a spatial neuron model with a single spine on a dendrite and a minimal set of channels to study how voltage-dependent calcium feedback loops and calcium-dependent potassium channels in the spine can modify the neuron's synaptic potential, and the implications this may have for cortical circuits such as attractor networks. This project also examines the effect of spine morphology on the effectiveness of spine-based modulation. |
Wednesday, March 8, 2023 10:48AM - 11:00AM |
M01.00013: Neurodynamical computing at the information boundaries of intelligent systems Joseph D Monaco, Grace M Hwang Artificial intelligence has not yet achieved defining features of biological intelligence. Here, we synthesize disciplinary approaches to intelligence to argue that methodological and epistemic biases can be resolved by shifting from cognitivist brain-as-computer metaphors to recognizing the extended interdependence of living systems. By integrating the dynamical systems view of cognition with the distributed feedback of perceptual control, we highlight theoretical gaps in understanding neurodynamical function. Cell assemblies—conceived as reentrant energy flows, not merely identified co-firing groups—establish a physical 'base layer' for neurodynamical computing over information streams from embodiment and situated embedding. We place this base layer within evolutionarily conserved oscillatory and structural features of cortical-hippocampal networks. Our approach grounds embodied cognition in dynamical systems and perceptual control to bypass obstacles arising between artificial intelligence, cognitive science, and computational neuroscience. |
About APS: The American Physical Society (APS) is a non-profit membership organization working to advance the knowledge of physics.
© 2024 American Physical Society. All rights reserved.