Bulletin of the American Physical Society
APS March Meeting 2017
Volume 62, Number 4
Monday–Friday, March 13–17, 2017; New Orleans, Louisiana
Session P5: Non-equilibrium Dynamics of Neural Circuits (Focus Session)
Sponsoring Units: DBIO, GSNP
Chair: Tatyana Sharpee, Salk Institute
Room: 264
Wednesday, March 15, 2017, 2:30PM - 3:06PM
P5.00001: Linear and nonlinear manifolds for neural dynamics. Invited Speaker: Sara A. Solla. The fundamental question of how the dynamics of networks of neurons implement neural computations and information processing remains unanswered. The problem is formidable, as neural activity in any specific area not only reflects its intrinsic population dynamics, but must also represent both inputs to and outputs from that area and the computations being performed. The analysis of neural dynamics in several cortical areas has consistently revealed the existence of low-dimensional neural manifolds, spanned by latent variables that capture a significant fraction of neural variability. We focus on motor cortex, and discuss a new model for neural control of movement in which "neural modes" are the generators of motor behaviors. We review existing evidence in support of neural manifolds, and present novel results on the geometric and dynamic similarities between manifolds associated with different motor tasks. We propose that this manifold-based view of motor cortex dynamics may lead to a better understanding of how the brain controls movement.
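The low-dimensional manifolds described above are typically identified with dimensionality-reduction methods such as PCA. As a hedged illustration (not the speaker's analysis; all data below are synthetic, and the latent/loading construction is purely illustrative), a minimal sketch recovering latent "neural modes" from simulated population activity:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic population activity: 100 neurons driven by 3 shared latent signals.
T, n_neurons, n_latent = 500, 100, 3
latents = rng.standard_normal((T, n_latent))            # latent variables over time
loading = rng.standard_normal((n_latent, n_neurons))    # map from latents to neurons
rates = latents @ loading + 0.1 * rng.standard_normal((T, n_neurons))

# PCA via SVD of the mean-centered activity; the leading right singular
# vectors play the role of "neural modes" spanning the low-d manifold.
centered = rates - rates.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
var_explained = S**2 / np.sum(S**2)

# With 3 latent signals, ~3 modes should capture most of the variance.
print(var_explained[:5])
```

Here the low dimensionality is built in by construction; the nontrivial empirical finding is that real motor cortical activity behaves similarly.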
Wednesday, March 15, 2017, 3:06PM - 3:18PM
P5.00002: Ergodic properties of spiking neuronal networks with delayed interactions. Agostina Palmigiano, Fred Wolf. The dynamical stability of neuronal networks, and the possibility of chaotic dynamics in the brain, pose profound questions about the mechanisms underlying perception. Here we advance the tractability of large networks of exactly solvable neuron models with delayed pulse-coupled interactions. Pulse-coupled delayed systems, whose phase space is infinite-dimensional, can be studied as equivalent systems with a fixed, finite number of degrees of freedom by introducing a delayer variable for each neuron. The Jacobian of the equivalent system can be obtained analytically and evaluated numerically. We find that, depending on the action potential onset rapidness and the level of heterogeneity, the asynchronous irregular regime characteristic of balanced-state networks loses stability with increasing delays to either a slow synchronous irregular or a fast synchronous irregular state. In networks of neurons with slow action potential onset, the transition to collective oscillations leads to an increase of the exponential rate of divergence of nearby trajectories and of the entropy production rate of the chaotic dynamics. The attractor dimension, instead of increasing linearly with delay as reported in many other studies, decreases until the network eventually reaches full synchrony.
Wednesday, March 15, 2017, 3:18PM - 3:30PM
P5.00003: Attractor neural networks with resource-efficient synaptic connectivity. Cengiz Pehlevan, Anirvan Sengupta. Memories are thought to be stored in the attractor states of recurrent neural networks. Here we explore how resource constraints interplay with the memory storage function to shape the synaptic connectivity of attractor networks. We propose that, given a set of memories in the form of population activity patterns, the neural circuit chooses a synaptic connectivity configuration that minimizes a resource usage cost. We argue that the total synaptic weight ($l_1$-norm) in the network measures the resource cost, because synaptic weight is correlated with synaptic volume, which is a limited resource, and is proportional to neurotransmitter release and post-synaptic current, both of which cost energy. Using numerical simulations and replica theory, we characterize optimal connectivity profiles in resource-efficient attractor networks. Our theory explains several experimental observations on cortical connectivity: 1) connectivity is sparse, because synapses are costly; 2) bidirectional connections are overrepresented; and 3) bidirectional connections are stronger, because attractor states need strong recurrence.
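As a hedged sketch of the idea (not the authors' replica calculation; the patterns, margin, learning rate, and shrinkage strength below are illustrative choices), per-neuron perceptron learning combined with an $l_1$ soft-threshold step yields sparse connectivity that still stores the patterns as fixed points of sign dynamics:

```python
import numpy as np

rng = np.random.default_rng(1)

n_neurons, n_patterns = 50, 5
xi = rng.choice([-1.0, 1.0], size=(n_patterns, n_neurons))  # memory patterns

# Per-neuron perceptron learning with an l1 proximal (soft-threshold) step:
# each epoch enforces a stability margin for every stored pattern, then
# shrinks all weights toward zero, pushing unneeded synapses exactly to 0.
margin, lr, shrink = 1.0, 0.1, 1e-3
W = np.zeros((n_neurons, n_neurons))

def enforce_margins(W):
    for mu in range(n_patterns):
        h = W @ xi[mu]                        # local fields at pattern mu
        unstable = xi[mu] * h < margin        # neurons violating the margin
        W[unstable] += lr * np.outer(xi[mu][unstable], xi[mu])
        np.fill_diagonal(W, 0.0)              # no self-connections

for _ in range(500):
    enforce_margins(W)
    W = np.sign(W) * np.maximum(np.abs(W) - shrink, 0.0)  # l1 shrinkage

for _ in range(50):                           # final epochs without shrinkage
    enforce_margins(W)                        # so all margins hold at once

sparsity = float(np.mean(W == 0.0))
stable = all(np.array_equal(np.sign(W @ p), p) for p in xi)
print(f"zero-weight fraction: {sparsity:.2f}, all patterns stable: {stable}")
```

The soft-threshold step is the proximal operator of the $l_1$-norm, so the loop approximately minimizes total synaptic weight subject to the stability constraints.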
Wednesday, March 15, 2017, 3:30PM - 3:42PM
P5.00004: Metastable neural dynamics mediates expectation. Luca Mazzucato, Giancarlo La Camera, Alfredo Fontanini. Sensory stimuli are processed faster when their presentation is expected than when they come as a surprise. We previously showed, in multiple single-unit recordings from alert rat gustatory cortex, that taste stimuli can be decoded faster from neural activity if preceded by a stimulus-predicting cue. However, the specific computational process mediating this anticipatory neural activity is unknown. Here, we propose a biologically plausible model based on a recurrent network of spiking neurons with a clustered architecture. In the absence of stimulation, the model's neural activity unfolds through sequences of metastable states, each state being a population vector of firing rates. We modeled taste stimuli and the cue (the same for all stimuli) as two inputs targeting subsets of excitatory neurons. As observed in experiments, stimuli evoked specific state sequences, characterized in terms of 'coding states', i.e., states occurring significantly more often for a particular stimulus. When stimulus presentation is preceded by a cue, coding states show a faster and more reliable onset, and expected stimuli can be decoded more quickly than unexpected ones. This anticipatory effect is unrelated to changes in the firing rates of stimulus-selective neurons and is absent in homogeneous balanced networks, suggesting that a clustered organization is necessary to mediate the expectation of relevant events. Our results demonstrate a novel mechanism for speeding up sensory coding in cortical circuits.
Wednesday, March 15, 2017, 3:42PM - 3:54PM
P5.00005: Learning about memory from (very) large scale hippocampal networks. Leenoy Meshulam, Jeffrey Gauthier, Carlos Brody, David Tank, William Bialek. Recent technological progress has dramatically increased our access to the neural activity underlying memory-related tasks. These complex, high-dimensional data call for theories that allow us to identify signatures of collective activity in the networks that are crucial for the emergence of cognitive functions. As an example, we study neural activity in dorsal hippocampus as a mouse runs along a virtual linear track. One of the dominant features of these data is the activity of place cells, which fire when the animal visits particular locations. In the first stage of our work we used a maximum entropy framework to characterize the probability distribution of the joint activity patterns observed across ensembles of up to 100 cells. These models, which are equivalent to Ising models with competing interactions, make surprisingly accurate predictions for the activity of individual neurons given the state of the rest of the network, and this is true both for place cells and for non-place cells. Additionally, the model captures high-order structure in the data that cannot be explained by place-related activity alone. In the second stage of our work we study networks of ~2000 neurons. To address this much larger system, we are exploring different methods of coarse graining, in the spirit of the renormalization group, searching for simplified models.
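A minimal sketch of the maximum entropy approach (not the authors' code; the data here are synthetic, and the population is kept tiny so the partition function can be enumerated exactly — real fits to 100 cells require Monte Carlo methods):

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)

# Synthetic binary activity for a small population (5 neurons, +/-1 spins),
# small enough that the 2^5 = 32 states can be enumerated exactly.
n = 5
data = rng.choice([-1.0, 1.0], size=(2000, n), p=[0.7, 0.3])
data_mean = data.mean(axis=0)
data_corr = (data.T @ data) / len(data)

# Pairwise maximum entropy (Ising) fit: gradient ascent on the log-likelihood
# matches the model's means and pairwise correlations to those of the data.
states = np.array(list(itertools.product([-1.0, 1.0], repeat=n)))
h = np.zeros(n)                               # local fields
J = np.zeros((n, n))                          # symmetric couplings, zero diagonal
for _ in range(2000):
    E = states @ h + 0.5 * np.einsum('si,ij,sj->s', states, J, states)
    p = np.exp(E - E.max())
    p /= p.sum()                              # Boltzmann distribution over states
    model_mean = p @ states
    model_corr = np.einsum('s,si,sj->ij', p, states, states)
    h += 0.1 * (data_mean - model_mean)       # moment-matching gradient steps
    dJ = 0.1 * (data_corr - model_corr)
    np.fill_diagonal(dJ, 0.0)
    J += dJ

print(np.max(np.abs(model_mean - data_mean)))
```

Because the log-likelihood is concave in (h, J), this moment-matching ascent converges to the unique maximum entropy model consistent with the measured means and correlations.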
Wednesday, March 15, 2017, 3:54PM - 4:06PM
P5.00006: Non-Equilibrium Driven Dynamics of Continuous Attractors in Place Cell Networks. Weishun Zhong, Hyun Jin Kim, David Schwab, Arvind Murugan. Attractors have found much use in neuroscience as a means of information processing and decision making. Examples include associative memory with point and continuous attractors, spatial navigation and planning using place cell networks, and dynamic pattern recognition, among others. The functional use of such attractors requires the action of spatially and temporally varying external driving signals; yet most theoretical work on attractors has been in the limit of small or no drive. We take steps towards understanding the non-equilibrium driven dynamics of continuous attractors in place cell networks. We establish an 'equivalence principle' that relates fluctuations under a time-dependent external force to equilibrium fluctuations in a 'co-moving' frame with only static forces, much as in Newtonian physics. Consequently, we analytically derive a network's capacity to encode multiple attractors as a function of the driving signal's size and rate of change.
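A hedged toy illustration of a driven continuous attractor (not the authors' model; all parameters are illustrative, and the coupling here is subcritical, so the bump is input-positioned rather than self-sustained): a ring network whose activity bump settles at the location of an external driving signal:

```python
import numpy as np

# Ring network: N rate neurons on a circle with cosine connectivity.
N = 100
theta = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
W = (np.cos(theta[:, None] - theta[None, :]) - 0.5) / N

def step(r, drive, dt=0.1):
    # Rate dynamics with rectification: dr/dt = -r + [W r + drive]_+
    return r + dt * (-r + np.maximum(W @ r + drive, 0.0))

r = np.maximum(np.cos(theta), 0.0)        # initial bump at angle 0
target = np.pi / 2.0
drive = 0.2 * np.cos(theta - target)      # weak drive centered at pi/2
for _ in range(2000):
    r = step(r, drive)

bump = np.angle(np.sum(r * np.exp(1j * theta)))  # circular mean of activity
print(f"bump position: {bump:.2f} rad (drive centered at {target:.2f})")
```

In a true continuous attractor the bump persists without drive; the co-moving-frame analysis in the abstract concerns how such a self-sustained bump fluctuates while being dragged by a time-dependent signal.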
Wednesday, March 15, 2017, 4:06PM - 4:18PM
P5.00007: Dimensionality and entropy of spontaneous and evoked rate activity. Rainer Engelken, Fred Wolf. Cortical circuits exhibit complex activity patterns both spontaneously and evoked by external stimuli. Finding low-dimensional structure in population activity is a challenge. What is the diversity of the collective neural activity, and how is it affected by an external stimulus? Using concepts from ergodic theory, we calculate the attractor dimensionality and dynamical entropy production of these networks. We obtain these two canonical measures of the collective network dynamics from the full set of Lyapunov exponents. We consider a randomly wired firing-rate network that exhibits chaotic rate fluctuations for sufficiently strong synaptic weights. We show that the dynamical entropy scales logarithmically with synaptic coupling strength, while the attractor dimensionality saturates. Thus, despite the increasing uncertainty, the diversity of collective activity saturates for strong coupling. We find that a time-varying external stimulus drastically reduces both entropy and dimensionality. Finally, we analytically approximate the full Lyapunov spectrum in several limiting cases by random matrix theory. Our study opens a novel avenue to characterize the complex dynamics of rate networks and the geometric structure of the corresponding high-dimensional chaotic attractor.
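A minimal sketch of the standard QR-based computation of a full Lyapunov spectrum for a chaotic rate network (the network, gain, and integration parameters are generic illustrations, not the authors' setup):

```python
import numpy as np

rng = np.random.default_rng(3)

# Random firing-rate network dx/dt = -x + J tanh(x) with J_ij ~ N(0, g^2/N);
# for g > 1 the dynamics are chaotic. The spectrum is obtained by evolving
# an orthonormal tangent basis and re-orthonormalizing with QR decompositions.
N, g, dt, n_steps = 100, 2.0, 0.05, 4000
J = rng.standard_normal((N, N)) * g / np.sqrt(N)

def jacobian(x):
    # Instantaneous Jacobian of the flow: D = -I + J diag(1 - tanh(x)^2)
    return -np.eye(N) + J * (1.0 - np.tanh(x) ** 2)

x = rng.standard_normal(N)
Q = np.linalg.qr(rng.standard_normal((N, N)))[0]
log_growth = np.zeros(N)
for _ in range(n_steps):
    x = x + dt * (-x + J @ np.tanh(x))        # Euler step of the rate dynamics
    Q = Q + dt * (jacobian(x) @ Q)            # evolve the tangent basis
    Q, R = np.linalg.qr(Q)                    # re-orthonormalize
    log_growth += np.log(np.abs(np.diag(R)))  # accumulate expansion rates

lyap = np.sort(log_growth / (n_steps * dt))[::-1]  # Lyapunov spectrum
entropy_rate = lyap[lyap > 0].sum()   # upper bound on KS entropy (Ruelle)
dim = int(np.sum(np.cumsum(lyap) > 0))  # integer part of Kaplan-Yorke count
print(f"largest exponent: {lyap[0]:.3f}, entropy rate: {entropy_rate:.3f}")
```

The sum of positive exponents bounds the dynamical (Kolmogorov-Sinai) entropy rate, and the number of leading exponents with positive partial sum gives a rough attractor dimensionality, the two measures discussed in the abstract.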
Wednesday, March 15, 2017, 4:18PM - 4:54PM
P5.00008: Transient amplification and short term memory in neural circuits. Invited Speaker: Kenneth Miller
Wednesday, March 15, 2017, 4:54PM - 5:06PM
P5.00009: Feature to prototype transition in neural networks. Dmitry Krotov, John Hopfield. We discuss models of associative memory with higher-order (higher than quadratic) interactions and their relationship to the neural networks used in deep learning. Associative memory is conventionally described by recurrent neural networks that dynamically converge to stable points. Deep learning typically uses feedforward neural nets without dynamics. However, a simple duality relates these two different views when applied to problems of pattern classification. From the perspective of associative memory, such models deserve attention because they make it possible to store a much larger number of memories than the quadratic case allows. In the dual description, these models correspond to feedforward neural networks with one hidden layer and unusual activation functions transmitting the activities of the visible neurons to the hidden layer. These activation functions are rectified polynomials of a higher degree rather than the rectified linear functions used in deep learning. The network learns representations of the data in terms of features for rectified linear functions, but as the power in the activation function is increased there is a gradual shift to a prototype-based representation; features and prototypes are the two extreme regimes of pattern recognition known in cognitive psychology.
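A hedged sketch of a higher-order associative memory of this kind, using a rectified polynomial interaction function F(x) = max(x, 0)^n (the pattern counts, degree, and corruption level below are illustrative choices, not figures from the talk):

```python
import numpy as np

rng = np.random.default_rng(4)

def relu_poly(x, n):
    # Rectified polynomial F(x) = max(x, 0)^n; n = 2 recovers a Hopfield-like
    # quadratic energy, larger n increases storage capacity.
    return np.where(x > 0, x, 0.0) ** n

def update(sigma, xi, n):
    # Full sweep of the higher-order update: flip each spin to the sign that
    # lowers the energy E = -sum_mu F(xi^mu . sigma).
    new = sigma.copy()
    for i in range(len(sigma)):
        rest = xi @ new - xi[:, i] * new[i]   # overlaps excluding neuron i
        drive = np.sum(relu_poly(rest + xi[:, i], n)
                       - relu_poly(rest - xi[:, i], n))
        new[i] = 1.0 if drive >= 0 else -1.0
    return new

N, P, n = 100, 40, 3                        # 40 patterns, 100 neurons, cubic F
xi = rng.choice([-1.0, 1.0], size=(P, N))   # stored memories

# Retrieve pattern 0 from a corrupted cue (15 flipped bits).
sigma = xi[0].copy()
flip = rng.choice(N, size=15, replace=False)
sigma[flip] *= -1.0
for _ in range(5):
    sigma = update(sigma, xi, n)
overlap = float(np.mean(sigma == xi[0]))
print(f"overlap with stored pattern after retrieval: {overlap:.2f}")
```

Forty patterns exceed the classical ~0.14N capacity of the quadratic model but are comfortably stored here, illustrating the enlarged capacity of higher-order interactions.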
Wednesday, March 15, 2017, 5:06PM - 5:18PM
P5.00010: Abstract Withdrawn