Bulletin of the American Physical Society
APS March Meeting 2018
Monday–Friday, March 5–9, 2018; Los Angeles, California
Session V06: Physics of Neural Networks
Sponsoring Units: DBIO GSNP
Chair: William Bialek, Princeton University
Room: LACC 153A
Thursday, March 8, 2018, 2:30PM - 2:42PM
V06.00001: Attractors in Networks of Bistable Neuronal Units with Depressing Synapses
Bolun Chen, Paul Miller
Populations of neurons with strong excitatory recurrent connections can exhibit bistability in their mean firing rate. This leads to multiple fixed points in a network of weakly coupled bistable units. Short-term synaptic depression induced by neuronal activity may change the stability of fixed points, enabling transitions between different states in both a stimulus-dependent and history-dependent manner. A sequence of such state transitions, similar to those seen in neural data, allows the network to encode the history of time-varying information. It is therefore of great interest to characterize the fixed points (activity states) and the transitions between them in large networks responding to diverse stimuli. To this end, we apply the Wilson-Cowan equations to model bistable units with depressing synapses under time-dependent input. For small networks, we analyze the attractors and bifurcations. With biologically relevant parameters, we uncover an invariant subspace in which bistable fixed points are bounded by a limit cycle. This subspace provides a basis for a dimensionality-reduction formalism that will allow us to compare model results with recorded neural data.
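As a rough, self-contained illustration of the kind of dynamics described here (a single rate unit with invented parameters, not the authors' model), a Wilson-Cowan-style unit with a depressing synapse can be Euler-integrated in a few lines; depending on the input, it settles on either the low- or the high-rate branch:

```python
import math

def f(x, gain=10.0, theta=0.5):
    """Sigmoidal firing-rate nonlinearity."""
    return 1.0 / (1.0 + math.exp(-gain * (x - theta)))

def simulate(I_ext, w=1.2, tau_r=0.01, tau_d=0.5, U=0.2, dt=1e-3, T=2.0):
    """Euler-integrate one bistable rate unit with a depressing synapse:
    dr/dt = (-r + f(w*s*r + I_ext)) / tau_r   (mean firing rate)
    ds/dt = (1 - s)/tau_d - U*s*r             (synaptic resource)
    """
    r, s = 0.0, 1.0
    for _ in range(int(T / dt)):
        dr = (-r + f(w * s * r + I_ext)) / tau_r
        ds = (1.0 - s) / tau_d - U * s * r
        r += dt * dr
        s += dt * ds
    return r

low = simulate(I_ext=0.0)   # no input: settles on the low-rate fixed point
high = simulate(I_ext=0.6)  # strong input selects the high-rate branch
```

Coupling many such units and letting the resource variable `s` gate their interactions is what enables the activity-dependent state transitions described in the abstract.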
Thursday, March 8, 2018, 2:42PM - 2:54PM
V06.00002: Scale invariance of the dynamical rules governing neural systems
Vidit Agrawal, Woodrow Shew
Brain research employs measurement techniques that probe diverse spatial scales of neural activity. A major challenge is to reconcile the small-scale, high-resolution measurements typical of animal research with human brain imaging, which has relatively poor spatial resolution. How do the governing principles of neural network dynamics manifest at different observational length scales? One possibility is that if the system operates near a critical point, as many experiments suggest, then the rules governing its dynamics may be scale-invariant. Here we confirm this possibility in a computational model and demonstrate a new approach to quantifying scale invariance in experimental data.
Thursday, March 8, 2018, 2:54PM - 3:06PM
V06.00003: Searching for collective behavior in a small brain
Xiaowen Chen, Francesco Randi, Andrew Leifer, William Bialek
With only 302 neurons, Caenorhabditis elegans is one of the simplest organisms that exhibits complex neuronal functions such as locomotion, sensing, and associative learning. In larger networks, it is widely believed that function emerges through collective behavior of many interconnected neurons. The development of tools that allow simultaneous recording from a large fraction of all neurons in C. elegans creates the opportunity to ask if such collective behavior is universal, reaching down to the smallest brains. We analyze preliminary experiments on 50+ neurons by building the maximum entropy model that matches the mean activity and pairwise correlations among these neurons. To capture the graded nature of the neuronal responses, we assign each neuron multiple states. These models successfully predict higher order correlations, as well as the activity of single neurons conditional on the rest of the network. The effective energy landscape is glassy, with patterns of activity moving from one basin to another at rates related to the apparent energy barriers separating them. Finally, the parameters of the model indicate that the network operates close to criticality.
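For intuition about the model class, the pairwise maximum entropy construction can be evaluated exactly for a toy binary network (the study uses multi-state neurons; the fields and couplings below are invented purely for illustration):

```python
import itertools, math

def pairwise_maxent_moments(h, J):
    """Exact means and pairwise correlations of a binary maximum entropy
    model P(s) ~ exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j), computed by
    exhaustive enumeration (feasible only for small n)."""
    n = len(h)
    Z = 0.0
    mean = [0.0] * n
    corr = [[0.0] * n for _ in range(n)]
    for s in itertools.product([0, 1], repeat=n):
        E = sum(h[i] * s[i] for i in range(n))
        E += sum(J[i][j] * s[i] * s[j]
                 for i in range(n) for j in range(i + 1, n))
        w = math.exp(E)  # unnormalized Boltzmann weight
        Z += w
        for i in range(n):
            mean[i] += w * s[i]
            for j in range(n):
                corr[i][j] += w * s[i] * s[j]
    mean = [m / Z for m in mean]
    corr = [[c / Z for c in row] for row in corr]
    return mean, corr

# Three neurons: a positive coupling between 0 and 1, neuron 2 uncoupled.
h = [-1.0, -1.0, -1.0]
J = [[0.0, 2.0, 0.0], [0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]
mean, corr = pairwise_maxent_moments(h, J)
```

In practice the fields and couplings are fit so that the model moments match the measured ones; predictions for higher-order correlations then come for free.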
Thursday, March 8, 2018, 3:06PM - 3:18PM
V06.00004: Temporal Organization of Neuronal Avalanches at Criticality
Fabrizio Lombardi, Hans Herrmann, Dietmar Plenz, Lucilla De Arcangelis
Bursty dynamics characterizes many physical systems. In neuronal networks the near-synchronous firing of many neurons gives rise to the so-called neuronal avalanches, a collective phenomenon that is a key feature of resting brain activity. Experiments at all spatial scales have shown that neuronal avalanche sizes and durations follow power law distributions, a typical feature of systems at criticality. Yet avalanche dynamics in neuronal systems remain poorly understood. In this talk I will focus on the relationship between criticality and the temporal structure of avalanches in cortex slice cultures.
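Concretely, an avalanche in binned population activity is usually defined in this literature as a maximal run of nonzero bins, its size being the total spike count of the run (the helper below is a generic illustration, not the authors' analysis code):

```python
def avalanche_sizes(counts):
    """Split a binned spike-count series into avalanches: maximal runs of
    nonzero bins, delimited by empty bins; the size of an avalanche is the
    total number of spikes in the run."""
    sizes, cur = [], 0
    for c in counts:
        if c > 0:
            cur += c          # extend the current avalanche
        elif cur:
            sizes.append(cur)  # an empty bin closes the avalanche
            cur = 0
    if cur:
        sizes.append(cur)      # series may end mid-avalanche
    return sizes

sizes = avalanche_sizes([0, 2, 1, 0, 0, 5, 0, 1, 1, 1, 0])  # -> [3, 5, 3]
```

Histogramming such sizes (and the run durations) over a long recording is what yields the power law distributions cited in the abstract.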
Thursday, March 8, 2018, 3:18PM - 3:30PM
V06.00005: Diversity of Dynamic States in Neural Networks induced by Homeostatic Plasticity
Johannes Zierenberg, Jens Wilting, Viola Priesemann
The dynamics of spiking neural networks differ markedly depending on whether they originate from living organisms or from artificially grown cultures. In living organisms, neural activity shows continuous, fluctuating dynamics, whereas cultured networks develop strong bursts separated by periods of silence. We propose that this is the result of an interplay between (1) network input, which is much weaker in isolated cultures than in the intact brain, and (2) homeostatic plasticity, a slow negative-feedback mechanism that adapts the neural spike rate. Based on our theoretical work, we predict that homeostasis can be harnessed to tune the dynamic state of a network by altering its input strength. Most importantly, this could make it possible to abolish the bursts in cultured neurons and render the dynamics brain-like instead -- a key prerequisite for studying neurological and psychiatric disorders at the network level under laboratory conditions.
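A deliberately minimal caricature (not the authors' spiking model; all numbers invented) captures the proposed interplay: if the stationary rate of a subcritical recurrent network is r = h/(1 - w) for input h and recurrent gain w, and homeostasis slowly adjusts w to hold r at a target, then weak input drives w toward the critical value 1 (bursty dynamics), while strong input keeps the network far from criticality:

```python
def homeostatic_gain(h, r_target=1.0, eps=0.01, steps=200000):
    """Rate model r = h / (1 - w) (subcritical branching approximation)
    with slow homeostatic feedback dw/dt = eps * (r_target - r).
    The fixed point is w = 1 - h / r_target."""
    w = 0.0
    for _ in range(steps):
        r = h / (1.0 - w)
        w += eps * (r_target - r)
        w = min(max(w, 0.0), 0.999)  # keep the rate finite
    return w

w_culture = homeostatic_gain(h=0.01)  # isolated culture: weak input
w_invivo = homeostatic_gain(h=0.5)    # intact brain: strong input
```

The same feedback rule, given different input strengths, thus lands the network in qualitatively different dynamic states, which is the tunability the abstract proposes to exploit.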
Thursday, March 8, 2018, 3:30PM - 3:42PM
V06.00006: Salience of grayscale textures from natural statistics
Tiberiu Tesileanu, John Briguglio, Ann Hermundstad, Mary Conte, Jonathan Victor, Vijay Balasubramanian
The efficient coding hypothesis suggests that the brain takes advantage of statistical regularities in natural scenes to compress sensory data without significant loss of information. This idea leads to predictions about the way in which sensory data is processed. Indeed, this approach has been used to successfully predict the salience of various binary textures given their abundance in natural scenes. The concept of texture in this case was defined in relation to the relative abundances of the 16 possible ways to fill a 2x2 patch with black or white pixels. Due to translational symmetry, this led to a 10-dimensional space for binary textures.
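For reference, the 10 coordinates can be written as correlations within a 2x2 block of ±1 pixels; the parametrization below is our reading of the standard one from this literature (one first-order, four second-order, four third-order, and one fourth-order statistic), shown only as an illustration:

```python
def block_coordinates(p11, p12, p21, p22):
    """Texture coordinates of a 2x2 block of +/-1 pixels: one mean (gamma),
    four pairwise correlations (beta: horizontal, vertical, two diagonals),
    four third-order correlations (theta), and one fourth-order (alpha),
    ten coordinates in total."""
    gamma = (p11 + p12 + p21 + p22) / 4.0
    betas = ((p11 * p12 + p21 * p22) / 2.0,   # horizontal pairs
             (p11 * p21 + p12 * p22) / 2.0,   # vertical pairs
             p11 * p22,                       # main diagonal
             p12 * p21)                       # anti-diagonal
    thetas = (p12 * p21 * p22, p11 * p21 * p22,
              p11 * p12 * p22, p11 * p12 * p21)
    alpha = p11 * p12 * p21 * p22
    return (gamma,) + betas + thetas + (alpha,)

white = block_coordinates(1, 1, 1, 1)      # uniform white: every coordinate 1
checker = block_coordinates(1, -1, -1, 1)  # checkerboard block
```

Averaging these block statistics over all 2x2 patches of an image gives a point in the 10-dimensional texture space mentioned above.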
Thursday, March 8, 2018, 3:42PM - 3:54PM
V06.00007: Self-organization of entorhinal grid modules through commensurate lattice relationships
Louis Kang, Vijay Balasubramanian
A grid cell is a neuron that fires only when an animal reaches certain locations in its enclosure. These locations form a triangular grid of a certain spatial scale. The medial entorhinal cortex contains many grid cells spanning a wide range of scales. These scales are not distributed smoothly but instead cluster around discrete values separated by constant ratios reported in the range 1.3–1.8. Although this modular organization has been shown to be an efficient encoding of spatial location, its origin is unknown. We propose an extension of the standard continuous attractor model for generating grid responses that naturally produces geometric sequences of scales. By introducing excitatory connections between attractor networks, an otherwise smooth distribution of grid scales becomes modular with discrete transitions between preferred values. Moreover, constant scale ratios between successive modules arise through robust geometric relationships between commensurate triangular grids, whose lattice constants are separated by √3 ≈ 1.7, √7/2 ≈ 1.3, or other ratios. We suggest analyses and experiments that test our model and describe its connection to the Frenkel-Kontorova model of condensed matter physics.
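The quoted ratios follow from distances in a triangular lattice: moving m steps along one lattice vector and n along the other covers a distance √(m² + mn + n²) lattice constants, so commensurate superlattices naturally pick out ratios like √3 and √7/2. A quick check (illustrative only):

```python
import math

def lattice_length(m, n):
    """Distance, in units of the lattice constant, between two sites of a
    triangular lattice separated by integer steps (m, n) along the two
    lattice vectors (1, 0) and (1/2, sqrt(3)/2)."""
    return math.sqrt(m * m + m * n + n * n)

ratio_a = lattice_length(1, 1)        # sqrt(3)  ~ 1.73
ratio_b = lattice_length(1, 2) / 2.0  # sqrt(7)/2 ~ 1.32
```

Both values fall inside the experimentally reported 1.3–1.8 range of scale ratios.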
Thursday, March 8, 2018, 3:54PM - 4:06PM
V06.00008: Optimizing population coding with unimodal tuning curves and short-term memory in continuous attractor networks
Hyun Jin Kim, Ila Fiete, David Schwab
A widely used tool for quantifying the precision with which a population of sensory neurons encodes the value of an external stimulus is the Fisher information (FI). Maximizing FI is also a commonly used objective for constructing optimal neural codes. The primary utility and importance of FI arises from its relation to the mean-squared error of an unbiased stimulus estimator through the Cramér-Rao bound. However, it is well known that when neural firing is sparse, optimizing FI can result in codes that perform very poorly when judged by the resulting mean-squared error, a measure with direct biological relevance. Here we construct optimal population codes by minimizing mean-squared error directly and study the scaling properties of the resulting network, focusing on the optimal tuning curve width. We find that the error scales superlinearly with the system size, and this property remains robust in the presence of finite baseline firing. We then extend our results to continuous attractor networks that maintain short-term memory of external stimuli in their dynamics. Here we also find similar scaling properties in the structure of the interactions that minimize bump diffusivity.
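For concreteness, the textbook FI for independent Poisson neurons with Gaussian tuning curves (a standard setup, not the specific model of the talk) is I(s) = Σᵢ fᵢ'(s)²/fᵢ(s); the snippet below evaluates it to show how tuning width enters, and where the Cramér-Rao bound 1/I(s) comes from:

```python
import math

def fisher_info(s, centers, width, rmax=10.0):
    """Fisher information at stimulus s for independent Poisson neurons
    with Gaussian tuning curves f_i(s) = rmax * exp(-(s - c_i)^2 / (2 w^2)):
    I(s) = sum_i f_i'(s)^2 / f_i(s)."""
    I = 0.0
    for c in centers:
        f = rmax * math.exp(-(s - c) ** 2 / (2.0 * width ** 2))
        fprime = f * (c - s) / width ** 2
        I += fprime ** 2 / f
    return I

centers = [i * 0.1 for i in range(-50, 51)]  # evenly tiled preferred stimuli
I_narrow = fisher_info(0.0, centers, width=0.2)
I_wide = fisher_info(0.0, centers, width=1.0)
cramer_rao = 1.0 / I_narrow  # lower bound on the mean-squared error
```

In this idealized setting narrower tuning always raises FI, which is exactly why FI-optimal codes can be misleading for sparse firing: the abstract's point is that minimizing mean-squared error directly gives different optimal widths.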
Thursday, March 8, 2018, 4:06PM - 4:18PM
V06.00009: Antagonism in olfactory receptor neurons and its implications for the perception of odor mixtures
Gautam Nallamala, Joseph Zak, Massimo Vergassola, Venkatesh Murthy
Natural environments feature mixtures of odorants of diverse quantities, qualities and complexities. Olfactory receptor neurons (ORNs) expressing different olfactory receptors are the first layer in the sensory pathway and transmit the olfactory signal to higher regions of the brain. Discriminatory computations are carried out by brain regions such as the olfactory cortex, which receive global information from the ORN ensemble. Yet, the response of ORNs to mixtures is strongly non-additive, and exhibits antagonistic interactions among odorants. Here, we model the processing of mixtures by mammalian ORNs, focusing on the role of inhibitory mechanisms. Theoretically predicted response curves capture experimentally determined responses imaged by a calcium indicator expressed in ORNs of live, breathing mice. We show how antagonism leads to an effective “normalization” of the ensemble response, which arises from a novel mechanism involving the distinct statistical properties of receptor binding and activation, without any recurrent neuronal circuitry. Normalization allows our encoding model to outperform non-interacting models in odor discrimination tasks, and to explain several psychophysical experiments in humans.
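A minimal two-step sketch (competitive binding followed by activation; all parameters invented for illustration, not the authors' fitted model) shows the essence of mixture antagonism: an odorant that binds strongly but activates weakly suppresses the response to a co-presented agonist:

```python
def orn_response(concs, K, eta, hill_n=2.0, act_half=0.5):
    """Competitive-binding sketch of an ORN's mixture response: odorants
    compete for the receptor (binding constants K), bound odorants activate
    it with efficacies eta, and the total activation is passed through a
    saturating nonlinearity."""
    load = sum(c / k for c, k in zip(concs, K))
    activation = sum(e * (c / k)
                     for c, k, e in zip(concs, K, eta)) / (1.0 + load)
    return activation ** hill_n / (activation ** hill_n + act_half ** hill_n)

# Odorant 0: agonist. Odorant 1: binds 10x more tightly but barely activates.
agonist_alone = orn_response([10.0, 0.0], K=[1.0, 0.1], eta=[1.0, 0.05])
with_antagonist = orn_response([10.0, 10.0], K=[1.0, 0.1], eta=[1.0, 0.05])
```

Because every added odorant increases the binding load in the denominator, strong mixtures are automatically squashed, which is the "normalization without recurrent circuitry" highlighted in the abstract.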
Thursday, March 8, 2018, 4:18PM - 4:30PM
V06.00010: Chimera States with Local Coupling: the 5th-Order FitzHugh-Nagumo Model
Andrea Welsh, Flavio Fenton
The FitzHugh-Nagumo model is a simple, two-variable dynamical system that adequately describes many phenomena in excitable biological systems, such as firing neurons. The excitability is modeled via cubic terms added to the otherwise linear differential equations, resulting in either a stable fixed point or a stable limit cycle that describes synchronized oscillating cells. However, chimera states, in which stable fixed-point and limit-cycle regions coexist, are not described by this model, even though they are observed in the heart and the brain. By adding a 5th-order term in the membrane potential to this 3rd-order system, we can recover chimeras that depend only on the initial conditions of the cells. Chimeras have previously been shown in systems with non-local coupling; we present this new system with purely local coupling. We study the dynamics of these chimeras in several situations: in 1-dimensional cables and rings with two different simultaneous dynamics, and in 2-dimensional grids representing tissues with four different simultaneous dynamics. We aim to investigate whether the introduction of this 5th-order term can predict new phenomena in biological systems.
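The role of a quintic term is easiest to see in the radial normal form of a subcritical Hopf bifurcation (a caricature of the mechanism, not the authors' 5th-order FitzHugh-Nagumo equations): the r⁵ term lets a stable rest state coexist with a stable limit cycle, so each cell's attractor is selected purely by its initial condition:

```python
def settle(r0, mu=-0.1, dt=1e-3, T=100.0):
    """Euler-integrate the radial dynamics dr/dt = mu*r + r^3 - r^5 of a
    subcritical-Hopf oscillator. For small negative mu, r = 0 (rest) and a
    finite-amplitude limit cycle are both stable, separated by an unstable
    radius."""
    r = r0
    for _ in range(int(T / dt)):
        r += dt * (mu * r + r ** 3 - r ** 5)
    return r

rest = settle(0.2)         # below the unstable radius: decays to rest
oscillating = settle(0.5)  # above it: attracted to the limit cycle
```

In a locally coupled array of such bistable oscillators, neighboring regions can then sit on different attractors at once, which is the defining feature of a chimera state.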
Thursday, March 8, 2018, 4:30PM - 4:42PM
V06.00011: Optimal modular network for multisensory integration
He Wang, Wen-Hao Zhang, K.Y. Michael Wong, Si Wu
Information from multiple modalities is integrated in the brain in a near-optimal way. Based on a decentralized architecture suggested by physiological and theoretical studies, we investigate how multisensory information is encoded in different components of a Bayes-optimal modular network architecture. In this architecture, each module is able to function independently, and cross-talk among modules is conveyed by feedforward cross-links and reciprocal links. We find that the unisensory likelihoods are encoded in the same-channel connections and the multisensory prior information is encoded in the cross-talk in a distributed manner. The most striking discovery is that the feedforward cross-links and the reciprocal couplings form an antagonistic pair. The feedforward cross-links are inhibitory in the short range but excitatory in the long range, serving to cancel out noise and improve integration for cues with moderate disparity, whereas the reciprocal links are excitatory in the short range but inhibitory in the long range, stabilizing a more reliable population activity. The complementary roles played by different types of cross-talk between multisensory regions can be tested in future experiments on the brain.
Thursday, March 8, 2018, 4:42PM - 4:54PM
V06.00012: Invariances in a Combinatorial Olfactory Receptor Code
Guangwei Si, Jessleen Kanwal, Yu Hu, Christopher Tabone, Jacob Baron, Matthew Berck, Gaetan Vignoud, Aravinthan Samuel
Animals can identify an odorant type across a wide range of concentrations, as well as detect changes in concentration for individual odorants. How olfactory representations are structured to support these functions remains poorly understood. Here, we studied how a full complement of olfactory receptor neurons (ORNs) in the Drosophila larva encodes a broad input space of odorant types and concentrations. We find that dose-response relationships across odorants and ORNs follow the Hill function with shared cooperativity but different activation thresholds. These activation thresholds are drawn from a power law distribution. A fixed activation function and a power law distribution of activation thresholds underlie invariances in the encoding of odorant identity and intensity. Moreover, we find similar temporal response filters of ORNs across odorant types and concentrations. Such uniformity in the temporal filter may allow identity-invariant coding in fluctuating odor environments. Common patterns in ligand-receptor binding and sensory transduction across olfactory receptors may give rise to these observed invariances in the olfactory combinatorial code. Invariant patterns in the activity responses of individual ORNs and the ORN ensemble may simplify decoding by downstream circuits.
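The encoding scheme described, a shared Hill nonlinearity with receptor-specific thresholds drawn from a power law, can be mimicked in a few lines (all constants are illustrative; the paper's fitted cooperativity and threshold distribution are not reproduced here):

```python
import math, random

def hill(c, K, n=1.5, rmax=1.0):
    """Hill dose-response with shared cooperativity n and a
    receptor-specific activation threshold K (half-maximum at c = K)."""
    return rmax * c ** n / (c ** n + K ** n)

random.seed(0)
alpha, K_min, K_max = 2.0, 1e-2, 1e2  # power law p(K) ~ K^-alpha on [K_min, K_max]

def sample_K():
    """Inverse-CDF sampling of the bounded power law."""
    u = random.random()
    a = K_min ** (1.0 - alpha)
    b = K_max ** (1.0 - alpha)
    return (a + u * (b - a)) ** (1.0 / (1.0 - alpha))

thresholds = [sample_K() for _ in range(24)]
# Each receptor's response depends only on c / K, so scaling the
# concentration shifts which receptors are active while preserving
# the rank order of the ensemble, one route to identity invariance.
resp = sorted(hill(1.0, K) for K in thresholds)
```

With log-spaced thresholds, a change in odor concentration recruits new receptors at one end of the ensemble while saturating the other, keeping the combinatorial pattern informative across intensities.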
Thursday, March 8, 2018, 4:54PM - 5:06PM
V06.00013: The Control of Brain Activity Across Spatial and Temporal Scales
Evelyn Tang, Fabio Pasqualetti, Danielle Bassett
Different brain regions are associated with particular functions and distinct propensities for the control of neural dynamics. However, the relation between these functions and control profiles is poorly understood, as is the variation in this relation across spatial and temporal scales. By studying brain networks obtained from diffusion tensor imaging, we probe the control properties of each brain region and investigate their relationship with dynamics across various spatial scales using the Laplacian eigenspectrum. In addition, through analysis of regional modal controllability and partitioning of modes, we determine whether the associated dynamics are fast or slow, transient or oscillatory. We find that brain regions that facilitate control of energetically easy transitions are associated with activity on short length scales, and slow oscillatory but fast transient modes. Conversely, brain regions that facilitate control of difficult transitions are associated with activity on long length scales, and fast oscillatory but slow transient modes. Built on linear dynamical models, our results offer parsimonious explanations for the dynamics and network control profiles supported by regions of differing neuroanatomical structure.
Thursday, March 8, 2018, 5:06PM - 5:18PM
V06.00014: The Physical Brain: New Approaches to Brain Structure, Activity, and Function
Peter Robinson
By viewing the brain as a multiscale physical system, it is possible to circumvent the shortcomings of abstract signal-based and statistical approaches to the analysis of brain structure, activity, and function. Eigenmode approaches enable the key elements of brain structure to be isolated systematically, along with their effects on brain activity and functional measures. Physically based neural field theory permits tractable analysis from sub-mm scales to the whole brain, demonstrating the near-critical state of normal brain operation, relationships between structure and function, nonlinear dynamics, and phase transitions. Results in normal and abnormal states include experimentally verified predictions of electrical and hemodynamic signals, and the successful inversion of functional correlation measures to infer brain structure, including connectivity that cannot be measured directly. These results illustrate the power of physically based modeling to predict, explain, and unify multiple observations across scales. Furthermore, they open up ways to extend biological physics to a host of new phenomena.
© 2018 American Physical Society. All rights reserved.