Bulletin of the American Physical Society
APS March Meeting 2015
Volume 60, Number 1
Monday–Friday, March 2–6, 2015; San Antonio, Texas
Session J33: Focus Session: Physics of Neural Systems II
Sponsoring Units: DBIO | Chair: Ilya Nemenman, Emory University | Room: 208
Tuesday, March 3, 2015 2:30PM - 2:42PM
J33.00001: Super-linear Precision in Simple Neural Population Codes
David Schwab, Ila Fiete
A widely used tool for quantifying the precision with which a population of noisy sensory neurons encodes the value of an external stimulus is the Fisher information (FI). Maximizing the FI is also a common objective when constructing optimal neural codes. The primary utility of the FI is that, through the Cramér-Rao bound, it gives the smallest mean-squared error achievable by any unbiased stimulus estimator. However, it is well known that when neural firing is sparse, optimizing the FI can produce codes that perform very poorly in terms of the resulting mean-squared error, a measure with direct biological relevance. Here we construct optimal population codes by minimizing the mean-squared error directly and study the scaling properties of the resulting network, focusing on the optimal tuning curve width. We then extend our results to continuous attractor networks that maintain short-term memory of external stimuli in their dynamics, and find similar scaling properties in the structure of the interactions that minimize diffusive information loss.
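The Cramér-Rao logic invoked in this abstract can be illustrated with a small numerical sketch (not the authors' construction; the Gaussian tuning curves, widths, and rates below are illustrative assumptions). For independent Poisson neurons the population Fisher information is FI(s) = Σᵢ fᵢ'(s)²/fᵢ(s), and 1/FI bounds the mean-squared error of any unbiased estimator:

```python
import numpy as np

def tuning_curve(s, centers, width, rmax=10.0):
    # Gaussian tuning curves: mean firing rate of each neuron at stimulus s
    return rmax * np.exp(-(s - centers) ** 2 / (2.0 * width ** 2))

def fisher_information(s, centers, width, rmax=10.0, eps=1e-4):
    # For independent Poisson neurons, FI(s) = sum_i f_i'(s)^2 / f_i(s);
    # the tuning-curve derivative is taken numerically here.
    f = tuning_curve(s, centers, width, rmax)
    df = (tuning_curve(s + eps, centers, width, rmax)
          - tuning_curve(s - eps, centers, width, rmax)) / (2.0 * eps)
    return float(np.sum(df ** 2 / f))

centers = np.linspace(-5.0, 5.0, 21)  # preferred stimuli of 21 neurons
fi = fisher_information(0.3, centers, width=0.5)
crb = 1.0 / fi  # Cramér-Rao bound on the mean-squared error
```

In this one-dimensional, densely tuned setting, narrower tuning curves yield larger FI; the abstract's point is that minimizing the mean-squared error directly can favor a different optimum than maximizing FI.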
Tuesday, March 3, 2015 2:42PM - 2:54PM
J33.00002: A finite-size ergodic theory of stable chaos for quantifying information processing in balanced-state networks of spiking neurons
Maximilian Puelma Touzel, Michael Monteforte, Fred Wolf
The stability of a dynamical system constrains its ability to process information, a notion the ergodic theory of chaos is intended to capture and one likely to be important for neuroscience. Asynchronous, irregular network activity can be produced by models in which excitatory and inhibitory inputs are balanced [1]. For negative and sharply pulsed interactions, these networks turn out to be stable. The coexistence of aperiodic activity and stability is called stable chaos [2]. This stability holds only for perturbations up to some finite average strength, beyond which the networks are unstable [3]. This finite-size instability produces entropy not captured by conventional ergodic theory. We derive and use the probability of divergence as a function of perturbation strength to obtain an expression for a finite-size analogue of the Kolmogorov-Sinai (KS) entropy that scales with the perturbation strength and thus deviates from the conventional KS entropy value of 0. This work provides a foundation for understanding the information-processing capacity of networks in the fast-synapse, fast-action-potential-onset, inhibition-dominated regime.
[1] van Vreeswijk, C. & Sompolinsky, H., Science 274:1724-1726 (1996).
[2] Politi, A. et al., EPL 22, 8 (1993).
[3] Monteforte, M. & Wolf, F., PRX 2, 1 (2012).
Tuesday, March 3, 2015 2:54PM - 3:06PM
J33.00003: Input spike trains suppress chaos in balanced neural circuits
Rainer Engelken, Michael Monteforte, Fred Wolf
A longstanding hypothesis holds that structured input to neural circuits enhances the reliability of spiking responses. While single-neuron studies support this hypothesis [Mainen, Sejnowski 1995], the impact of input structure on the dynamics of recurrent networks is not well understood. Earlier studies of the dynamic stability of the balanced state used a constant external input [van Vreeswijk, Sompolinsky 1996; Monteforte, Wolf 2010] or white noise [Lajoie et al. 2014]. We generalize the analysis of dynamical stability to balanced networks driven by input spike trains. An analytical expression for the Jacobian enables us to calculate the full Lyapunov spectrum. We solved the dynamics in numerically exact event-based simulations and calculated Lyapunov spectra, entropy production rates, and attractor dimensions. We examined the transition from constant to stochastic input in various scenarios and find a suppression of chaos by input spike trains. We also find that both independent bursty input spike trains and common input reduce chaos in spiking networks more strongly. Our study extends work on chaotic rate models [Molgedey et al. 1992] to spiking neuron models and opens a novel avenue for studying the role of sensory streams in shaping the dynamics of large networks.
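Computing a full Lyapunov spectrum from a product of Jacobians is typically done with the standard QR (Benettin) method. A minimal sketch, using the two-dimensional Hénon map as a stand-in for the network's event-based dynamics (the map and its parameters are illustrative assumptions, not the authors' model):

```python
import numpy as np

def henon_step(x, a=1.4, b=0.3):
    # One iteration of the Henon map (stand-in for the network update)
    return np.array([1.0 - a * x[0] ** 2 + x[1], b * x[0]])

def henon_jacobian(x, a=1.4, b=0.3):
    # Jacobian of the map at state x (stand-in for the network Jacobian)
    return np.array([[-2.0 * a * x[0], 1.0], [b, 0.0]])

def lyapunov_spectrum(x0, n_steps=5000, n_transient=500):
    # Benettin QR method: propagate an orthonormal frame through the
    # tangent dynamics and accumulate log |diag(R)| at each step.
    x = np.array(x0, dtype=float)
    for _ in range(n_transient):
        x = henon_step(x)
    Q = np.eye(2)
    logs = np.zeros(2)
    for _ in range(n_steps):
        Q, R = np.linalg.qr(henon_jacobian(x) @ Q)
        logs += np.log(np.abs(np.diag(R)))
        x = henon_step(x)
    return logs / n_steps

spec = lyapunov_spectrum([0.1, 0.1])  # [positive exponent, negative exponent]
```

In the network setting the map and Jacobian above would be replaced by the event-based update and the analytical single-spike Jacobian; the sum of positive exponents then gives the entropy production rate, and the Kaplan-Yorke formula gives an attractor dimension.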
Tuesday, March 3, 2015 3:06PM - 3:18PM
J33.00004: Low-dimensional stochastic dynamics underlie the emergence of spontaneous movement in electric fish
Alexandre Melanson, James J. Jun, Jorge F. Mejias, Leonard Maler, Andre Longtin
Observing unconstrained animals can lead to simple descriptions of complex behaviours. We apply this principle here to infer the neural basis of spontaneous movements in electric fish. Long-term monitoring of fish in freely swimming, stimulus-free conditions has revealed a sequence of behavioural states that alternate randomly between periods of activity (movement, high active-sensing rate) and inactivity (no movement, low active-sensing rate). We show that key features of this sequence are well captured by a 1-D diffusion process in a double-well energy landscape, where we assume the existence of a slow variable that modulates the relative depth of the wells. Model validation is two-fold: i) state-duration distributions are well fitted by exponential mixtures, indicating non-stationary transition rates in the switching process; ii) Monte Carlo simulations with progressive tilting of the double well are consistent with the observed transition-triggered average. We interpret the stochastic variable of this dynamical model as a decision-like variable that, upon reaching a threshold, triggers the transition between states. We thus identify threshold crossing as a possible mechanism for spontaneous movement initiation and offer a dynamical explanation for slower behavioural changes.
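The double-well diffusion picture can be sketched with an Euler-Maruyama simulation (a minimal sketch; the quartic potential, noise level, and linear tilt schedule are illustrative assumptions, not quantities fitted to the fish data):

```python
import numpy as np

def simulate_double_well(n_steps=200000, dt=1e-3, noise=0.7,
                         tilt_rate=0.0, seed=0):
    # Overdamped Langevin dynamics in the double-well potential
    # U(x, t) = x**4/4 - x**2/2 - tilt(t)*x, integrated with
    # Euler-Maruyama. tilt(t) grows linearly to mimic a slow variable
    # that modulates the relative depth of the two wells.
    rng = np.random.default_rng(seed)
    x = np.empty(n_steps)
    x[0] = -1.0  # start in the left well (the "inactive" state)
    for i in range(1, n_steps):
        tilt = tilt_rate * i * dt
        drift = -(x[i - 1] ** 3 - x[i - 1] - tilt)
        x[i] = x[i - 1] + drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
    return x

traj = simulate_double_well()
```

With a symmetric potential (tilt_rate=0) the trajectory dwells near one minimum and occasionally hops to the other, giving roughly exponential state durations; a nonzero tilt_rate biases the hopping, which is the mechanism the abstract proposes for the transition-triggered average.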
Tuesday, March 3, 2015 3:18PM - 3:30PM
J33.00005: Millisecond-Scale Motor Encoding in a Cortical Vocal Area
Ilya Nemenman, Claire Tang, Diala Chehayeb, Kyle Srivastava, Samuel Sober
Studies of motor control have almost universally examined firing rates to investigate how the brain shapes behavior. In principle, however, neurons could encode information through the precise temporal patterning of their spike trains as well as (or instead of) through their firing rates. Although the importance of spike timing has been demonstrated in sensory systems, it is largely unknown whether timing differences in motor areas could affect behavior. We tested the hypothesis that significant information about trial-by-trial variations in behavior is represented by spike timing in the songbird vocal motor system. We found that neurons in motor cortex convey information via spike timing far more often than via spike rate and that the amount of information conveyed at the millisecond timescale greatly exceeds the information available from spike counts. These results demonstrate that information can be represented by spike timing in motor circuits and suggest that timing variations evoke differences in behavior.
Tuesday, March 3, 2015 3:30PM - 3:42PM
J33.00006: Mechanical Surface Waves Accompany Action Potential Propagation
Benjamin Machta, Ahmed El Hady
The action potential (AP) is the basic mechanism by which information is transmitted along neuronal axons. Although the excitable nature of axons is understood to be primarily electrical, many experimental studies have shown that a mechanical displacement of the axonal membrane co-propagates with the electrical signal. While the experimental evidence for co-propagating mechanical waves is diverse and compelling, there is no consensus on their physical underpinnings. We present a model in which these displacements arise from the driving of mechanical surface waves, in which potential energy is stored in elastic deformations of the neuronal membrane and cytoskeleton while kinetic energy is stored in the movement of the axoplasmic fluid. In our model these surface waves are driven by the traveling wave of electrical depolarization that characterizes the AP, which alters the electrostatic forces across the membrane as it passes. Our model allows us to predict the shape of the displacement that should accompany any traveling wave of voltage, including the well-characterized AP. We expect our model to serve as a framework for understanding the physical origins and possible functional roles of these mechanical waves in neurobiology. See arXiv:1407.7600.
Tuesday, March 3, 2015 3:42PM - 3:54PM
J33.00007: Stimulation reveals structural drivers of dynamic brain reorganization
Sarah Muldoon, Jean Vettel, Danielle Bassett
Understanding the brain as a complex network of interacting components can provide insight into cognitive function. From this perspective, one can study two types of networks: the anatomical network composed of physical connections between neurons or brain regions, and the functional networks constructed from coherent neurophysiological activity. However, the relationship between these two types of networks is far from understood. Do underlying anatomical networks drive functional networks, and if so, how? Theoretical predictions from linear models suggest that stimulation of certain brain regions can more easily move the brain into different states, forming a type of "control." Yet the brain is far from a linear system. Using a nonlinear model of brain activity derived from diffusion spectrum imaging of white-matter connectivity and Wilson-Cowan dynamics, we test the relationship between regional connectivity patterns and the ability of regional stimulation to impart change in functional network configurations. We find that local regional connectivity relates to network controllability and that the system is sensitive to perturbations in the underlying network structure.
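Wilson-Cowan dynamics couple the mean activities of an excitatory and an inhibitory population through a sigmoidal gain. A minimal single-node sketch (the weights, inputs, and forward-Euler scheme below are illustrative assumptions; the study couples many such nodes through measured white-matter connectivity):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def wilson_cowan(E0, I0, P=1.25, Q=0.0, dt=0.01, n_steps=5000,
                 wEE=16.0, wEI=12.0, wIE=15.0, wII=3.0, tau=1.0):
    # Forward-Euler integration of one Wilson-Cowan node:
    #   tau dE/dt = -E + S(wEE*E - wEI*I + P)
    #   tau dI/dt = -I + S(wIE*E - wII*I + Q)
    # P plays the role of external (e.g., stimulation) drive to E.
    E, I = E0, I0
    trace = np.empty((n_steps, 2))
    for k in range(n_steps):
        dE = (-E + sigmoid(wEE * E - wEI * I + P)) / tau
        dI = (-I + sigmoid(wIE * E - wII * I + Q)) / tau
        E += dt * dE
        I += dt * dI
        trace[k] = (E, I)
    return trace

trace = wilson_cowan(0.1, 0.1)  # columns: E and I activity over time
```

Because the gain is sigmoidal, the node's response to increasing drive P is genuinely nonlinear, which is why, as the abstract notes, linear controllability predictions need to be tested against the full model.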
Tuesday, March 3, 2015 3:54PM - 4:06PM
J33.00008: Cell fate variation of neural stem cells treated with carbon nanotubes
Massooma Pirbhai, Sabrina Jedlicka, Ming Zheng, Slava V. Rotkin
Delivery of materials such as drug compounds or imaging agents for the treatment or diagnosis of disease still presents a biomedical challenge. Nanotechnological advances have presented biomedicine with a number of agents that possess the appropriate size and chemistry to pass through the blood-brain barrier. Functionalized carbon nanotubes are one such agent, which can potentially aid drug and gene delivery to the central nervous system. In addition, carbon nanotubes have already been applied in several areas of nerve-tissue engineering to probe and augment cell behavior, to label and track subcellular components, and to study the growth and organization of neural networks. Although the production of functionalized carbon nanotubes has escalated in recent years, knowledge of the cellular changes associated with exposure to these materials remains limited. In this study, cellular phenotypes such as proliferation, growth, and differentiation were tested in C17.2 neural stem cells treated with single-walled carbon nanotubes functionalized with synthetic ssDNA and RNA. The research has shown irregular cell-fate behavior, which could be due to changes in the cytoskeletal filaments. These results are worth considering when developing strategies to deliver components to the central nervous system.
Tuesday, March 3, 2015 4:06PM - 4:18PM
J33.00009: Chimera States in a Hodgkin-Huxley Model of Thermally Sensitive Neurons
Tera Glaze, Scott Lewis, Kenneth Showalter, Sonya Bahar
Chimera states, in which identically coupled groups of nonlinear oscillators exhibit very different dynamics, with one group performing synchronized oscillations while the other shows desynchronized behavior, have recently been studied in computational models. Chimera states have also been demonstrated experimentally in optical and chemical systems. The behavior is particularly relevant in the context of neural synchronization, given the phenomenon of unihemispheric sleep in many animal species, including some mammals, and the recent observation of asymmetric sleep in human patients with sleep apnea. Here, we characterize chimera states in the Huber-Braun model, a Hodgkin-Huxley-like model of thermally sensitive neurons. We identify parameter regimes that exhibit chimera behavior and phase-cluster states, both in a system with Abrams-Strogatz (mean-field) coupling and in a system with Kuramoto (distance-dependent) coupling.
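A standard diagnostic for chimera states is the Kuramoto order parameter evaluated separately on each group: near 1 for the synchronized group, near 0 for the desynchronized one. A minimal sketch on synthetic phases (the phase data here are illustrative, not output of the Huber-Braun model):

```python
import numpy as np

def order_parameter(phases):
    # Kuramoto order parameter r = |mean(exp(i*theta))|:
    # r -> 1 for a phase-locked group, r -> 0 for an incoherent one.
    return float(np.abs(np.mean(np.exp(1j * np.asarray(phases)))))

rng = np.random.default_rng(1)
sync_group = np.full(100, 0.7)                      # identical phases
desync_group = rng.uniform(0.0, 2.0 * np.pi, 100000)  # scattered phases
r_sync = order_parameter(sync_group)
r_desync = order_parameter(desync_group)
```

In a chimera simulation one would compute r for each oscillator group over time; a persistent gap between the two groups' r values, under identical coupling, is the signature being characterized.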
Tuesday, March 3, 2015 4:18PM - 4:54PM
J33.00010: Coordinated encoding between cell types in the retina: insights from the theory of phase transitions
Invited Speaker: Tatyana Sharpee
In this talk I will describe how the emergence of some types of neurons in the brain can be quantitatively captured by the theory of transitions between different phases of matter. The two key parameters that control the separation of neurons into subclasses are the mean and standard deviation of noise levels among neurons in the population. The mean noise level plays the role of temperature in the classic theory of phase transitions, whereas the standard deviation is equivalent to pressure, in the case of liquid-gas transitions, or to magnetic field for magnetic transitions. Our results account for properties of two recently discovered types of salamander OFF retinal ganglion cells, as well as the absence of multiple types of ON cells. We further show that, across visual stimulus contrasts, retinal circuits continued to operate near the critical point, whose quantitative characteristics matched those expected near a liquid-gas critical point and described by the nearest-neighbor Ising model in three dimensions. Because the retina needs to operate under changing stimulus conditions, the observed parameters of cell types corresponded to metastable states in the region between the spinodal line and the line describing maximally informative solutions. Such properties of neural circuits can maximize information transmission in a given environment while retaining the ability to quickly adapt to a new environment.
Tuesday, March 3, 2015 4:54PM - 5:06PM
J33.00011: Optogenetic stimulation of a meso-scale human cortical model
Prashanth Selvaraj, Andrew Szeri, Jamie Sleigh, Heidi Kirsch
Neurological phenomena like sleep and seizures depend not only on the activity of individual neurons, but also on the dynamics of neuron populations. Meso-scale models of cortical activity provide a means to study neural dynamics at the level of neuron populations, and they offer a safe and economical way to test the effects and efficacy of stimulation techniques on the dynamics of the cortex. Here, we use a physiologically relevant meso-scale model of the cortex, consisting of a set of stochastic, highly nonlinear partial differential equations, to study the hypersynchronous activity of neuron populations during epileptic seizures. We then use optogenetic stimulation to control seizures in a hyperexcited cortex and to induce seizures in a normally functioning cortex. The high spatial and temporal resolution this method offers makes a strong case for the use of optogenetics in treating meso-scale cortical disorders such as epileptic seizures. We use bifurcation analysis to investigate the effect of optogenetic stimulation in the meso-scale model and its efficacy in suppressing the nonlinear dynamics of seizures.
Tuesday, March 3, 2015 5:06PM - 5:18PM
J33.00012: Pore opening dynamics in the exocytosis of serotonin
Guillermo Ramirez-Santiago, Montserrat G. Cercos, Alejandro Martinez-Valencia, Israel Salinas Hernandez, Leonardo Rodríguez-Sosa, Francisco F. De-Miguel
The current view of the exocytosis of transmitter molecules is that it starts with the formation of a fusion pore that connects the intravesicular and extracellular spaces, and is completed by the release of the rest of the transmitter contained in the vesicle upon the full fusion and collapse of the vesicle with the plasma membrane. However, under certain circumstances, a rapid closure of the pore before full vesicle fusion produces only a partial release of the transmitter. Here we show that whole release of the transmitter can occur through fusion pores that remain open for tens of milliseconds without vesicle collapse. This was demonstrated through amperometric measurements of serotonin release from electrodense vesicles in the axon of leech Retzius neurons, combined with mathematical modelling. By modelling transmitter release with a diffusion equation subject to boundary conditions defined by the experiment, we showed that pores with a fast half-rise time constant remained open and allowed full quantum release without vesicle collapse, whereas pores with a slow rise time constant closed rapidly, producing partial release. We conclude that full transmitter release may occur through the fusion pore in the absence of vesicle collapse.
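The distinction between full and partial release can be caricatured with a zeroth-order model in which efflux through an open pore is first-order with time constant tau (a deliberate simplification of the diffusion model in the abstract; the time constants and open durations below are illustrative assumptions):

```python
import numpy as np

def released_fraction(t_open, tau):
    # Fraction of vesicular transmitter released if the fusion pore
    # stays open for a time t_open, assuming first-order efflux with
    # time constant tau (a simplification of diffusion through the pore).
    return 1.0 - np.exp(-t_open / tau)

full = released_fraction(50.0, tau=5.0)    # pore open for tens of ms
partial = released_fraction(2.0, tau=5.0)  # pore closes quickly
```

The qualitative point carries over to the diffusion model: a pore that stays open several efflux time constants empties the vesicle (full quantum release), while early closure truncates the release.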
Tuesday, March 3, 2015 5:18PM - 5:30PM
J33.00013: Synaptic connectivity and spatial memory: a topological approach
Russell Milton, Andrey Babichev, Yuri Dabaghian
In the hippocampus, a network of place cells generates a cognitive map of space, in which each cell is responsive to a particular area of the environment, its place field. The peak response of each cell and the size of each place field have considerable variability. Experimental evidence suggests that place cells encode a topological map of space that serves as a basis of spatial memory and spatial awareness. Using a computational model based on persistent homology, we demonstrate that if the parameters of the place cells' spiking activity fall inside the physiological range, the network correctly encodes the topological features of the environment. We next introduce parameters of synaptic connectivity into the model and demonstrate that failures in synapses that detect coincident neuronal activity lead to spatial-learning deficiencies similar to those observed in rodent models of neurodegenerative diseases. Moreover, we show that these learning deficiencies may be mitigated by increasing the number of active cells and/or by increasing their firing rate, suggesting the existence of a compensatory mechanism inherent to the cognitive map.
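Full persistent homology requires a topological-data-analysis library, but the simplest invariant in this setting, the number of connected components (the 0th Betti number) of the cell-coactivity graph, can be sketched with union-find (the example graph below is illustrative, not data from the model):

```python
def betti0(n_cells, coactive_pairs):
    # 0th Betti number (number of connected components) of the graph
    # whose vertices are place cells and whose edges join pairs with
    # coincident activity, computed with union-find (path halving).
    parent = list(range(n_cells))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for a, b in coactive_pairs:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb  # merge the two components
    return len({find(i) for i in range(n_cells)})

# Four cells, two coactivity clusters: {0, 1, 2} and {3}
components = betti0(4, [(0, 1), (1, 2)])
```

At this crude level, a connected coactivity graph is a prerequisite for encoding the environment as one piece; synaptic failures that delete coincidence-detecting edges raise the Betti number, mirroring the learning deficiencies described above.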