Bulletin of the American Physical Society
APS March Meeting 2019
Volume 64, Number 2
Monday–Friday, March 4–8, 2019; Boston, Massachusetts
Session P67: Physics of Neural Systems I (Focus Session)
Sponsoring Units: DBIO | Chair: Joshua Shaevitz, Princeton University | Room: BCEC 050
Wednesday, March 6, 2019 2:30PM - 3:06PM
P67.00001: What the odor is not: Estimation by elimination
Invited Speaker: Vijay Singh
The olfactory system is thought to encode vast numbers of odors combinatorially in the responses of a much smaller number of broadly sensitive receptors. Here, we propose a method for decoding such a distributed representation. The main idea is that it is much easier to identify what the odor is not, rather than what the odor is. This is because a typical receptor binds to many odorants, so a response below threshold signals the absence of all such odorants. We demonstrate that, for biologically realistic numbers of receptors, response functions, and odor mixture complexity, this remarkably simple method of elimination turns an underdetermined decoding problem into an overdetermined one, allowing accurate determination of the odorants in a mixture and their concentrations. We give a simple neural network realization of our algorithm, which resembles the known circuit architecture of the piriform cortex.
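To make the elimination idea concrete, here is a minimal numerical sketch under illustrative assumptions: a random binary binding matrix, a linear response model with a fixed detection threshold, and a sparse mixture. All parameter values and the least-squares readout are placeholders chosen for the demonstration, not the model of the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
n_receptors, n_odorants, mixture_size = 50, 500, 5

# Hypothetical broad sensitivity: each receptor binds roughly 10% of all odorants.
binds = rng.random((n_receptors, n_odorants)) < 0.1
affinity = binds * rng.uniform(0.5, 1.0, (n_receptors, n_odorants))

# Sparse mixture with positive concentrations.
true_idx = rng.choice(n_odorants, mixture_size, replace=False)
conc = np.zeros(n_odorants)
conc[true_idx] = rng.uniform(0.5, 2.0, mixture_size)

# Linear response with a detection threshold (a stand-in for more realistic
# saturating response functions).
response = affinity @ conc
threshold = 1e-3

# Elimination: a below-threshold receptor signals the absence of every odorant it binds.
silent = response < threshold
excluded = binds[silent].any(axis=0)
candidates = np.flatnonzero(~excluded)

# The surviving candidates form a much smaller, typically overdetermined problem:
# 50 receptor equations for a few dozen unknown concentrations.
sol, *_ = np.linalg.lstsq(affinity[:, candidates], response, rcond=None)
estimate = np.zeros(n_odorants)
estimate[candidates] = sol

print("true odorants:     ", sorted(true_idx.tolist()))
print("recovered odorants:", sorted(np.flatnonzero(estimate > 1e-6).tolist()))
```

With these illustrative numbers, the silent receptors typically rule out most of the 500 odorants, leaving a candidate set small enough for the 50 response equations to determine the mixture and its concentrations.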
Wednesday, March 6, 2019 3:06PM - 3:18PM
P67.00002: Quantifying Multi-neuronal Olfactory Responses in C. elegans
Albert Lin, Vivek Venkatachalam, Wesley Hung, Min Wu, Mei Zhen, Aravinthan Samuel
Complex animal behaviors arise in response to an organism’s environment, as perceived through its senses. Unlike other stimuli animals experience, such as light or sound, chemosensory stimuli form a high-dimensional space, making the computational problem faced by organisms interpreting olfactory cues especially complex. The processes by which olfactory information is encoded are poorly understood.
Wednesday, March 6, 2019 3:18PM - 3:30PM
P67.00003: Seizure Prediction with Machine Learning using Real and Simulated Electrocorticography Data
Louis Nemzer, Robert Worth, Gary D Cravens, Victor Castro, Andon Placzek, Kristina Bolt
Epilepsy is the most common chronic neurological disorder, affecting approximately one percent of people worldwide. Patients whose symptoms are not well controlled by medication often suffer significantly reduced quality of life due to the unpredictable nature of seizures, which are periods of pathological synchronization of neural activity in the brain. Using a surgically implanted intracranial electrode grid, electrocorticography (ECoG) provides better spatial and temporal resolution of brain electrical activity than conventional scalp electroencephalography (EEG). We combine this patient data with simulated output from a full Hodgkin-Huxley calculation using in silico neurons connected with a small-world network topology. Supervised machine learning, a set of powerful and flexible artificial intelligence techniques that allow computers to classify complex data without explicit programming, is employed together with topological data analysis methods, with the goal of developing an algorithm for real-time clinical prediction of seizure risk.
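As a hedged illustration of the machine-learning half of such a pipeline (not the authors' implementation), the sketch below builds labeled windows from a toy multichannel signal, extracts two placeholder features (mean pairwise correlation as a crude synchrony measure, and mean signal power), and trains an off-the-shelf classifier. The channel count, window length, feature choices, and logistic-regression model are all assumptions made for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
fs, n_channels, win = 256, 8, 512            # sampling rate, channels, samples per window

def make_window(synchronized):
    """Toy stand-in for an ECoG window: a shared oscillation plus per-channel noise."""
    t = np.arange(win) / fs
    common = np.sin(2 * np.pi * 10 * t)       # shared 10 Hz component
    gain = 2.0 if synchronized else 0.3       # 'preictal-like' windows are more synchronous
    return gain * common + rng.normal(0, 1, (n_channels, win))

def features(x):
    """Mean pairwise correlation (crude synchrony) and mean signal power."""
    c = np.corrcoef(x)
    sync = c[np.triu_indices(n_channels, k=1)].mean()
    power = np.mean(x ** 2)
    return [sync, power]

X, y = [], []
for label in (0, 1):                          # 0 = interictal-like, 1 = preictal-like
    for _ in range(200):
        X.append(features(make_window(bool(label))))
        y.append(label)

X_tr, X_te, y_tr, y_te = train_test_split(np.array(X), np.array(y), random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))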
Wednesday, March 6, 2019 3:30PM - 3:42PM
P67.00004: Attractor-state itinerancy in neural circuits with synaptic depression
Bolun Chen, Paul Miller
Neural populations with strong excitatory recurrent connections support bistable states in their mean firing rates. Multiple fixed points in a network of such bistable units can be used to model memory retrieval and pattern separation. Short-term synaptic depression can change the stability of these fixed points on a timescale slower than that of the firing-rate dynamics, leading to state transitions that depend on the history of stimuli. We study a minimal model that characterizes multiple fixed points and transitions in response to diverse stimuli, and we show that slow synaptic depression introduces multiple timescales. The interplay between the fast and slow dynamics of synaptic input and depression makes the system’s response sensitive to the amplitude and duration of square-pulse stimuli in a history-dependent manner. Weak cross-couplings further deform the basins of attraction of the different fixed points into intricate, even fractal-like, shapes. Our analysis provides a natural explanation for the system’s rich responses to stimuli of different durations and amplitudes, while demonstrating the capability of bistable neural populations to encode dynamical stimulus features.
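The kind of behavior described can be seen in a minimal sketch: one bistable excitatory rate unit with a Tsodyks-Markram-style depression variable. The sigmoidal gain, weights, and time constants below are illustrative choices, not the parameters of the study; with them, a brief pulse fails to ignite the active state, while a longer pulse of the same amplitude ignites a plateau that is later terminated by the slowly depleting synaptic resources.

```python
import numpy as np

def simulate(pulse_amp, pulse_dur, T=4.0, dt=1e-3):
    """One excitatory rate unit with recurrent weight w and depression variable s:
         tau_r dr/dt = -r + f(w*s*r + I(t))
         ds/dt = (1 - s)/tau_d - U*s*r     (s = available synaptic resources)
    """
    tau_r, tau_d, w, U = 0.1, 0.5, 2.0, 1.0                 # illustrative parameters
    f = lambda x: 1.0 / (1.0 + np.exp(-8.0 * (x - 1.0)))    # sigmoidal rate function
    r, s = 0.0, 1.0
    trace = []
    for k in range(int(T / dt)):
        t = k * dt
        I = pulse_amp if 1.0 <= t < 1.0 + pulse_dur else 0.0   # square pulse at t = 1 s
        r += dt * (-r + f(w * s * r + I)) / tau_r
        s += dt * ((1.0 - s) / tau_d - U * s * r)
        trace.append(r)
    return np.array(trace)

# Identical amplitude, different durations: only the longer pulse switches the unit
# into its active state, and slow depression later switches it back off, so the
# response is history dependent and outlasts the stimulus by a depression-set time.
dt = 1e-3
for dur in (0.05, 0.3):
    trace = simulate(pulse_amp=0.9, pulse_dur=dur)
    print(f"pulse duration {dur:.2f} s: peak rate {trace.max():.2f}, "
          f"time with r > 0.5: {np.sum(trace > 0.5) * dt:.2f} s")
```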
Wednesday, March 6, 2019 3:42PM - 3:54PM
P67.00005: Nonlinear Brain Dynamics via Neural Field Theory
Peter Robinson
Going beyond well-known nonlinear effects at the single-neuron level, nonlinear effects at the systems and whole-brain levels are manifest in epileptic seizures (Hopf bifurcations, limit cycles, saddle cycles), migraines and visual hallucinations (Turing and Hopf-Turing patterns), strong visual stimulation (harmonic and subharmonic generation, phase locking, entrainment), deep brain stimulation therapy of Parkinson’s disease (entrainment, harmonic and subharmonic generation, resonance suppression), the natural 10 Hz alpha rhythm (bistability), and normal sleep-wake dynamics (near-criticality, hysteresis, critical slowing).
Wednesday, March 6, 2019 3:54PM - 4:06PM
P67.00006: Dynamics of excitable tree networks: Application to sensory neurons
Ali Khaledi Nasab, Justus Kromer, Lutz Schimansky-Geier, Alexander Neiman
We study the collective dynamics of diffusively coupled excitable elements on small tree networks with regular and random connectivity, which model sensory neurons with branched myelinated distal terminals. Examples of such neurons include touch receptors, muscle spindles, and some electroreceptors. We develop a theory that predicts the collective spiking activity in the physiologically relevant strong-coupling limit. We show that the mechanism of coherent firing is rooted in the synchronization of the local activity of individual nodes, even though peripheral nodes may receive random, independent inputs. The structural variability of random tree networks translates into variability of the collective network dynamics, leading to a wide range of firing rates and coefficients of variation, an effect that is most pronounced in the strong-coupling regime.
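A toy version of such a network, with FitzHugh-Nagumo units standing in for the excitable elements (an assumption, since the abstract does not specify the node model), couples nodes diffusively through the graph Laplacian of a small regular tree and drives only the leaves with independent noise, so the dependence of the root firing rate on coupling strength can be explored.

```python
import numpy as np

rng = np.random.default_rng(2)

# Regular binary tree of depth 3: node 0 is the root; children of node i are 2i+1, 2i+2.
depth = 3
n = 2 ** (depth + 1) - 1
parents = range((n - 1) // 2)
edges = [(i, 2 * i + 1) for i in parents] + [(i, 2 * i + 2) for i in parents]

# Graph Laplacian for diffusive (gap-junction-like) coupling along the tree.
L = np.zeros((n, n))
for i, j in edges:
    L[i, i] += 1; L[j, j] += 1
    L[i, j] -= 1; L[j, i] -= 1

def root_rate(coupling, T=200.0, dt=0.01, sigma=0.5):
    """FitzHugh-Nagumo nodes in the excitable regime; independent noise enters only
    at the leaves (the 'distal terminals'); spikes are counted at the root."""
    v = -1.2 * np.ones(n)
    w = -0.6 * np.ones(n)
    leaves = np.arange(n // 2, n)
    spikes, above = 0, False
    for _ in range(int(T / dt)):
        xi = np.zeros(n)
        xi[leaves] = rng.normal(size=leaves.size)
        dv = v - v ** 3 / 3 - w - coupling * (L @ v)
        dw = 0.08 * (v + 0.7 - 0.8 * w)
        v += dt * dv + sigma * np.sqrt(dt) * xi        # Euler-Maruyama noise increment
        w += dt * dw
        if v[0] > 1.0 and not above:                   # upward threshold crossing at the root
            spikes += 1
        above = v[0] > 1.0
    return spikes / T

for g in (0.05, 1.0):
    print(f"coupling {g}: root firing rate {root_rate(g):.3f} spikes per unit time")
```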
Wednesday, March 6, 2019 4:06PM - 4:18PM
P67.00007: Role of geometrical cues in neuronal growth
Cristian Staii
I will present a systematic experimental and theoretical investigation of neuronal growth on micro-patterned surfaces. The experimental results show that neurons cultured on surfaces with periodic geometrical patterns display a significant increase in the total length of axons, as well as axonal alignment along preferred growth directions, which are controlled by the surface geometry. We demonstrate that axonal dynamics is governed by nonlinear stochastic differential equations, and we use this theoretical model to measure key dynamical parameters: angular distributions, correlation functions, diffusion coefficients, and cell-surface coupling forces. Our results show that neuronal alignment on these surfaces is completely determined by the surface geometry, and that the growth dynamics can be quantified by a minimal set of experimentally measurable parameters. I will also discuss how to generalize this approach to include mechanical and biochemical external cues, and I will introduce a general framework for the quantitative description of neuronal growth and self-organization in complex environments.
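The form of the stochastic model is not spelled out in the abstract, so purely as an illustration, the sketch below integrates a hypothetical angular Langevin equation in which a periodic drift term (standing in for the cell-surface coupling imposed by the pattern) competes with rotational diffusion, and then reports an alignment order parameter. The drift shape, coupling strength gamma, diffusion coefficient D, and symmetry n_sym are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative angular Langevin dynamics for the growth-cone direction theta:
#   dtheta = -gamma * sin(n_sym * theta) dt + sqrt(2 D) dW
# The sinusoidal drift is a placeholder for cell-surface coupling to a periodic
# pattern with n_sym equivalent directions; gamma and D are hypothetical.
gamma, D, n_sym = 2.0, 0.3, 2
dt, steps, n_axons = 1e-3, 20000, 500

theta = rng.uniform(-np.pi, np.pi, n_axons)
for _ in range(steps):
    drift = -gamma * np.sin(n_sym * theta)
    theta += drift * dt + np.sqrt(2 * D * dt) * rng.normal(size=n_axons)
theta = (theta + np.pi) % (2 * np.pi) - np.pi       # wrap angles to (-pi, pi]

# Alignment order parameter: |<exp(i * n_sym * theta)>| is 0 for isotropic growth
# and approaches 1 when axons align with the pattern directions.
alignment = np.abs(np.mean(np.exp(1j * n_sym * theta)))
print(f"alignment order parameter: {alignment:.2f}")
```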
Wednesday, March 6, 2019 4:18PM - 4:30PM
P67.00008: Long-lasting desynchronization achieved by decoupling stimulation
Justus Kromer, Peter A. Tass
Wednesday, March 6, 2019 4:30PM - 4:42PM
P67.00009: Cooling Reverses Pathological Bifurcations to Spontaneous Firing Caused by Mild Traumatic Injury
Benjamin Barlow, Bela Joos, Anh-Khoi Trinh, Andre Longtin
Experimental studies have revealed that mild trauma, in the form of, e.g., physical pressure or chemical stimuli, can alter the properties of the sodium current, the main current responsible for the voltage swings or “firings” of neurons. This leads to ongoing firing even when the cell should be quiescent. Such pathological firing interferes with the usual input-integration properties of the cell and, in particular, has been implicated in the genesis of pathological pain, which persists even after the injury-producing stimulus is removed. From a dynamical point of view, this mild trauma lowers the threshold for firing. This presentation explores the possibility of using temperature to offset this effect by raising the firing threshold back up (see Chaos 28, 106328 (2018)). Our modeling study predicts that cooling the neuron by just a few degrees, as is possible, e.g., for peripheral nerve cells, can counteract the pathological state. The sensitivity of the sodium current to temperature is the key determinant of this effect.
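A generic illustration of the underlying mechanism (not the injured-axon model of the talk): in a standard Hodgkin-Huxley neuron, scaling the gating kinetics with a Q10 temperature factor shifts the minimum constant current needed for repetitive firing, so temperature acts as a knob on the firing threshold. The Q10 value of 3 and the spike-count criterion are assumptions made for the sketch.

```python
import numpy as np

def rates(V):
    """Standard Hodgkin-Huxley gating rates (1/ms), V in mV."""
    an = 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
    bn = 0.125 * np.exp(-(V + 65) / 80)
    am = 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
    bm = 4.0 * np.exp(-(V + 65) / 18)
    ah = 0.07 * np.exp(-(V + 65) / 20)
    bh = 1.0 / (1 + np.exp(-(V + 35) / 10))
    return an, bn, am, bm, ah, bh

def spike_count(I, temp_C, T=100.0, dt=0.01):
    """Spikes fired under constant current I (uA/cm^2), with Q10-scaled kinetics."""
    q10 = 3.0 ** ((temp_C - 6.3) / 10.0)     # assumed Q10 = 3, referenced to 6.3 C
    gNa, gK, gL = 120.0, 36.0, 0.3           # mS/cm^2
    ENa, EK, EL, C = 50.0, -77.0, -54.4, 1.0
    V, m, h, n = -65.0, 0.05, 0.6, 0.32
    count, above = 0, False
    for _ in range(int(T / dt)):
        an, bn, am, bm, ah, bh = rates(V)
        m += dt * q10 * (am * (1 - m) - bm * m)
        h += dt * q10 * (ah * (1 - h) - bh * h)
        n += dt * q10 * (an * (1 - n) - bn * n)
        I_ion = gNa * m**3 * h * (V - ENa) + gK * n**4 * (V - EK) + gL * (V - EL)
        V += dt * (I - I_ion) / C
        if V > 0 and not above:
            count += 1
        above = V > 0
    return count

# Lowest constant current giving repetitive firing (here: more than 3 spikes in 100 ms)
# at two temperatures; the firing threshold moves with temperature.
for temp in (6.3, 16.3):
    onset = next((I for I in np.arange(0.0, 30.0, 1.0) if spike_count(I, temp) > 3), None)
    print(f"T = {temp:.1f} C: repetitive-firing onset near {onset} uA/cm^2")
```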
Wednesday, March 6, 2019 4:42PM - 4:54PM
P67.00010: Utilizing network analysis and fMRI to infer key language modules and their circuits from healthy human controls
Qiongge Li, Gino Del Ferraro, Luca Pasquini, Kyung K. Peck, Hernán A. Makse, Andrei I. Holodny
Traditional task-based functional Magnetic Resonance Imaging (tb-fMRI) statistical analysis has served as a powerful tool to identify brain areas associated with language. However, it does not provide an explanation of how different functional areas interact and integrate with each other to perform comprehensive language tasks.
Wednesday, March 6, 2019 4:54PM - 5:06PM
P67.00011: The effects of inhibitory neuron fraction on the dynamics of an avalanching neural network
Jacob Carroll, Ada Warren, Uwe Claus Tauber
The statistical analysis of the collective neural activity known as avalanches provides insight
Wednesday, March 6, 2019 5:06PM - 5:18PM
P67.00012: Quantitative Characterization of Neuromorphic Neural Circuits
Jason Platt, Jun Wang, Henry D. I. Abarbanel, Gert Cauwenberghs
NeuroDyn is a neuromorphic very-large-scale integrated (VLSI) circuit capable of modelling four interconnected Hodgkin-Huxley-like neurons coupled through twelve chemical synapses. Its 384 digitally programmable parameters specify ion conductances, reversal potentials, and ion-channel gating variables. Errors during the manufacturing process can result in a large mismatch between a specified design parameter and the value realized in the hardware. Statistical data assimilation (SDA) is a technique for estimating parameters in a nonlinear dynamical system. By injecting a current designed to probe the full dynamical range of the chip and then measuring the four state variables of the NaKL neuron (V, m, h, n), we can estimate the mismatch between the programmed and physical parameters. Characterizing these errors will help standardize manufactured neuromorphic chips so that they can be used interchangeably for applications and research.
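As a drastically simplified stand-in for statistical data assimilation (the real method handles unobserved states and nonlinear parameter dependence), the sketch below exploits the fact that, when all four NaKL state variables and the probe current are measured, the current-balance equation is linear in the maximal conductances, so a least-squares fit recovers the "physical" conductances for comparison with the programmed ones. The reversal potentials, capacitance, probe waveform, and mismatch factors are all hypothetical.

```python
import numpy as np

ENa, EK, EL, C = 50.0, -77.0, -54.4, 1.0     # assumed known reversal potentials / capacitance

def rates(V):
    an = 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
    bn = 0.125 * np.exp(-(V + 65) / 80)
    am = 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
    bm = 4.0 * np.exp(-(V + 65) / 18)
    ah = 0.07 * np.exp(-(V + 65) / 20)
    bh = 1.0 / (1 + np.exp(-(V + 35) / 10))
    return an, bn, am, bm, ah, bh

def simulate(g, I_probe, dt=0.01):
    """Forward NaKL model with maximal conductances g = (gNa, gK, gL)."""
    gNa, gK, gL = g
    V, m, h, n = -65.0, 0.05, 0.6, 0.32
    out = []
    for I in I_probe:
        an, bn, am, bm, ah, bh = rates(V)
        m += dt * (am * (1 - m) - bm * m)
        h += dt * (ah * (1 - h) - bh * h)
        n += dt * (an * (1 - n) - bn * n)
        V += dt * (I - gNa * m**3 * h * (V - ENa) - gK * n**4 * (V - EK) - gL * (V - EL)) / C
        out.append((V, m, h, n))
    return np.array(out)

dt = 0.01
t = np.arange(0, 200, dt)
I_probe = 8.0 + 4.0 * np.sin(2 * np.pi * t / 37.0)        # current probing a wide dynamic range

g_programmed = np.array([120.0, 36.0, 0.3])
g_physical = g_programmed * np.array([0.85, 1.15, 1.3])   # hypothetical fabrication mismatch
V, m, h, n = simulate(g_physical, I_probe).T              # plays the role of measured data

# Current balance, linear in the conductances:
#   I - C dV/dt = gNa m^3 h (V - ENa) + gK n^4 (V - EK) + gL (V - EL)
A = np.column_stack([m**3 * h * (V - ENa), n**4 * (V - EK), V - EL])
b = I_probe - C * np.gradient(V, dt)
g_estimated, *_ = np.linalg.lstsq(A, b, rcond=None)

print("programmed conductances:", g_programmed)
print("estimated physical conductances:", np.round(g_estimated, 2))
```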
Wednesday, March 6, 2019 5:18PM - 5:30PM
P67.00013: Neural Correlates of Cognition in Primary Visual versus Downstream Posterior Cortices During Evidence Accumulation
Sue Ann Koay, David W Tank, Carlos D Brody
The ability of animals to accumulate sensory information across time is fundamental to decision-making. Using a mouse behavioral paradigm in which navigational decisions are based on accumulating pulses of visual cues, I compared neural activity in primary visual cortex (V1) to that in secondary visual and retrosplenial cortices. Even in V1, only a small fraction of neurons had sensory-like responses to cues. Instead, all areas were grossly similar in how their neural populations contained a large variety of task-related information, from sensory to cognitive, including cue timings, accumulated counts, place/time, decision, and reward outcome. Across-trial influences were prevalent, possibly relevant to how animal behavior incorporates past contexts. Intriguingly, all of these variables also modulated the amplitudes of sensory responses. While previous work has often modeled the accumulation process as integration, the observed scaling of sensory responses by accumulated counts instead suggests a recursive process in which sensory responses are gradually amplified. I show that such a multiplicative feedback-loop algorithm explains psychophysical data better than integration, particularly in how performance transitions to following the Weber-Fechner law only at high counts.
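A toy contrast between two readout schemes (not the authors' feedback-loop model): if the accumulated count is read out with fixed additive noise, discrimination performance tracks the count difference; if the stored count carries multiplicative (scalar) noise, as repeated noisy amplification would produce, performance tracks the count ratio, the Weber-Fechner signature. The noise levels and pulse counts below are arbitrary illustration choices.

```python
import numpy as np

rng = np.random.default_rng(4)

def p_correct(nL, nR, mode, trials=20000):
    """Fraction of simulated trials on which the side with more pulses is chosen."""
    if mode == "integration":
        # Perfect integration read out with fixed additive noise:
        # discriminability tracks the count *difference*.
        est_L = nL + 3.0 * rng.normal(size=trials)
        est_R = nR + 3.0 * rng.normal(size=trials)
    else:
        # Scalar (multiplicative) variability, as would arise if the stored count is
        # repeatedly amplified by a noisy gain: noise grows with the count, so
        # discriminability tracks the count *ratio* (Weber-Fechner-like behavior).
        est_L = nL * (1.0 + 0.3 * rng.normal(size=trials))
        est_R = nR * (1.0 + 0.3 * rng.normal(size=trials))
    return np.mean(est_R > est_L)

for nL, nR in [(5, 10), (20, 25), (20, 40)]:
    print(f"{nL:>2} vs {nR:>2}:  integration {p_correct(nL, nR, 'integration'):.2f}"
          f"   multiplicative {p_correct(nL, nR, 'multiplicative'):.2f}")
```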