Bulletin of the American Physical Society
APS March Meeting 2016
Volume 61, Number 2
Monday–Friday, March 14–18, 2016; Baltimore, Maryland
Session E41: Physics of Neural Systems |
Sponsoring Units: DBIO | Chair: Marc Howard, Boston University | Room: 344
Tuesday, March 15, 2016 8:00AM - 8:12AM |
E41.00001: An objective function for Hebbian self-limiting synaptic plasticity rules Claudius Gros, Samuel Eckmann, Rodrigo Echeveste Objective functions, formulated in terms of information-theoretical measures with respect to the input and output probability distributions, provide a useful framework for formulating guiding principles for information-processing systems such as neural networks. In the present work, a guiding principle for neural plasticity is formulated in terms of an objective function expressed as the Fisher information with respect to an operator that we denote the synaptic flux \footnote{Echeveste \& Gros, \textbf{Front. Robot. AI} 1, 2014}. By minimizing this objective function, we obtain Hebbian self-limiting synaptic plasticity rules that avoid unbounded weight growth. Furthermore, we show that the rules are selective to directions of maximal negative excess kurtosis, making them suitable for independent component analysis. As an application, the non-linear bars problem \footnote{F\"{o}ldiak, \textbf{Biol. Cybern.} 64: 165–170, 1990} is studied, in which each neuron is presented with a non-linear superposition of horizontal and vertical bars. We show that, under the rules presented here, the neurons are able to find the independent components of the input.
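A minimal sketch (not the authors' code; grid size and bar probability are illustrative) of the stimulus generation in Földiák's non-linear bars problem referenced in the abstract:

```python
import numpy as np

def bars_stimulus(size=5, p=0.2, rng=None):
    """One stimulus for the non-linear bars problem: each horizontal
    and vertical bar appears independently with probability p, and
    overlaps saturate (a pixel is 1 if covered by at least one bar)."""
    rng = np.random.default_rng() if rng is None else rng
    img = np.zeros((size, size))
    for r in range(size):
        if rng.random() < p:
            img[r, :] = 1.0          # horizontal bar
    for c in range(size):
        if rng.random() < p:
            img[:, c] = 1.0          # vertical bar (overlap stays 1)
    return img

stim = bars_stimulus(size=5, p=0.3, rng=np.random.default_rng(0))
print(stim)
```

The non-linearity is the saturation at crossings: a pixel covered by two bars is still 1, not 2, which is what makes recovering the independent bar components non-trivial.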
Tuesday, March 15, 2016 8:12AM - 8:24AM |
E41.00002: Calculation of correlation function of a spatially coupled spiking neural network Siwei Qiu, Carson Chow The dynamics of a large but finite number of coupled spiking neurons is not well understood. We analyze finite-size effects in a network of synaptically coupled theta neurons. We show how the system can be characterized by a functional integral from which finite-size effects are calculated perturbatively. We discuss the implications of this technique for bump attractors.
Tuesday, March 15, 2016 8:24AM - 8:36AM |
E41.00003: The role of symmetry in the regulation of brain dynamics Evelyn Tang, Chad Giusti, Matthew Cieslak, Scott Grafton, Danielle Bassett Synchronous neural processes regulate a wide range of behaviors from attention to learning. Yet structural constraints on these processes are far from understood. We draw on new theoretical links between structural symmetries and the control of synchronous function to offer a reconceptualization of the relationships between brain structure and function in human and non-human primates. By classifying 3-node motifs in macaque connectivity data, we find the most prevalent motifs can theoretically ensure a diversity of function including strict synchrony as well as control to arbitrary states. The least prevalent motifs are theoretically controllable to arbitrary states, which may not be desirable in a biological system. In humans, regions with high topological similarity of connections (a continuous notion related to symmetry) are most commonly found in fronto-parietal systems, which may account for their critical role in cognitive control. Collectively, our work underscores the role of symmetry and topological similarity in regulating dynamics of brain function.
Tuesday, March 15, 2016 8:36AM - 8:48AM |
E41.00004: Synchronization in a non-uniform network of excitatory spiking neurons Rodrigo Echeveste, Claudius Gros Spontaneous synchronization of pulse-coupled elements is ubiquitous in nature and seems to be of vital importance for life \footnote{Strogatz \& Stewart, \textbf{Sci. Am.} 269(6): 102-109, 1993.}. Networks of pacemaker cells in the heart \footnote{Peskin, \textbf{Mathematical aspects of heart physiology}, Courant Institute of Mathematical Sciences, New York University, 1975.}, extended populations of Southeast Asian fireflies \footnote{Buck, \textbf{Quarterly Review of Biology} 265-289, 1988.}, and neuronal oscillations in cortical networks \footnote{Buzsaki \& Draguhn, \textbf{Science} 304(5679): 1926-1929, 2004.} are examples of this. In the present work, a rich repertoire of dynamical states with different degrees of synchronization is found in a network of excitatory-only spiking neurons connected in a non-uniform fashion. In particular, uncorrelated and partially correlated states are found without the need for inhibitory neurons or external currents. The phase transitions between these states, as well as the robustness, stability, and response of the network to external stimuli, are studied.
Tuesday, March 15, 2016 8:48AM - 9:00AM |
E41.00005: Information Transmission and Anderson Localization in two-dimensional networks of firing-rate neurons Joseph Natale, George Hentschel Firing-rate networks offer a coarse model of signal propagation in the brain. Here we analyze sparse, 2D planar firing-rate networks with no synapses beyond a certain cutoff distance. Additionally, we impose Dale's Principle to ensure that each neuron makes only excitatory or only inhibitory outgoing connections. Using spectral methods, we find that the number of neurons participating in excitations of the network becomes insignificant whenever the connectivity cutoff is tuned to a value near or below the average interneuron separation. Further, neural activations exceeding a certain threshold stay confined to a small region of space. This behavior is an instance of Anderson localization, a disorder-induced phase transition by which an information channel is rendered unable to transmit signals. We discuss several potential implications of localization for both local and long-range computation in the brain.
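The localization diagnostic behind this abstract can be illustrated with a participation-ratio computation on a toy distance-cutoff network (a sketch with assumed parameters, not the authors' model; the symmetrization step is only there to keep the eigenvectors real):

```python
import numpy as np

def participation_ratios(N=200, cutoff=0.05, seed=3):
    """Place N neurons at random positions in the unit square, connect
    pairs closer than `cutoff`, give each neuron's outgoing weights a
    single sign (Dale's Principle), and compute the participation
    ratio PR(v) = 1 / sum_i v_i^4 for each eigenvector v of the
    (symmetrized) connectivity matrix.  PR near 1 means the mode is
    localized on a few neurons; PR near N means it is extended."""
    rng = np.random.default_rng(seed)
    pos = rng.random((N, 2))
    d = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)
    A = ((d < cutoff) & (d > 0)).astype(float)
    sign = rng.choice([-1.0, 1.0], N)      # excitatory or inhibitory rows
    W = A * sign[:, None]
    vals, vecs = np.linalg.eigh((W + W.T) / 2)
    pr = 1.0 / (vecs**4).sum(axis=0)
    return pr

pr = participation_ratios()
print(pr.min(), pr.mean(), pr.max())
```

With the cutoff this far below the mean interneuron spacing, most modes concentrate on small clusters, the regime the abstract identifies with localized, non-transmitting activity.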
Tuesday, March 15, 2016 9:00AM - 9:12AM |
E41.00006: Controlling chaos in balanced neural circuits with input spike trains Rainer Engelken, Fred Wolf The cerebral cortex can be seen as a system of neural circuits driving each other with spike trains. Here we study how the statistics of these spike trains affects chaos in balanced target circuits. Earlier studies of chaos in balanced neural circuits either used a fixed input [van Vreeswijk, Sompolinsky 1996; Monteforte, Wolf 2010] or white noise [Lajoie et al. 2014]. We study dynamical stability of balanced networks driven by input spike trains with variable statistics. The analytically obtained Jacobian enables us to calculate the complete Lyapunov spectrum. We solved the dynamics in event-based simulations and calculated Lyapunov spectra, entropy production rate and attractor dimension. We vary correlations, irregularity, coupling strength and spike rate of the input and action potential onset rapidness of recurrent neurons. We generally find a suppression of chaos by input spike trains. This is strengthened by bursty and correlated input spike trains and increased action potential onset rapidness. We find a link between response reliability and the Lyapunov spectrum. Our study extends findings in chaotic rate models [Molgedey et al. 1992] to spiking neuron models and opens a novel avenue to study the role of projections in shaping the dynamics of large neural circuits.
Tuesday, March 15, 2016 9:12AM - 9:24AM |
E41.00007: Simulation of dendritic growth reveals necessary and sufficient parameters to describe the shapes of dendritic trees. Olivier Trottier, Sujoy Ganguly, Hugo Bowne-Anderson, Xin Liang, Jonathon Howard For the last 120 years, the development of neuronal shapes has been of great interest to the scientific community. Over the last 30 years, significant work has been done on the molecular processes responsible for dendritic development. In our ongoing research, we use the class IV sensory neurons of the \textit{Drosophila melanogaster} larva as a model system to understand the growth of dendritic arbors. Our main goal is to elucidate the mechanisms that the neuron uses to determine the shape of its dendritic tree. We have observed the development of the class IV neuron's dendritic tree in the larval stage and have concluded that morphogenesis is defined by 3 distinct processes: 1) branch growth, 2) branching and 3) branch retraction. As the first step towards understanding dendritic growth, we have implemented these three processes in a computational model. Our simulations are able to reproduce the branch length distribution, number of branches and fractal dimension of the class IV neurons for a small range of parameters.
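The three processes named in the abstract can be sketched as a toy stochastic simulation (illustrative rates, not fitted to class IV neurons, and tracking only branch lengths rather than full tree geometry):

```python
import random

def simulate_arbor(steps=500, v_grow=1.0, p_branch=0.01,
                   p_retract=0.005, seed=0):
    """Toy model of the three morphogenesis processes:
    1) each branch tip elongates at speed v_grow per step,
    2) tips spawn a new daughter branch with probability p_branch,
    3) tips retract with probability p_retract;
    branches whose length drops to zero are pruned."""
    random.seed(seed)
    lengths = [0.0]                        # one initial branch
    for _ in range(steps):
        new = []
        for L in lengths:
            if random.random() < p_retract:
                L -= v_grow                # retraction step
            else:
                L += v_grow                # growth step
            if L > 0:
                new.append(L)
                if random.random() < p_branch:
                    new.append(0.0)        # branching: new daughter tip
        lengths = new if new else [0.0]    # re-seed if tree fully retracts
    return lengths

branches = simulate_arbor()
print(len(branches), max(branches))
```

Even this stripped-down version produces a broad branch-length distribution, the kind of statistic the abstract compares against the measured neurons.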
Tuesday, March 15, 2016 9:24AM - 9:36AM |
E41.00008: Partial Synchronization in Pulse-Coupled Oscillator Networks I: Theory Jan Engelbrecht, Bolun Chen, Renato Mirollo We study $N$ identical integrate and fire model neurons coupled in an all-to-all network through $\alpha$-function pulses, weighted by a parameter $K$. Studies of the dynamics of this system often focus on the stability of the fully synchronous and the fully asynchronous splay states, which naturally depend on the sign of $K$, i.e. excitation vs inhibition. We find that for finite $N$ there is a rich set of other partially synchronized attractors, such as $(N-1,1)$ fixed states and partially synchronized splay states. Our framework exploits the neutrality of the dynamics for $K=0$, which allows us to implement a dimensional reduction strategy that replaces the discrete pulses with a continuous flow, with the sign of $K$ determining the flow direction. This framework naturally incorporates a hierarchy of partially synchronized subspaces in which the new states lie. For $N=2,\;3,\;4$, we completely describe the sequence of bifurcations and the stability of all fixed points and limit cycles.
Tuesday, March 15, 2016 9:36AM - 9:48AM |
E41.00009: Partial Synchronization in Pulse-Coupled Oscillator Networks II: A Numerical Study Bolun Chen, Jan R. Engelbrecht, Renato Mirollo We use high-precision numerical simulations to compute the dynamics of $N$ identical integrate and fire model neurons coupled in an all-to-all network through $\alpha$-function pulses. In particular, we determine the discrete evolution of the state of our system from spike to spike. In addition to traditional fully synchronous and splay states, we exhibit multiple competing partially synchronized ordered states, which are fixed points and limit cycles in the phase space. Close examinations reveal the bifurcations among different states. By varying the parameters, we map out the phase diagram of stable fixed points. Our results illustrate the power of dimensional reduction in complex dynamical systems, and shed light on the collective behaviors of neural networks.
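A rough fixed-step sketch of the class of model studied in this pair of abstracts: $N$ integrate-and-fire neurons, all-to-all excitatory coupling through a shared $\alpha$-function pulse. The authors compute exact spike-to-spike maps at high precision; this Euler simulation with leaky units and illustrative parameters is only a sanity check, not their method:

```python
import numpy as np

def simulate_if_alpha(N=4, K=0.1, I0=1.2, tau_s=0.5, T=50.0,
                      dt=0.001, seed=1):
    """N leaky integrate-and-fire neurons dV/dt = I0 - V + K*s(t),
    reset V -> 0 at threshold V = 1.  The alpha-function pulse
    s(t) ~ t*exp(-t/tau_s) is realized by the two-variable system
    ds/dt = (x - s)/tau_s,  dx/dt = -x/tau_s, with x kicked by 1/N
    on every spike.  Returns the list of (time, neuron) spikes."""
    rng = np.random.default_rng(seed)
    V = rng.uniform(0, 1, N)               # random initial phases
    s = x = 0.0
    spikes, t = [], 0.0
    while t < T:
        V += dt * (I0 - V + K * s)
        x += dt * (-x / tau_s)
        s += dt * ((x - s) / tau_s)
        fired = V >= 1.0
        if fired.any():
            spikes.extend((t, int(i)) for i in np.where(fired)[0])
            x += fired.sum() / N           # each spike feeds the pulse
            V[fired] = 0.0                 # reset
        t += dt
    return spikes

spk = simulate_if_alpha()
print(len(spk))
```

Inspecting the relative spike times of the $N$ units over a long run is the crude analogue of the phase-space analysis in the abstracts: clusters of coincident spikes correspond to partially synchronized states.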
Tuesday, March 15, 2016 9:48AM - 10:00AM |
E41.00010: Critical behavior of large maximally informative neural populations John Berkowitz, Tatyana Sharpee We consider maximally informative encoding of scalar signals by neural populations. In a small time window, neural responses are binary, with spiking probability that follows a sigmoidal tuning curve. The width of the tuning curve represents effective noise in neural transmission. Previous analyses of this problem for relatively small numbers of neurons with identical noise parameters indicated the presence of multiple bifurcations that occurred with decreasing noise value. For very high noise values, maximal information is achieved when all neurons have the same threshold values. With decreasing noise, the threshold values split into two or more groups via a series of bifurcations, until finally each neuron has a different threshold. Analyzing this problem in the large N limit, we found instead that there is a single phase transition from redundant coding to coding based on distributed thresholds. The order parameter of this transition is the threshold standard deviation across the population; differences in noise parameter from the mean are analogous to local magnetic fields. Near the bifurcation point, information transmitted follows a Landau expansion. We use this expansion to quantify the scaling of the order parameter with noise and effective magnetic field.
Tuesday, March 15, 2016 10:00AM - 10:12AM |
E41.00011: Spike frequency adaptation is a possible mechanism for control of attractor preference in auto-associative neural networks. James Roach, Leonard Sander, Michal Zochowski Auto-associative memory is the ability to retrieve a pattern from a small fraction of the pattern and is an important function of neural networks. Within this context, memories that are stored within the synaptic strengths of networks act as dynamical attractors for network firing patterns. In networks with many encoded memories, some attractors will be stronger than others. This presents the problem of how networks switch between attractors depending on the situation. We suggest that regulation of neuronal spike-frequency adaptation (SFA) provides a universal mechanism for network-wide attractor selectivity. Here we demonstrate in a Hopfield-type attractor network that neurons with minimal SFA will reliably activate in the pattern corresponding to a local attractor and that a moderate increase in SFA leads the network to converge to the strongest attractor state. Furthermore, we show that on long time scales SFA allows for temporal sequences of activation to emerge. Finally, using a model of cholinergic modulation within the cortex we argue that dynamic regulation of attractor preference by SFA could be critical for the role of acetylcholine in attention or for arousal states in general.
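The attractor-selection setup can be sketched in a Hopfield-type network with a crude adaptation variable standing in for SFA (an illustrative toy, not the authors' implementation; pattern strengths and adaptation dynamics are assumptions):

```python
import numpy as np

def hopfield_sfa(b=0.0, strong_w=1.5, N=100, steps=200, seed=2):
    """Hopfield-type network storing two random +/-1 patterns, the
    first weighted by strong_w > 1 (the 'stronger attractor').  Each
    unit carries an adaptation variable a_i that builds up while the
    unit is active and is subtracted from its input, scaled by b
    (a stand-in for spike-frequency adaptation).  Returns the final
    overlaps with the strong and the weak pattern."""
    rng = np.random.default_rng(seed)
    xi = rng.choice([-1, 1], size=(2, N))
    W = (strong_w * np.outer(xi[0], xi[0]) + np.outer(xi[1], xi[1])) / N
    np.fill_diagonal(W, 0.0)
    S = xi[1].astype(float)                 # start in the weaker attractor
    a = np.zeros(N)
    for _ in range(steps):
        a = 0.9 * a + 0.1 * (S > 0)         # adaptation grows on active units
        S = np.sign(W @ S - b * a)
        S[S == 0] = 1.0
    return (S @ xi[0]) / N, (S @ xi[1]) / N

m_strong, m_weak = hopfield_sfa(b=0.0)
print(m_strong, m_weak)
```

With the adaptation strength b set to zero the network simply stays in the weaker attractor it was initialized in; increasing b penalizes persistently active units, which is the mechanism the abstract proposes for dislodging the state toward more strongly encoded patterns.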
Tuesday, March 15, 2016 10:12AM - 10:24AM |
E41.00012: On the Emergent Properties of Recurrent Neural Networks at Criticality Yahya Karimipanah, Zhengyu Ma, Ralf Wessel Irregular spiking is a widespread phenomenon in neuronal activities in vivo. In addition, it has been shown that the firing rate variability decreases after the onset of external stimuli. Since these are known as two universal features of cortical activity, it is natural to ask whether there is a universal mechanism underlying such phenomena. Independently, there has been mounting evidence that superficial layers of cortex operate near a second-order phase transition (critical point), which is manifested in the form of scale-free activity. However, despite the strong evidence for such a criticality hypothesis, little is known about how it can be leveraged to facilitate neural coding. As the decline in response variability is regarded as an essential mechanism to enhance coding efficiency, we asked whether the criticality hypothesis could link scale-free activity to other ubiquitous features of cortical activity. Using a simple binary probabilistic model, we show that irregular spiking and decline in response variability both arise as emergent properties of a recurrent network poised at criticality. Our results provide a unified explanation for the ubiquity of these two features, without the need to invoke any further mechanism.
Tuesday, March 15, 2016 10:24AM - 10:36AM |
E41.00013: Computing with scale-invariant neural representations Marc Howard, Karthik Shankar The Weber-Fechner law is perhaps the oldest quantitative relationship in psychology. Consider the problem of the brain representing a function $f(x)$. Different neurons have receptive fields that support different parts of the range, such that the $i$th neuron has a receptive field at $x_i$. Weber-Fechner scaling refers to the finding that the width of the receptive field scales with $x_i$ as does the difference between the centers of adjacent receptive fields. Weber-Fechner scaling is exponentially resource-conserving. Neurophysiological evidence suggests that neural representations obey Weber-Fechner scaling in the visual system and perhaps other systems as well. We describe an optimality constraint that is solved by Weber-Fechner scaling, providing an information-theoretic rationale for this principle of neural coding. Weber-Fechner scaling can be generated within a mathematical framework using the Laplace transform. Within this framework, simple computations such as translation, correlation and cross-correlation can be accomplished. This framework can in principle be extended to provide a general computational language for brain-inspired cognitive computation on scale-invariant representations.
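Weber-Fechner scaling of receptive-field centers, with spacing proportional to the center value $x_i$, is equivalent to uniform spacing on a logarithmic axis, which is the sense in which it is exponentially resource-conserving; a minimal sketch (range and neuron count are illustrative):

```python
import numpy as np

def weber_fechner_centers(x_min=1.0, x_max=100.0, n=10):
    """Receptive-field centers with Weber-Fechner scaling: centers
    are uniform on a log axis (a geometric sequence), so the gap
    between adjacent centers grows in proportion to the center
    itself, and n neurons cover a range that grows exponentially
    with n."""
    return np.geomspace(x_min, x_max, n)

centers = weber_fechner_centers()
spacings = np.diff(centers)
# the ratio of adjacent spacings is constant: spacing scales with x_i
ratios = spacings[1:] / spacings[:-1]
print(centers)
print(ratios)
```

Doubling the covered range `x_max` costs only a fixed number of extra neurons here, whereas a uniformly spaced code would need twice as many.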
Tuesday, March 15, 2016 10:36AM - 10:48AM |
E41.00014: ABSTRACT WITHDRAWN |
Tuesday, March 15, 2016 10:48AM - 11:00AM |
E41.00015: Physics of Intrinsic and Extrinsic Factors that Cause the Onset of the Deadliest Illness of Mankind and are Important for Diagnostics and Treatment Arjun Saxena One of the most important topics of research in the field of the physics of behavior is the deadliest illness of mankind, the group of illnesses called mental illnesses. They are receiving increasing attention worldwide from medical communities and their respective governments because it is now well established that these illnesses cause more loss of human life and more destruction of families, businesses, and the overall economy than all other illnesses combined. The purpose of this paper is to identify and provide solutions to two fundamental issues of such illnesses which still remain as problems. One is the stigma associated with them because of their name ``mental'': the patients are regarded as less than normal because their illness is only ``mental'' in origin. The second is that it is still not widely recognized that they are caused by medical problems in the ``brain'' which afflict the ``mind''. This paper explains this and gives an improved 3-D model using the physics of intrinsic and extrinsic factors of both ``brain'' and ``mind''. It leads to an important new name, ``BAMI'' (Brain and Mind Illness), which eliminates the stigma and gives quantitative parameters to diagnose the illness and monitor medicines to treat such illnesses.
© 2024 American Physical Society