Bulletin of the American Physical Society
APS March Meeting 2016
Volume 61, Number 2
Monday–Friday, March 14–18, 2016; Baltimore, Maryland
Session H55: Theoretical Physics and Networks of Real Neurons (Invited; Undergraduate)
Sponsoring Units: DBIO | Chair: William Bialek, Princeton University | Room: Hilton Baltimore Holiday Ballroom 6
Tuesday, March 15, 2016 2:30PM - 3:06PM
H55.00001: Neural circuit mechanisms of short-term memory Invited Speaker: Mark Goldman Memory over time scales of seconds to tens of seconds is thought to be maintained by neural activity that is triggered by a memorized stimulus and persists long after the stimulus is turned off. This presents a challenge to current models of memory-storing mechanisms, because the typical time scales associated with cellular and synaptic dynamics are two orders of magnitude shorter than this. While such long time scales can easily be achieved by bistable processes that toggle like a flip-flop between a baseline and an elevated-activity state, many neuronal systems have been observed experimentally to be capable of maintaining a continuum of stable states. For example, in neural integrator networks involved in the accumulation of evidence for decision making and in motor control, individual neurons have been recorded whose activity reflects the mathematical integral of their inputs; in the absence of input, these neurons sustain activity at a level proportional to the running total of their inputs. This represents an analog form of memory whose dynamics can be conceptualized through an energy landscape with a continuum of lowest-energy states. Such continuous attractor landscapes are structurally non-robust, in seeming violation of the relative robustness of biological memory systems. In this talk, I will present and compare different biologically motivated circuit motifs for the accumulation and storage of signals in short-term memory. Challenges to generating robust memory maintenance will be highlighted, and potential mechanisms for ameliorating the sensitivity of memory networks to perturbations will be discussed.
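The integrator mechanism and its fragility can be illustrated with a minimal rate-model sketch (not the speaker's actual model; the time constant, weights, and input pulse below are illustrative). A single population whose recurrent feedback exactly cancels its intrinsic leak integrates its input and holds the running total after the input ends, while a network with the feedback mistuned by only 5% lets the stored value decay:

```python
import numpy as np

tau = 0.1          # intrinsic cellular time constant, 100 ms (illustrative)
dt = 0.001
T = int(5.0 / dt)  # 5 s simulation

def simulate(w):
    """Rate dynamics tau * dr/dt = -r + w*r + input, Euler-integrated."""
    r = 0.0
    trace = []
    for t in range(T):
        inp = 1.0 if t * dt < 0.5 else 0.0   # input pulse for first 500 ms
        r += dt / tau * (-r + w * r + inp)
        trace.append(r)
    return np.array(trace)

perfect = simulate(1.0)    # feedback exactly cancels the leak -> integrator
leaky = simulate(0.95)     # 5% mistuning -> memory decays within seconds
```

With `w = 1.0` the stored value persists indefinitely after the input is removed; with `w = 0.95` the effective time constant is only `tau / (1 - w) = 2` s, illustrating why a continuum of stable states requires fine tuning.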
Tuesday, March 15, 2016 3:06PM - 3:42PM
H55.00002: Dynamical criticality in the collective activity of a neural population Invited Speaker: Thierry Mora The past decade has seen a wealth of physiological data suggesting that neural networks may behave like critical branching processes. Concurrently, the collective activity of neurons has been studied using explicit mappings to classic statistical mechanics models such as disordered Ising models, allowing for the study of their thermodynamics, but these efforts have ignored the dynamical nature of neural activity. I will show how to reconcile these two approaches by learning effective statistical mechanics models of the full history of the collective activity of a neuron population directly from physiological data, treating time as an additional dimension. Applying this technique to multi-electrode recordings from retinal ganglion cells, and studying the thermodynamics of the inferred model, reveals a peak in specific heat reminiscent of a second-order phase transition.
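The thermodynamic analysis described above can be sketched in miniature (the fields and couplings below are drawn at random for illustration, not inferred from retinal data): take a small pairwise Ising model over N binary neurons, enumerate all 2^N states, and compute the specific heat C(T) = Var(E)/T² along a temperature path, which is the quantity whose peak the abstract reports.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
N = 8                                    # toy population; real analyses use many more neurons
h = rng.normal(0, 0.3, N)                # hypothetical local fields
J = rng.normal(0, 0.3, (N, N)) / np.sqrt(N)
J = np.triu(J, 1)
J = J + J.T                              # symmetric pairwise couplings, zero diagonal

# Enumerate all 2^N spin configurations and their Ising energies
states = np.array(list(product([-1, 1], repeat=N)))
E = -states @ h - 0.5 * np.einsum('si,ij,sj->s', states, J, states)

def specific_heat(T):
    """C(T) = Var(E) / T^2 under the Boltzmann distribution at temperature T."""
    w = np.exp(-E / T)
    p = w / w.sum()
    E_mean = p @ E
    return (p @ E**2 - E_mean**2) / T**2

Ts = np.linspace(0.5, 2.0, 31)
C = [specific_heat(T) for T in Ts]
```

Exhaustive enumeration is only feasible for small N; the data-driven analyses the talk describes rely on Monte Carlo estimates instead.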
Tuesday, March 15, 2016 3:42PM - 4:18PM
H55.00003: Adaptation on multiple time scales Invited Speaker: Adrienne Fairhall
Tuesday, March 15, 2016 4:18PM - 4:54PM
H55.00004: A theory of neural dimensionality, dynamics, and measurement Invited Speaker: Surya Ganguli In many experiments, neuroscientists tightly control behavior, record many trials, and obtain trial-averaged firing rates from hundreds of neurons in circuits containing millions of behaviorally relevant neurons. Dimensionality reduction has often shown that such datasets are strikingly simple; they can be described using a much smaller number of dimensions than the number of recorded neurons, and the resulting projections onto these dimensions yield a remarkably insightful dynamical portrait of circuit computation. This ubiquitous simplicity raises several profound and timely conceptual questions. What is the origin of this simplicity, and what are its implications for the complexity of brain dynamics? Would neuronal datasets become more complex if we recorded more neurons? How and when can we trust dynamical portraits obtained from only hundreds of neurons in circuits containing millions of neurons? We present a theory that answers these questions, and test it using neural data recorded from reaching monkeys. Overall, this theory yields a picture of the neural measurement process as a random projection of neural dynamics, conceptual insights into how we can reliably recover dynamical portraits in such under-sampled measurement regimes, and quantitative guidelines for the design of future experiments. Moreover, it reveals the existence of phase transition boundaries in our ability to successfully decode cognition and behavior as a function of the number of recorded neurons, the complexity of the task, and the smoothness of neural dynamics.
Tuesday, March 15, 2016 4:54PM - 5:30PM
H55.00005: Understanding vision through the lens of prediction Invited Speaker: Stephanie Palmer Prediction is necessary for long-term planning and decision-making, but prediction is also essential to the short-term calculations necessary to overcome the sensory and motor delays present in all neural systems. In order to interact appropriately with a changing environment, the brain must respond not only to the current state of sensory inputs but to rapid predictions of these inputs' future state. To test whether the visual system performs optimal predictive compression and computation, we compute the past and future stimulus information in populations of retinal ganglion cells, the output cells of the retina, in salamanders and rats. By controlling the motion statistics of the input stimulus presented to the retina, a moving bar with inertia making a random walk in space, we can derive the optimal tradeoff between compressing information about the past stimulus while retaining as much information as possible about the future stimulus. By changing parameters in the equation of motion for the bar, we can explore qualitatively different motion prediction problems. We show that retinal ganglion cells sit near this optimum for some motion types but not others, and compare these results between the two sampled species. Taking the next step towards exploring the predictive capacity of neural systems, we characterize the ensemble of spatiotemporal correlations present in the natural environment. To do so, we construct and analyze a database of natural motion videos. We have made high-speed, high-pixel-depth recordings of natural scenes and have preliminary data quantifying the space-time power spectra and the local motion content of these scenes.
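A stimulus of the kind described, a bar moving with inertia, can be sketched in a few lines (the damping and noise values are made up, and this is a caricature of the experiment, not its actual stimulus): simulate the bar's velocity as a damped random walk and estimate how much predictive information the current velocity carries about its future, using the exact mutual-information formula for jointly Gaussian variables.

```python
import numpy as np

rng = np.random.default_rng(2)
T, gamma, sigma = 200_000, 0.05, 1.0     # hypothetical damping and noise scale
v = np.zeros(T)
for t in range(1, T):                    # bar velocity with inertia: AR(1) process
    v[t] = (1 - gamma) * v[t - 1] + sigma * rng.normal()

def gaussian_mi(a, b):
    """Mutual information in nats; exact when a, b are jointly Gaussian."""
    rho = np.corrcoef(a, b)[0, 1]
    return -0.5 * np.log(1 - rho**2)

# Predictive information about the bar's future velocity at increasing lags
pred_info = {lag: gaussian_mi(v[:-lag], v[lag:]) for lag in (1, 10, 50)}
```

Predictive information falls off with the prediction horizon, and the rate of fall-off is set by the damping parameter; sweeping such parameters is the toy analogue of the abstract's "qualitatively different motion prediction problems."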