Bulletin of the American Physical Society
2009 APS March Meeting
Volume 54, Number 1
Monday–Friday, March 16–20, 2009; Pittsburgh, Pennsylvania
Session W7: Information Theory in Biology
Sponsoring Units: DBP
Chair: Ned Wingreen, Princeton University
Room: 407
Thursday, March 19, 2009 11:15AM - 11:51AM
W7.00001: Optimizing information flow in biological networks (Invited Speaker)
The generation of physicists who turned to the phenomena of life in the 1930s realized that to understand these phenomena one would need to track not just the flow of energy (as in inanimate systems) but also the flow of information. It would take more than a decade before Shannon provided the tools to formalize this intuition, making precise the connection between entropy and information. Since Shannon, many investigators have explored the possibility that biological mechanisms are selected to maximize the efficiency with which information is transmitted or represented, subject to fundamental physical constraints. I will survey these efforts, emphasizing that the same principles are being used in thinking about biological systems at very different levels of organization, from bacteria to brains. Although sometimes submerged under concerns about particular systems, the idea that information flow is optimized provides us with a candidate for a real theory of biological networks, rather than just a collection of parameterized models. I will try to explain why I think the time is right to focus on this grand theoretical goal, pointing to some key open problems and opportunities for connection to emerging experiments.
Thursday, March 19, 2009 11:51AM - 12:27PM
W7.00002: Form, Function, and Information Processing in Stochastic Regulatory Networks (Invited Speaker)
The ability of a biological network to transduce signals, e.g., from chemical information about the abundance of small molecules into regulatory information about the rate of mRNA expression, is thwarted by numerous sources of noise. A great amount has been learned and conjectured in the last decade about the extent to which the form of a network --- specified by the connectivity and sign of regulation --- constrains or guides the network's function --- the particular noisy input-output relation(s) the network is capable of executing. In parallel, a great amount of research has sought to elucidate the role of inescapable or 'intrinsic' noise arising from the finite copy number of the participating molecules, which sets physical limits on information processing in small cells. I'll discuss how information theory may help illuminate these topics by providing a framework for quantifying function which does not rely on specifying the particular task to be performed a priori, as well as by providing a measure for the extent to which form follows function. En route I hope to show how stochastic chemical kinetics, modeled by the (linear) master equation describing the probability of copy counts for all reactants, benefits from the same spectral approaches fundamental to solving the (linear) diffusion equation.
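The spectral analogy in the abstract's last sentence can be made concrete in a minimal sketch (my own illustration, not code from the talk): for a one-species birth-death process with constant production rate k and per-molecule degradation rate g, the master equation is linear, dP/dt = AP, and diagonalizing the truncated rate matrix A solves it exactly, just as diagonalizing the Laplacian solves the diffusion equation.

```python
import numpy as np

# Minimal sketch (illustration only): birth-death master equation
# dP/dt = A P for copy number n = 0..N, with constant production
# rate k and per-molecule degradation rate g.
k, g, N = 5.0, 1.0, 20
A = np.zeros((N + 1, N + 1))
for n in range(N + 1):
    if n < N:
        A[n + 1, n] += k          # birth: n -> n+1
        A[n, n] -= k
    if n > 0:
        A[n - 1, n] += g * n      # death: n -> n-1
        A[n, n] -= g * n

# Spectral solution: P(t) = V exp(Lambda t) V^{-1} P(0)
lam, V = np.linalg.eig(A)
P0 = np.zeros(N + 1)
P0[0] = 1.0                       # start with zero molecules
t = 10.0
Pt = (V @ np.diag(np.exp(lam * t)) @ np.linalg.solve(V, P0)).real

mean_n = Pt @ np.arange(N + 1)    # relaxes toward the Poisson mean k/g
print(round(mean_n, 3))
```

At long times only the zero eigenvalue survives, leaving the steady state (a Poisson distribution with mean k/g = 5 here); the nonzero eigenvalues set the relaxation spectrum, exactly as decay modes do for diffusion.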
Thursday, March 19, 2009 12:27PM - 1:03PM
W7.00003: Decision boundaries for maximizing information transmission in neural circuits (Invited Speaker)
Everything we know about the world around us is represented in the nervous system in sequences of discrete electrical pulses termed spikes. One attractive theoretical idea, going back to the 1950s, is that these representations are efficient in the sense of information theory. I will describe an approach for finding the optimal coupling strengths between different neurons that is based on the concept of a decision boundary [1]. In this framework, neural circuit responses are described by specifying, for each neuron, the decision boundary that separates multi-dimensional signals that elicit a spike in that neuron from those signals that do not. The shape and position of individual neurons' boundaries determine the amount of mutual information that the neural circuit can transmit about the incoming signals. Correspondingly, the optimal configuration of the decision boundaries depends on the probability distribution of incoming signals. Signals typical of our natural sensory environment are known to be strongly correlated and to possess large-amplitude deviations that are often better described by an exponential rather than a Gaussian distribution. Considering exponentially distributed signals, we find that the optimal decision boundaries of neurons are curved, and that they exhibit sharp discontinuities where the decision boundaries of different neurons intersect. This, in turn, corresponds to non-zero coupling constants when these neural circuits are described using pairwise maximum entropy models.
[1] T. Sharpee and W. Bialek, PLoS ONE 2, e646 (2007).
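A toy one-dimensional version shows how a boundary's placement controls information (my own sketch, far simpler than the multi-dimensional setting of the talk): if a noiseless neuron spikes whenever a scalar signal exceeds a threshold, the spike is a deterministic function of the signal, so I(spike; signal) = H(spike), and the optimal boundary is the one that makes the neuron spike half the time, i.e. the median of the signal distribution.

```python
import numpy as np

# Toy 1-D sketch (my illustration, not the paper's model): a neuron
# spikes iff the signal s exceeds a threshold theta. With no spiking
# noise, I(spike; s) = H(spike), so information is maximized when the
# boundary splits the signal distribution in half.
rng = np.random.default_rng(0)
s = rng.exponential(scale=1.0, size=200_000)   # heavy-tailed "natural" signal

def spike_info_bits(theta):
    p = float(np.mean(s > theta))              # spike probability
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

thetas = np.linspace(0.05, 3.0, 200)
best = thetas[np.argmax([spike_info_bits(t) for t in thetas])]
print(round(best, 3))   # near ln 2 ~ 0.693, the median of Exp(1)
```

For the exponential distribution the optimum sits at the median ln 2, not the mean, illustrating the abstract's point that the optimal boundaries are set by the signal distribution itself.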
Thursday, March 19, 2009 1:03PM - 1:39PM
W7.00004: Information processing and signal integration in bacterial quorum sensing (Invited Speaker)
Bacteria communicate with each other using secreted chemical signaling molecules called autoinducers (AIs) in a process known as quorum sensing. Quorum sensing enables bacteria to collectively regulate their behavior depending on the number and/or species of bacteria present. The quorum-sensing network of the marine bacterium {\it Vibrio harveyi} consists of three AIs encoding distinct ecological information, each detected by its own histidine-kinase sensor protein. The sensor proteins all phosphorylate a common response regulator and transmit sensory information through a shared phosphorelay that regulates expression of downstream quorum-sensing genes. Despite detailed knowledge of the {\it Vibrio} quorum-sensing circuit, it is still unclear how and why bacteria integrate information from multiple input signals to coordinate collective behaviors. Here we develop a mathematical framework for analyzing signal integration based on information theory and use it to show that bacteria must tune the kinase activities of sensor proteins in order to transmit information from multiple inputs. This is demonstrated within a quantitative model that allows us to quantify how much {\it Vibrio} learns about individual inputs and that explains experimentally measured input-output relations. Furthermore, we predicted and experimentally verified that bacteria manipulate the production rates of AIs in order to increase information transmission, and we argue that the quorum-sensing circuit is designed to coordinate a multi-cellular developmental program. Our results show that bacteria can successfully learn about multiple signals even when these are transmitted through a shared pathway, and they suggest that information theory may be a powerful tool for analyzing biological signaling networks.
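The signal-integration point can be cartooned as follows (my own toy model, not the authors' quantitative model of {\it V. harveyi}): two binary AI inputs drive one binary output through a shared pathway, with the kinase activities entering as weights in a logistic response. The mutual information I(inputs; output) = H(output) - <H(output|inputs)> then shows directly how tuning those activities changes what the shared channel conveys.

```python
import numpy as np

# Toy model (my illustration): two binary AI inputs x1, x2 drive one
# binary output y through a shared pathway,
#   P(y=1 | x1, x2) = sigma(a1*x1 + a2*x2 - b),
# where a1, a2 play the role of kinase activities. Mutual information
# I(x1,x2; y) = H(y) - <H(y|x1,x2)> quantifies transmission.
def sigma(z):
    return 1.0 / (1.0 + np.exp(-z))

def h2(p):
    p = np.clip(p, 1e-12, 1 - 1e-12)       # binary entropy, safe at 0/1
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def channel_info_bits(a1, a2, b):
    states = [(x1, x2) for x1 in (0, 1) for x2 in (0, 1)]
    pys = [sigma(a1 * x1 + a2 * x2 - b) for x1, x2 in states]
    p_y = float(np.mean(pys))               # inputs taken as uniform
    return h2(p_y) - float(np.mean([h2(p) for p in pys]))

# Sharper ("better tuned") kinase activities transmit more information
print(round(channel_info_bits(6, 6, 3), 3))    # shallow response
print(round(channel_info_bits(12, 12, 6), 3))  # steeper response
```

With a single binary output the channel conveys at most one bit about the joint input, so funneling multiple signals through a shared relay necessarily discards information; tuning the activities decides which bit gets through.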
Thursday, March 19, 2009 1:39PM - 2:15PM
W7.00005: To be determined (Invited Speaker)